Data (Use and Access) Bill [HL] Debate
(2 months, 1 week ago)
Lords Chamber
My Lords, it is a feature of your Lordships’ House that certain topics and Bills within them tend to attract a small and very intense group of persons, who get together to talk a language that is not generally understood by the rest of the world—certainly not by the rest of this House—and get down to business with an enthusiasm and attitude which is very refreshing. I am seeing smiles from the other side of the House. This is not meant to be in any way a single-party point—just a very nice example of the way in which the House can operate.
I have already been struck today, as I am sure have others in the group that I am talking about—who know who they are—by the recognition that we have perhaps been a little narrow in our thinking. A couple of the speeches today have brought a new thought and a new sense of engagement with this particular subject and the others we deal with. We need to be aware of that, and I am very grateful to those noble Lords. In addition, I am grateful to the noble Lord, Lord Knight, for repeating the speeches he made in 2018 and on subsequent dates, and also for the wonderfully grumpy speech from the noble Baroness, Lady Kidron. We have also got to take into account what we got wrong on joining the European market—a debate which I certainly look forward to. It is a serious point.
I am also very grateful to my noble friend the Minister for setting out the new Government’s vision for data protection, for her letters—which have been very useful—and for her help in setting up the meeting I had with her officials, which I found very useful indeed. Our Minister has done a really good job in getting the Bill ready so quickly. It is good that some of the more egregious measures included in the previous Bill—particularly the changes on direct marketing during elections and the extensive access to bank account details—have gone. There are indeed some good extras as well.
We have already had some excellent speeches setting out some concerns. I have one major concern about the current Bill and three rather lesser issues which I suspect will need further debate and discussion in Committee. I will cover them quite briefly. My major concern is that, although the Bill has the intention to boost growth and productivity, and also makes a valiant attempt to provide a unified set of rules and regulations on data processing, it may in the process have weakened the protections that we want to see here in the exploitation of personal data. Data, as other noble Lords have said, is of course not just for growth and prosperity. There will be, as we have heard, clear, practical benefits in making data work for the wider social good and for the empowerment of working people. There is huge potential for data to revitalise the public services. Indeed, I liked the point made by the noble Lord, Lord Knight, that data is in some way an asset missing from the balance sheet on many operations, and we need to think carefully about how best we can configure that to make sure that the reality comes to life.
There has been, of course, a huge change. We have moved into the age of AI, but we do not have the Bill in front of us that will deal with that. The GDPR needs a top-to-toe revision so that we can properly regulate data capture, data storage, and how it may be best shared in the public interest. As an example of that, following the Online Safety Act we have a new regulator in Ofcom with the power to regulate technology providers and their algorithmic impacts. The Digital Markets, Competition and Consumers Act has given the Competition and Markets Authority new and innovative powers to regulate commercial interests, which we heard about yesterday at an all-party group. However, this Bill has missed the opportunity to strengthen the role of the ICO so we can provide a third leg capable of regulating the use of data in today’s AI-dominated world. This is a gap that we need to think very carefully about.
I hope my noble friend the Minister will acknowledge that there is a long way to go if this legislation is to earn public confidence and if our data protection regime is to work not just for the tech monopolies but for small businesses, consumers, workers and democracy. We must end the confusion, empower the regulators, and in turn empower Parliament.
There are three specific issues, and I will go through them relatively quickly. The first is on Clauses 67 and 68, already referred to, where the Bill brings in wording from Recital 159 of the GDPR—as we inherited it from the EU. This sets out how the processing of personal data for scientific research purposes should be interpreted. The recital is drafted in extraordinarily broad terms, including
“technological development and demonstration, fundamental research, applied research and privately funded research”.
It specifically mentions that:
“Scientific research purposes should also include studies conducted in the public interest in the area of public health”.
The latest ICO guidance, which contains a couple of references to commercial scientific research, says that such research
“can also include research carried out in commercial settings, and technological development, innovation and demonstration”.
However, we lack a definition here, which is rather curious, because a definition of research does exist elsewhere in UK statute. It is needed in order to fund the research councils, for example, and it forms part of the tax code for the purposes of granting tax relief on research. So we have a definition somewhere else, but somehow the Bill avoids it and instead goes down a clarification route, bringing forward into the current legislation that which is already the law—according to those who have drafted it—but which is of course so complicated that it cannot be understood. I think the Government’s thinking is to provide researchers with consistency, and they say very firmly that the Bill does not create any new permissions for using or reusing data for research purposes. In my meeting with officials, they were insistent that these clauses are about fine-tuning the data protection framework, making clarifications and small-scale changes and reducing uncertainties.
I agree that it is helpful to have the key provisions—currently buried, as they are, in the recitals—on the face of the Bill, and it may be that the new “reasonableness” test will give researchers greater clarity. Of course, we also retain the requirement that research must be in the public interest. But surely the issue that we need to address is whether the Bill, by incorporating new language and putting in this new “reasonableness” test, will permit changes to how data held by the NHS, including patients’ medical records, could be used and shared. It may be that the broad definition of “scientific research”, which can be “publicly or privately funded” and “commercial or non-commercial” inadvertently waters down consent protections and removes purpose-limitation safeguards. Without wishing to be too alarmist, we need to be satisfied that these changes will not instigate a seismic change in the rules currently governing NHS data.
It is relevant to note that the Government have stated in a separate way an intention to include in the next NHS 10-year plan significant changes as to how patients’ medical records are held and how NHS data is used. Launching a “national conversation” about the plans, the Secretary of State, my right honourable friend Wes Streeting MP, highlighted a desire to introduce electronic health records called “patient passports” and to work “hand in hand” with the private sector to use data to develop new treatments. He acknowledged that these plans would raise concerns about privacy and about how to get the
“best possible deal for the NHS in return”
for private sector access to NHS data. The details of this are opaque. As currently drafted, the Bill is designed to enable patient passports and sharing of data with private companies, but to my mind it does not address concerns about patient privacy or private sector access to health data. I hope we can explore that further in Committee and be reassured.
My second point concerns the unlicensed use of data created by the media and broader creative industries by developers of the large language models—this has already been referred to. UK copyright law is absolutely clear that AI developers must obtain a licence when they are text or data mining—the technique used to train AI models. The media companies have suggested that the UK Government should introduce provisions to ensure that news publishers and others can retain control over their data; that there must be significant penalties for non-compliance; and that AI developers must be transparent about what data their crawlers have “scraped” from websites—a rather unpleasant term, but that is what they say. Why are the Government not doing much more to stop what seems clearly to be theft of intellectual property on a mass scale, and if not in this Bill, what are their plans? At a meeting yesterday of the APPG which I have already referred to, it was clear that the CMA does not believe that it is the right body to enforce IP law. But if it is not, who is, and if there is a gap in regulatory powers, should this Bill not be used to ensure that the situation is ameliorated?
My third and final point is about putting into statute the previous Government’s commitments about regulating AI, as outlined in the rather good Bletchley declaration. Does my noble friend not agree that it would be at least a major statement of intent if the Bill could begin to address
“the protection of human rights, transparency and explainability, fairness, accountability, regulation, safety, appropriate human oversight, ethics, bias mitigation, privacy and data protection”?
These are all points raised in the Bletchley declaration. We will need to address the governance of AI technologies in the very near future. It does not seem wise to delay, even if the detailed approach has yet to be worked through and consulted upon. At the very least, as has been referred to, we should be picking up the points made by the Ada Lovelace Institute about: the inconsistent powers across regulators; the absence of regulators to enforce the principles such as recruitment and employment, or diffusely regulated areas of public service such as policing; the absence of developer-focused obligations; and the absence and high variability of meaningful recourse mechanisms when things go wrong, as they will.
When my noble friend Lord Knight of Weymouth opened the Second Reading of the last Government’s data protection Bill, he referred to his speech on the Second Reading during the passage of the 2018 Act—so he has been around for a while. He said:
“We need to power the economy and innovation with data while protecting the rights of the individual and of wider society from exploitation by those who hold our data”.—[Official Report, 19/12/23; col. 2164.]
For me, that remains a vision that we need to realise. It concerns me that the Bill will not achieve that.
My Lords, I thank all noble Lords for what has genuinely been a fascinating, very insightful debate. Even though I was part, I think, of my noble friend Lord Stevenson’s gang that has been working on this for some time, one learns new things, and I have learned new things again today about some of the issues that are challenging us. So I thank noble Lords for their contributions this evening, and I am very pleased to hear that a number of noble Lords have welcomed the Government’s main approach to the Bill, though of course beyond that there are areas where our concerns will diverge and, I am sure, be subject to further debate. I will try to clarify the Government’s thinking. I am sure noble Lords will understand, because we have had a very wide-ranging discussion, that if I am not able to cover all points, I will follow those up in writing.
I shall start with smart data. As was raised by my noble friend Lord Knight of Weymouth, and other noble Lords, the Government are keen to establish a smart data economy that brings benefits to consumers across all sectors.
Through the Smart Data Council, the Government are working closely to identify areas where smart data schemes might be able to bring more benefits. I think the point was made that we are perhaps not using it sufficiently at the moment. The Government intend to communicate where and in what ways smart data schemes can support innovation and growth and empower customers across a spectrum of markets—so there is more work to be done on that, for sure. These areas include providing the legislative basis for the fuel finder service announced by the Department for Energy Security and Net Zero, and supporting an upcoming call for evidence on the smart data scheme for the energy sector. Last week, the Government set out their priorities for the future of open banking in the national payments vision, which will pave the way for the UK to lead in open finance.
I turn now to digital identity, as raised by the noble Earl, Lord Erroll, and a number of other noble Lords. The measures in the Bill aim to help people and businesses across Britain to use innovative digital identity technologies and to realise their benefits with confidence. As the noble Lord, Lord Arbuthnot, said, the Bill does not make digital identities mandatory. The Bill will create a legislative structure of standards, governance and oversight for digital verification services that wish to appear on a government register, so that people will know what a good digital identity looks like. It is worth saying that a lot of these digital verification schemes already exist; we are trying to make sure that they are properly registered and have oversight. People need to know what a good digital identity looks like.
The noble Lord, Lord Arbuthnot, raised points about Sex Matters. Digital verification services can be used to prove sex or gender in the same way that individuals can already prove their sex using their passport, for example. Regarding the concerns of the noble Lord, Lord Vaux, about the inclusion of non-digital identity, the Government are clear that people who do not want to use digital identity or the digital verification services can continue to access services and live their daily lives referring to paper documents when they need to. Where people want to use more technology but feel left behind, DSIT is working hard to co-ordinate government work on digital inclusion. This is a high priority for the Government, and we hope to come back with further information on that very soon.
The Office for Digital Identities and Attributes has today published its first digital identity inclusion monitoring report. The results show a broadly positive picture of inclusion at this early stage of the markets, and its findings will inform future policy interventions.
I would like to reassure the noble Lord, Lord Markham, and the noble Viscount, Lord Camrose, that NUAR (the National Underground Asset Register) takes advantage of the latest technologies to ensure that data is accessed only for approved purposes, with all access audited. It also includes controls, developed in collaboration with the National Protective Security Authority, the National Cyber Security Centre and the security teams of asset owners themselves.
We had a very wide-ranging debate on data protection issues, and I thank noble Lords for their support for our changes to this legislation. The noble Viscount, Lord Camrose, and others mentioned delegated powers. The Government have carefully considered each delegated power and the associated parliamentary procedure and believe that each is proportionate. The detail of our rationale is set out in our delegated powers memorandum.
Regarding the concerns of the noble Lord, Lord Markham, and the noble Viscount, Lord Camrose, about the effect of the legislation on SMEs, we believe that small businesses would have struggled with the lack of clarity in the term “high-risk processing activities” in the previous Bill, which could have created more burdens for SMEs. We would prefer to focus on how small businesses can be supported to comply with the current legislation, including through user-friendly guidance on the ICO’s small business portal.
Many noble Lords, including the noble Viscount, Lord Camrose, the noble and learned Lord, Lord Thomas, and the noble Lord, Lord Vaux, raised EU adequacy. The UK Government recognise the importance of retaining our personal data adequacy decisions from the EU. I reassure the noble Lord, Lord Vaux, and my noble friend Lord Bassam that Ministers are already engaging with the European Commission, and officials will actively support the EU’s review process in advance of the renewal deadline next year. The free flow of personal data between the UK and the EU is one of the underpinning actions that enables research and innovation, supports the improvement of public services and keeps people safe. I join the noble Lord, Lord Vaux, in thanking the European Affairs Committee for its work on the matter. I can reassure him and the committee that the Secretary of State will respond within the required timeframe.
The noble Lord, Lord Bethell, and others raised international data transfers. Controllers and processors must take reasonable and proportionate steps to satisfy themselves that, after the international transfer, the level of protection for the data subject will be “not materially lower” than under UK data protection law. The Government take their responsibility seriously to ensure that data and its supporting infrastructure are secure and resilient.
On the question from the noble Viscount, Lord Colville, about the new recognised legitimate interest lawful ground, the entire point of the new lawful ground is to provide more legal certainty for data controllers that they are permitted to process personal data for the activities mentioned in new Annexe 1 to the UK GDPR. However, the processing must still be necessary and proportionate and meet all other UK GDPR requirements. That includes the general data protection principles in Article 5 of the UK GDPR, and the safeguards in relation to the processing of special category data in Article 9.
The Bill has significantly tightened up on the regulation-making power associated with this clause. The only processing activities that can be added to the list of recognised legitimate interests are those that serve the objectives of public interest, as described in Article 23(1) of the UK GDPR. The Secretary of State would also have to have regard to people’s rights and the fact that children may be less aware of the risks and consequences of the processing of their data before adding new activities to the list.
My noble friends Lord Davies of Brixton and Lord Stevenson of Balama—do you know, I have never had to pronounce his full name—Balmacara, raised NHS data. These clauses are intended to ensure that IT providers comply with relevant information standards in relation to IT use for health and adult social care, so that, where data is shared, it can be done in an easier, faster and cheaper way. Information standards create binding rules to standardise the processing of data where it is otherwise lawful to process that data. They do not alter the legal obligations that apply in relation to decisions about whether to share data. Neither the Department of Health and Social Care nor the NHS sells data or provides it for purely commercial purposes such as insurance or marketing purposes.
With regard to data assets, as raised by the noble Baroness, Lady Kidron, and my noble friend Lord Knight of Weymouth, the Government recognise that data is indeed one of the most valuable assets. It has the potential to transform public services and drive cutting-edge innovation. The national data library will unlock the value of public data assets. It will provide simple, secure and ethical access to our key public data assets for researchers, policymakers and businesses, including those at the frontier of AI development, and make it easier to find, discover and make connections across those different databases. It will sit at the heart of an ambitious programme of reform that delivers the incentives, investment and leadership needed to secure the full benefits for people and the economy.
The Government are currently undertaking work to design the national data library. In its design, we want to explore the best models of access so that public sector data benefits our society, much in the way that the noble Baroness, Lady Kidron, outlined. So, decisions on its design and implementation will be taken in due course.
Regarding the concerns of the noble Lord, Lord Markham, about cybersecurity, as announced in the King’s Speech, the Government will bring forward a cybersecurity and resilience Bill this Session. The Bill will strengthen our defences and ensure that more essential digital services than ever before are protected.
The noble Baroness, Lady Kidron, the noble Viscount, Lord Colville, and my noble friend Lord Stevenson of Balmacara, asked about the Government’s plans to regulate AI and the timing of this legislation. As set out in the King’s Speech, the Government are committed to establishing appropriate legislation for companies developing the most powerful AI systems. The Government will work with industry, civil society and experts across the UK before legislation is drawn up. I look forward to updating the House on these proposals in due course. In addition, the AI opportunities action plan will set out a road map for government to capture the opportunities of AI to enhance growth and productivity and create tangible benefits for UK citizens.
Regarding data scraping, as raised by the noble Baroness, Lady Kidron, the noble Viscount, Lord Colville of Culross, and others, although it is not explicitly addressed in the data protection legislation, any such activity involving personal data would require compliance with the data protection framework, especially that the use of data must be fair, lawful and transparent.
A number of noble Lords talked about AI in the creative industries, particularly the noble Lords, Lord Holmes and Lord Freyberg—
I am sorry to interrupt what is a very fluent and comprehensive response. I do not want to break the thread, but can I press the Minister a little bit on those companies whose information which is their intellectual property is scraped? How will that be resolved? I did not pick up from what the Minister said that there was going to be any action by the Government. Are we left where we are? Is it up to those who feel that their rights are being taken away or that their data has been stolen to raise appropriate action in the courts?
I was going to come on to some of those issues. Noble Lords talked about AI in the creative industries, which I think my noble friend is particularly concerned about. The Government are working hard on this and are developing an effective approach that meets the needs of the UK. We will announce more details in due course. We are working closely with relevant stakeholders and international partners to understand views across the creative sector and AI sectors. Does that answer my noble friend’s point?
With respect, it is the narrow question that a number of us have raised. Training the new AI systems is entirely dependent on them being fed vast amounts of material which they can absorb, process and reshape in order to answer questions that are asked of them. That information is to all intents and purposes somebody else’s property. What will happen to resolve the barrier? At the moment, they are not paying for it but just taking it—scraping it.
Perhaps I may come in too. Specifically, how does the data protection framework change it? We have had the ICO suggesting that the current framework works perfectly well and that it is the responsibility of the scrapers to let the IP holders know, while the IP holders have not a clue that it is being scraped. It is already scraped and there is no mechanism. I think we are a little confused about what the plan is.
Data (Use and Access) Bill [HL] Debate
(1 month, 4 weeks ago)
Grand Committee
My Lords, I support the amendments in the name of the noble Lord, Lord Clement-Jones. I perhaps did not say it at the beginning of my remarks on this section, but I fully support the Government’s efforts to create a trust framework. I think I started with criticism rather than with the fact that this is really important. Trust is in the name and if we cannot trust it, it is not going to be a trust framework. It is important to anticipate and address the likelihood that some will seek to abuse it. If there are not sufficient consequences for abusing it, I do not understand quite how we can have the level of trust needed for this to have wide adoption.
I particularly want to say that good systems cannot rely on good people. We know that and we see it. We are going to discuss it later in Committee, but good systems need checks and balances. In relation to this set of amendments, we need a disincentive for bad actors to mislead or give false information to government or the public. I am not going to rehearse each amendment that the noble Lord, Lord Clement-Jones, explained so brilliantly. The briefing on the trust framework is a very important one for us all. The amount of support there is for the idea, and the number of questions about what it means and how it will work, mean that we will come back to this if we do not have a full enough explanation of the disincentives for a bad actor.
My Lords, I support these amendments and applaud the noble Lord, Lord Clement-Jones, for his temerity and for offering a variety of choices, making it even more difficult for my noble friend to resist it.
It has puzzled me for some time why the Government do not wish to see a firm line being taken about digital theft. Identity theft in any form must be the most heinous of crimes, particularly in today’s world. This question came up yesterday in an informal meeting about a Private Member’s Bill due up next Friday on the vexed question of the sharing of intimate images and how the Government are going to respond to it. We were sad to discover that there was no support among the Ministry of Justice officials who discussed the Bill with its promoter for seeing it progress any further.
At the heart of that Bill is the same question about what happens when one’s identity is taken and one’s whole career and personality are destroyed by those who take one’s private information and distort it in such a way that those who see it regard it as being a different person or in some way involved in activities that the original person would never have been involved in. Yet we hear that the whole basis on which this digital network has been built up is a voluntary one, and the logic of that is that it would not be necessary to have the sort of amendments that are before us now.
I urge the Government to think very hard about this. There must be a break point here. Maybe the meeting that has been promised will help us, but there is a fundamental point about whether in the digital world we can rely on the same protections that we have in the real world—and, if not, why not?
My Lords, I will address the amendments proposed by the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron. I have nothing but the deepest respect for their diligence, and indeed wisdom, in scrutinising all three flavours of the Bill as it has come out, and for their commitment to strengthening the legislative framework against fraud and other misuse of digital systems. However, I have serious reservations about the necessity and proportionality of the amendments under consideration, although I look forward to further debates and I am certainly open to being convinced.
Amendments 51 and 52 would introduce criminal sanctions, including imprisonment, for the misuse of trust marks. While the protection of trust marks is vital for maintaining public confidence in digital systems, I am concerned that introducing custodial sentences for these offences risks overcriminalisation. The misuse of trust marks can and should be addressed through robust civil enforcement mechanisms. Turning every such transgression into a criminal matter would place unnecessary burdens on, frankly, an already strained justice system and risks disproportionately punishing individuals or small businesses for inadvertent breaches.
Furthermore, the amendment’s stipulation that proceedings could be brought only by or with the consent of the Director of Public Prosecutions or the Secretary of State is an important safeguard, yet it underscores the high level of discretion required to enforce these provisions effectively, highlighting the unsuitability of broad criminalisation in this context.
Amendment 53 seeks to expand the definition of identity documents under the Identity Documents Act 2010 to include digital identity documents. While the noble Lord, Lord Clement-Jones, makes a persuasive case, the proposal raises two concerns. First, it risks pre-emptively criminalising actions before a clear and universally understood framework for digital identity verification is in place. The technology and its standards are still evolving, and it might be premature to embed such a framework into criminal law. Secondly, there is a risk that this could have unintended consequences for innovation in the digital identity sector. Businesses and individuals navigating this nascent space could face disproportionate legal risks, which may hinder progress in a field critical to the UK’s digital economy.
Amendment 54 would introduce an offence of knowingly or recklessly providing false information in response to notices under Clause 51. I fully support holding individuals accountable for deliberate deception, but the proposed measure’s scope could lead to serious ambiguities. What constitutes recklessness in this context? Are we inadvertently creating a chilling effect where individuals or businesses may refrain from engaging with the system for fear of misinterpretation or error? These are questions that need to be addressed before such provisions are enshrined in law.
We must ensure that our legislative framework is fit for purpose, upholds the principles of justice and balances enforcement with fairness. The amendments proposed, while they clearly have exactly the right intentions, risk, I fear, undermining these principles. They introduce unnecessary criminal sanctions, create uncertainty in the digital identity space and could discourage good-faith engagement with the regulatory system. I therefore urge noble Lords to carefully consider the potential consequences of these amendments and, while expressing gratitude to the noble Lords for their work, I resist their inclusion in the Bill.
My Lords, of course I welcome the fact that the Bill will enable people to register a death in person and online, which was a key recommendation from the UK Commission on Bereavement. I have been asked to table this amendment by Marie Curie; it is designed to achieve improvements to UK bereavement support services, highlighting the significant administrative burden faced by bereaved individuals.
Marie Curie points to the need for a review of the existing Tell Us Once service and the creation of a universal priority service register to streamline death-related notifications across government and private sectors. It argued that the Bill presents an opportunity to address these issues through improved data-sharing and online death registration. Significant statistics illustrate the scale of the problem, showing a large percentage of bereaved people struggling with numerous administrative tasks. It urges the Government, as I do, to commit to implementing those changes to reduce the burden on bereaved families.
Bereaved people face many practical and administrative responsibilities and tasks after a death, which are often both complex and time sensitive. This Bill presents an opportunity to improve the way in which information is shared between different public and private service providers, reducing the burden of death administration.
When someone dies, the Tell Us Once service informs the various parts of national and local government that need to know. That means the local council stops charging council tax, the DVLA cancels the driving licence, the Passport Office cancels the passport, et cetera. Unfortunately, Tell Us Once is currently not working across all Government departments and does not apply to Northern Ireland. No updated equality impact assessment has ever been undertaken. While there are death notification services in the private sector, they are severely limited by not being a public service programme—and, as a result, there are user costs associated, adding to bereaved people’s financial burden and penalising the most struggling families. There is low public awareness and take-up among all these services, as well as variable and inconsistent provision by the different companies. The fact that there is not one service for all public and private sector notifications means that dealing with the deceased’s affairs is still a long and painful process.
The Bill should be amended to require Ministers to carry out a review into the current operation and effectiveness of the Tell Us Once service, to identify any gaps in its operation and provisions and to make recommendations as to how the scope of the service could be expanded. Priority service registers are voluntary schemes which utility companies create to ensure that extra help is available to certain vulnerable customers. The previous Government recognised that the current PSRs are disjointed, resource intensive and duplicative for companies, carry risks of inconsistency and can be “burdensome for customers”.
That Government concluded that there is “significant opportunity to improve the efficiencies and delivery of these services”. The Bill is an opportunity for this Government to confirm their commitment to implementing a universal priority services register and delivering any legislative measures required to facilitate it. A universal PSR service must include the interests of bereaved people within its scope, and charitable voluntary organisations such as Marie Curie, which works to support bereaved people, should be consulted in its development.
I have some questions to the Minister. First, what measures does this Bill introduce that will reduce the administrative burden on bereaved people after the death of a loved one? Secondly, the Tell Us Once service was implemented in 2010 and the original equality impact assessment envisaged that its operation should be kept under review to reflect the changing nature of how people engage with public services, but no review has ever happened. Will the Minister therefore commit the Government to undertake a review of Tell Us Once? Thirdly, the previous Government’s Smarter Regulation White Paper committed to taking forward a plan to create a “shared once” support register, which would bring together priority service registers. Will the Minister commit this Government to taking that work forward? I beg to move.
My Lords, it occurred to me when the noble Lord was speaking that we had lost a valuable member of our Committee. This could not be the noble Lord, Lord Clement-Jones, who was speaking to us just then. It must have been some form of miasma or technical imposition. Maybe his identity has been stolen and not been replaced. Normally, the noble Lord would have arrived with a short but punchy speech that set out in full how the new scheme was to be run, by whom, at what price, what its extent would be and the changes that would result. The Liberal future it may have been, but it was always delightful to listen to. I am sad that all the noble Lord has asked for here is a modest request, which I am sure the noble Baroness will want to jump to and accept, to carry out a review—as if we did not have enough of those.
Seriously, I once used the service that we have been talking about when my father-in-law died, and I found it amazing. It was also one that I stumbled on and did not know about before it happened. Deaths did not happen often enough in my family to make me aware of it. But, like the noble Lord, Lord Clement-Jones, I felt that it should have done much more than what it did, although it was valuable for what it did. It also occurred to me, as life moved on and we produced children, that there would be a good service when introducing a new person—a service to tell you once about that, because the number of tough issues one has to deal with when children are born is also extraordinary and can be annoying, if you miss out on one—particularly with the schooling issues, which are more common these days than they were when my children were being born.
I endorse what was said, and regret that the amendment perhaps did not go further, but I hope that the Minister when she responds will have good news for us.
I thank the noble Lord, Lord Clement-Jones, for raising this, and the noble Lord, Lord Stevenson, for raising the possibility that we are in the presence of a digital avatar of the noble Lord, Lord Clement-Jones. It is a scary thought, indeed.
The amendment requires a review of the operation of the Tell Us Once programme, which seeks to provide a simpler mechanism for citizens to pass information regarding births and deaths to the Government. It considers whether the pioneering progress of Tell Us Once could be extended to non-public sector holders of data. When I read the amendment, I was more cynical than I am now, having heard what the noble Lord, Lord Clement-Jones, had to say. I look forward to hearing the Minister’s answers. I take the point from the noble Lord, Lord Stevenson, that we do not necessarily need another review—but now that I have heard about it, it feels a better suggestion than I thought it was when reading about it.
I worry that expanding this programme to non-public sector holders of data would be a substantial undertaking; it would surely require the Government to hold records of all the non-public sector organisations that have retained and processed an individual’s personal data. First, I am not sure that this would even be possible—or practicable, anyway. Secondly, I am not sure that it would end up being an acceptable level of state surveillance. I look forward to hearing the Minister’s response but I am on the fence on this one.
Data (Use and Access) Bill [HL] Debate
(1 month, 3 weeks ago)
Grand Committee
My Lords, I will speak to Amendments 59, 62, 63 and 65 in the name of my noble friend Lord Colville, and Amendment 64 in the name of the noble Lord, Lord Clement-Jones, to which I added my name. I am also very much in sympathy with the other amendments in this group more broadly.
My noble friend Lord Colville set out how he is seeking to understand what the Government intend by “scientific research” and to make sure that the Bill does not offer a loophole so big that any commercial company can avoid data protections of UK citizens in the name of science.
At Second Reading, I read out a dictionary definition of science:
“The systematic study of the structure and behaviour of the physical and natural world through observation, experimentation, and the testing of theories against the evidence obtained”—
i.e. everything. I also ask the Minister if the following scenarios could reasonably be considered scientific. Is updating or improving a new tracking app for fitness, or a bot for an airline, scientific? Is the behavioural science of testing children’s response to persuasive design strategies in order to extend the stickiness of commercial products scientific? These are practical scenarios, and I would be grateful for an answer in order to understand what is in and out of the scope of the Bill.
When I raised Clause 67 at a briefing meeting, it was said that it was, as my noble friend Lord Colville suggested, just housekeeping. The law firm Taylor Wessing suggests that what can
“‘reasonably be described as scientific’ is arguably very wide and fairly vague, so it will be interesting to see how this is interpreted, but the assumption is that it is intended to be a very broad definition”.
Each of the 14 law firm blogs and briefings that I read over the weekend described it variously as loosening, expanding or broadening. Not one suggested that it was a tightening and not one said that it was a no-change change. As we have heard, the European Data Protection Supervisor published an opinion stating that
“scientific research is understood to apply where … the research is carried out with the aim of growing society’s collective knowledge and wellbeing, as opposed to serving primarily one or several private interests”.
When the Minister responds, perhaps she could say whether the particular scenarios I have set out fall within the definition of scientific and why the Government have failed to reflect the critical clarification of the European Data Protection Supervisor in transferring the recital into the Bill.
I turn briefly to Amendment 64, which would limit the use of children’s personal data for the purposes of research and education by making it subject to a public interest requirement and opt-in from the child or a parent. I will speak in our debate on a later grouping to amendments that would enshrine children’s right to higher protection and propose a comprehensive code of practice on the use of children’s data in education, which is an issue of increasing scandal and concern. For now, it would be good to understand whether the Government agree that education is an area of research where a public interest requirement is necessary and appropriate and that children’s data should always be used to support their right to learn, rather than to commoditise them.
During debate on the DPDI Bill, a code of practice on children’s data and scientific research was proposed; the Minister added her name to it. It is by accident rather than by design that I have failed to lay it here, but I will listen carefully to the Minister’s reply to see whether children need additional protections from scientific research as the Government now define it.
My Lords, I have in subsequent groups a number of amendments that touch on many of the issues that are raised here, so I will not detain the Committee by going through them at this stage and repeating them later. However, I feel that, although the Government have had the best intentions in bringing forward a set of proposals in this area that were to update and to bring together rather conflicting and difficult pieces of legislation that have been left because of the Brexit arrangements, they have managed to open up a gap between where we want to be and where we will be if the Bill goes forward in its present form. I say that in relation to AI, which is a subject requiring a lot more attention and a lot more detail than we have before us. I doubt very much whether the Government will have the appetite for dealing with that in time for this Bill, but I hope that at the very least—it would be a minor concession at this stage—they will commit at the Dispatch Box to seeking to resolve these issues in the legislation within a very short period because, as we have heard from the arguments made today, it is desperately needed.
More importantly, if, by bringing together documentation that is thought to represent the current situation, either inadvertently or otherwise, the Government have managed to open up a loophole that will devalue the way in which we currently treat personal data—I will come on to this when I get to my groups in relation to the NHS in particular—that would be a grievous situation. I hope that, going forward, the points that have been made here can be accommodated in a statement that will resolve them, because they need to be resolved.
My Lords, it is a pleasure to take part in today’s Committee proceedings. In doing so, I declare my technology interests as set out in the register, not least as adviser to Socially Recruited, an AI business.
I support the noble Viscount, Lord Colville, in his amendments and all the other amendments in this group. They were understandably popular, to the extent that when I got my pen out, there was no space left for me to co-sign them, so I was left with the oral tradition in which to reflect my support for them. Before going into the detail, I just say that we have had three data Bills in just over three years: DPDI, DISD and this Bill. Over that period, though the names have changed, much of the meat remains the same in the legislation. Yet, in that period, everything and nothing has changed—everything in terms of what has happened with generative AI.
Considering that seismic shift that has occurred over these three Bills, could the Minister say what in this Bill specifically has changed, not least in this part, to reflect that seismic change? Regarding “nothing has changed”, nothing has changed in terms of the incredibly powerful potential of AI for positive or negative outcomes, ably demonstrated with this set of amendments.
If you went on to Main Street and polled the public, I believe that you would get a pretty clear understanding of what they considered scientific research to be. You know it. You understand why we would want to have a specified definition of scientific research and what that would mean for the researchers and for the country.
However, if we are to draw that definition as broadly as it currently is in the Bill, why would we bother to have such a definition at all? If the Government’s intention is to enable so much to come within the perimeter, let us not have the definition at all and let us allow what is happening right now to continue, not least in the reuse of scraped data or in how data is being treated in these generative AI models.
We have seen what has happened in terms of the training, but when you look at what could be called development and improvement, as the noble Viscount has rightly pointed out, all this and more could easily fit within the scientific research definition. It could even more easily fit in when lawyers are deployed to ensure that that is so. I know we are going to come on to rehearsing a number of these subjects in the next group but, for this group, I support all the amendments as set out.
I ask the Minister these two questions. First, what has changed in all the provisions that have gone through all these three iterations of the data Bill? Secondly, what is the Government’s intention when it comes to scientific research, if it is not truly to mean scientific research, if it is not to have ethics committee involvement and if it is not to feel sound and be defined as what most people on Main Street would recognise as scientific research?
My Lords, I rise to move the amendment standing in my name and to speak to my other amendments in this group. I am grateful to the noble Baroness, Lady Kidron and the noble Lord, Lord Clement-Jones, for signing a number of those amendments, and I am also very grateful to Foxglove Legal and other bodies that have briefed me in preparation for this.
My amendments are in a separate group, and I make no apology for that because although some of these points have indeed been covered in other amendments, my focus is entirely on NHS patient data, partly because it is the subject of a wider debate going on elsewhere about whether value can be obtained for it to help finance the National Health Service and our health in future years. This changes the nature of the relationship between research and the data it is using, and I think it is important that we focus hard on this and get some of the points that have already been made into a form where we can get reasonable answers to the questions that it leaves.
If my amendments are accepted or agreed—a faint hope—they would make it clear beyond peradventure that the consent protections in the Bill apply to the processing of data for scientific research, that a consistent definition of consent is applied and that that consistent definition is the one with which researchers and the public are already familiar and can trust going forward.
The Minister said at the end of Second Reading, in response to concerns I and others raised about research data in general and NHS data in particular, that the provisions in this Bill
“do not alter the legal obligations that apply in relation to decisions about whether to share data”.—[Official Report, 19/11/24; col. 196.]
I accept that that may be the intention, and I have discussed this with officials, who make the same point very strongly. However, Clause 68 introduces a novel and, I suggest, significantly watered-down definition of consent in the case of scientific research. Clause 71 deploys this watered-down definition of consent to winnow down the “purpose limitation” where the processing is for the purposes of scientific research in the public interest. Taken together, this means that there has been a change in the legal obligations that apply to the need to obtain consent before data is shared.
Clause 68 amends the pivotal definition of consent in Article 4(11). Instead of consent requiring something express—freely given, specific, informed, and unambiguous through clear affirmative action—consent can now be imputed. A data subject’s consent is deemed to meet these strict requirements even when it does not, as long as the consent is given to the processing of personal data for the purposes of an area of scientific research; at the time the consent is sought, it is not possible to identify fully the purposes for which the personal data is to be processed; seeking consent in relation to the area of scientific research is consistent with generally recognised ethical standards relevant to the area of research; and, so far as the intended purposes of the processing allow, the data subject is given the opportunity to consent to processing for only part of the research. These all sound very laudable, but I believe they cut down the very strict existing standards of consent.
Proposed new paragraph 7, in Clause 68, then extends the application of this definition across the regulation:
“References in this Regulation to consent given for a specific purpose (however expressed) include consent described in paragraph 6.”
Thus, wherever you read “consent” in the regulation you can also have imputed consent as set out in proposed new paragraph 6 of Article 4. This means that “consent” within the meaning of proposed new paragraph 6(a)—i.e. the basis for lawful processing—can be imputed consent in the new way introduced by the Bill, so there is a new type of lawful basis for processing.
The Minister is entitled to disagree, of course; I expect him to say that when he comes to respond. I hope that, when he does, he will agree that we share a concern on the importance of giving researchers a clear framework, as it is this uncertainty about the legal framework that could inadvertently act as a barrier to the good research we all need. So my first argument today is that, as drafted, the Bill leaves too much room for different interpretations, which will lead to exactly the kind of uncertainty that the Minister—indeed, all of us—wish to avoid.
As we have heard already, as well as the risk of uncertainty among researchers, there is also the risk of distrust among the general public. The public rightly want and expect to have a say in what uses their data is put to. Past efforts to modernise how the NHS uses data, such as care.data, have been expensive failures, in part because they have failed to win the public’s trust. More than 3.3 million people have already opted out of NHS data sharing under the national data opt-out; that is nearly 8% of the adults who could have been part of surveys. We have talked about the value of our data and being the gold standard or gold attractor for researchers but, if we do not have all the people who could contribute, we are definitely devaluing and debasing that research. Although we want to respect people’s choice as to whether to participate, of course, this enormous vote against research reflects a pretty spectacular failure to win public trust—one that undermines the value and quality of the data, as I said.
So my second point is that watering down the rights of those whose data is held by the NHS will not put that data for research purposes on a sustainable, long-term footing. Surely, we want a different outcome this time. We cannot afford more opt-outs; we want people opting back in. I argue that this requires a different approach—one that wins the public’s trust and gains public consent. The Secretary of State for Health is correct to say that most of the public want to see the better use of health data to help the NHS and to improve the health of the nation. I agree, but he must accept that the figures show that the general public also have concerns about privacy and about private companies exploiting their data without them having a say in the matter. The way forward must be to build trust by genuinely addressing those concerns. There must not be even a whiff of watering down legal protections, so that those concerns can instead be turned into support.
This is also important because NHS healthcare includes some of the most intimate personal data. It cannot make sense for that data to have a lower standard of consent protection going forward if it is being used for research. Having a different definition of consent and a lower standard of consent will inevitably lead to confusion, uncertainty and mistrust. Taken together, these amendments seek to avoid uncertainty and distrust, as well as the risk of backlash, by making it abundantly clear that Article 4 GDPR consent protections apply despite the new wording introduced by this Bill. Further, these are the same protections that apply to other uses of data; they are identical to the protections already understood by researchers and by the public.
I turn now to a couple of the amendments in this group. Amendment 71 seeks to address the question of consent, but in a rather narrow way. I have argued that Clause 68 introduces a novel and significantly watered-down definition of consent in the case of scientific research; proposed new paragraph 7 deploys this watered-down definition to winnow down the purpose limitation. There are broader questions about the wisdom of this, which Amendments 70, 79 and 81 seek to address, but Amendment 71 focuses on the important case of NHS health data.
If the public are worried that their health data might be shared with private companies without their consent, we need an answer to that. We see from the large number of opt-outs that there is already a problem; we have also seen it recently in NHS England’s research on public attitudes to health data. This amendment would ensure that the Bill does not increase uncertainty or fuel patient distrust of plans for NHS data. It would help to build the trust that data-enabled transformation of the NHS requires.
The Government may well retort that they are not planning to share NHS patient data with commercial bodies without patient consent. That is fine, but it would be helpful if, when he comes to respond, the Minister could say that clearly and unambiguously at the Dispatch Box. However, I put it to him that, if he could accept these amendments, the law would in fact reflect that assurance and ensure that any future Government would need to come back to Parliament if they wanted to take a different approach.
It is becoming obvious that whether research is in the public interest will be the key issue that we need to resolve in this Bill, and Amendment 72 provides a proposal. The Bill makes welcome references to health research being in the public interest, but it does not explain how on earth we decide or how that requirement would actually bite. Who makes the assessment? Do we trust a rogue operator to make its own assessment of how its research is in the public interest? What would be examples of the kind of research that the Government expect this requirement to prevent? I look forward to hearing the answer to that, but perhaps it would be more helpful if the Minister responded in a letter. In the interim, this amendment seeks to introduce some procedural clarity about how research will be certified as being in the public interest. This would provide clarity and reassurance, and I commend it to the Minister.
Finally, Amendment 131 seeks to improve the appropriate safeguards that would apply to processing for research, archiving and scientific purposes, including a requirement that the data subject has given consent. This has already been touched on in another amendment, but it is a way of seeking to address the issues that Amendments 70, 79 and 81 are also trying to address. Perhaps the Government will continue to insist that this is addressing a non-existent problem because nothing in Clauses 69 or 71 waters down the consent or purpose limitation protections and therefore the safeguards themselves add nothing. However, as I have said, informed readers of the Bill are interpreting it differently, so spelling out this safeguard would add clarity and avoid uncertainty. Surely such clarity on such an important matter is worth a couple of lines of additional length in a 250-page Bill. If the Government are going to argue that our Amendment 131 adds something objectionable, let them explain what is objectionable about consent protections applying to data processing for these purposes. I beg to move.
My Lords, I support Amendments 70 to 72, which I signed, in the name of the noble Lord, Lord Stevenson of Balmacara. I absolutely share his view about the impact of Clause 68 on the definition of consent and the potential and actual mistrust among the public about sharing of their data, particularly in the health service. It is highly significant that 3.3 million people have opted out of sharing their patient data.
I also very much share the noble Lord's views about the need for a public interest requirement. In a sense, this takes us back to the discussion that we had on previous groups about whether we should add that in a broader sense, not purely for health data but for scientific research more broadly, as he specifies. I very much support what he had to say.
Broadly speaking, the common factor between my clause stand part notice and what he said is health data. Data subjects cannot make use of their data rights if they do not even know that their data is being processed. Clause 77 allows a controller reusing data under the auspices of scientific research not to notify a data subject in accordance with their rights under Articles 13 and 14 if doing so
“is impossible or would involve a disproportionate effort”.
We on these Benches believe that Clause 77 should be removed from the Bill. The safeguards are easily circumvented. The newly articulated compatibility test in new Article 8A, inserted by Clause 71, which specifies how closely related the new and existing purposes for data use need to be to permit reuse, is essentially passed automatically if the processing is conducted
“for the purposes of scientific research or historical research”.
This makes it even more necessary for the definition of scientific research to be tightened to prevent abuse.
Currently, data controllers must provide individuals with information about the collection and use of their personal data. These transparency obligations generally do not require the controller to contact each data subject. Such obligations can usually be satisfied by providing privacy information using different techniques that can reach large numbers of individuals, such as relevant websites, social media, local newspapers and so on.
My Lords, I thank noble Lords for another thought-provoking debate on consent in scientific research. First, let me set out my staunch agreement with all noble Lords that a data subject’s consent should be respected.
Regarding Amendment 70, Clause 68 reproduces the text from the current UK GDPR recitals, enabling scientists to obtain “broad consent” for an area of research from the outset and to focus on potentially life-saving research. This has the same important limitations, including that it cannot be used if the researcher already knows its specific purpose and that consent can be revoked at any point.
I turn to Amendments 71 and 72, in the name of my noble friend Lord Stevenson, on assessments for research. Requiring all research projects to be submitted for assessments could discourage or delay researchers in their important work, as various noble Lords mentioned. However, I understand that my noble friend’s main concern is around NHS data. I assure him that, if NHS data is used for research, individual patients cannot be identified unless either a patient has specifically agreed for that data to be shared or the Health Research Authority has approved an application for this information to be used, informed by advice from the independent and expert Confidentiality Advisory Group. Research projects using confidential patient data are always subject to rigorous governance, including the approval of an ethics committee; the Minister, my noble friend Lady Jones, mentioned this earlier. There are also strict controls around who can see the data and how it is used and stored. Nothing in this clause will change that approach.
I turn to Amendments 81 and 131 on consent. I understand the motivations behind adding consent as a safeguard. However, organisations such as the Health Research Authority have advised researchers against relying on consent under the UK GDPR; for instance, an imbalance of power may mean that consent cannot truly be “freely given”.
On Amendment 79, I am happy to reassure my noble friend Lord Stevenson that references to "consent" in Clause 71 do indeed fall under the definition in Article 4(11).
Lastly, I turn to Clause 77, which covers the notification exemption; we will discuss this in our debates on upcoming groups. The Government have identified a gap in the UK GDPR that may disproportionately affect researchers. Where data is not collected from the data subject, there is an exemption from notifying them if getting in contact would mean a disproportionate amount of effort. This does not apply to data collected from the data subject. However, in certain studies, such as those of degenerative neurological conditions, it can be impossible or involve a disproportionate effort to recontact data subjects to inform them of any change in the study. The Bill will therefore provide a limited exemption with strong safeguards for data subjects.
Numerous noble Lords asked various questions. They touched on matters that we care about very much: trust in the organisation asking for data; the transparency rules; public interest; societal value; the various definitions of “consent”; and, obviously, whether we can have confidence in what is collected. I will not do noble Lords’ important questions justice if I stand here and try to give answers on the fly, so I will do more than just write a letter to them: I will also ask officials to organise a technical briefing and meeting so that we can go into everyone’s concerns in detail.
With that, I hope that I have reassured noble Lords that there are strong protections in place for data subjects, including patients; and that, as such, noble Lords will feel content to withdraw or not press their amendments.
My Lords, I thank those who participated in this debate very much indeed. It went a little further than I had intended in drafting these amendments, but it has raised really important issues which I think we will probably come back to, if not later in Committee, certainly at Report.
At the heart of what we discussed, we recognise, as the noble Baroness, Lady Kidron, put it, that our data held by the NHS—if that is a better way of saying it—is valuable both in financial terms and because it should and could bring better health in future. Therefore, we value it specifically among some of the other datasets that we are talking about, because it has a returning loop in it. It is of benefit not just to the individual but to the UK as a whole, and we must respect that.
However, the worry that underlies framing it in that way is that, at some point, a tempting offer will be made by a commercial body—perhaps one is already on the table—which would generate new funding for the NHS and our health more generally, but the price obtained for that will not reflect the value that we have put into it over the years and the individual data that is being collected. That lack of trust is at the heart of what we have been talking about. In a sense, these amendments are about trust, but they are also bigger. They are also about the whole question of what it is that the Government as a whole do on our behalf in holding our data and what value they will obtain for that—something which I think we will come back to on a later amendment.
I agree with much of what was said from all sides. I am very grateful to the noble Lords, Lord Kamall and Lord Holmes, from the Opposition for joining in the debate and discussion, and their points also need to be considered. The Minister replied in a very sensible and coherent way; I will read very carefully what he said in Hansard and we accept his kind offer of a technical briefing on the Bill—that would be most valuable. I beg leave to withdraw the amendment.
My Lords, I put my name to the amendments from the noble Baroness, Lady Kidron, and will briefly support them. I state my interest as a governor of Coram, the children’s charity. One gets a strong sense of déjà vu with this Bill. It takes me back to the Online Safety Bill and the Victims and Prisoners Bill, where we spent an inordinate amount of time trying to persuade the Government that children are children and need to be treated as children, not as adults. That was hard work. They have an absolute right to be protected and to be treated differently.
I ask the Minister to spend some time, particularly when her cold is better, with some of her colleagues whom we worked alongside during the passage of those Bills in trying to persuade the then Government of the importance of children being specifically recognised and having specific safeguards. If she has time to talk to the noble Lords, Lord Ponsonby, Lord Stevenson and Lord Knight, and the noble Baroness, Lady Thornton —when she comes out of hospital, which I hope will be soon—she will have chapter, book and verse about the arguments we used, which I hope we will not have to rehearse yet again in the passage of this Bill. I ask her please to take the time to learn from that.
As the noble Baroness said, what is fundamental is not what is hinted at or implied at the Dispatch Box, but what is actually in the Bill. When it is in the Bill, you cannot wriggle out of it—it is clearly there, stating what it is there for, and it is not open to clever legal interpretation. In a sense, we are trying to future-proof the Bill by, importantly, as she said, focusing on outcomes. If you do so, you are much nearer to future-proofing than if you focus on processes, which by their very nature will be out of date by the time you have managed to understand what they are there to do.
Amendment 135 is important because the current so-called safeguard for the Information Commissioner to look after the interests of children is woefully inadequate. One proposed new section in Clause 90 talks of
“the fact that children may be less aware of the risks and consequences associated with processing of personal data and of their rights in relation to such processing”.
It is not just children; most adults do not have a clue about any of that, so to expect children to have even the remotest idea is just a non-starter. To add insult to injury, that new section begins
“the Commissioner must have regard to such of the following”—
of which the part about children is one—
“as appear to the Commissioner to be relevant in the circumstances”.
That is about as vague and weaselly as it is possible to imagine. It is not adequate in any way, shape or form.
In all conscience, I hope that will be looked at very carefully. The idea that the commissioner might in certain circumstances deem that the status and importance of children is not relevant is staggering. I cannot imagine a circumstance in which that would be the case. Again, what is in the Bill really matters.
On Amendment 94, not exempting the provision of information regarding the processing of children’s data is self-evidently extremely important. On Amendment 82, ring-fencing children’s data from being used by a controller for a different purpose again seems a no-brainer.
Amendment 196, as the noble Lord, Lord Clement-Jones, says, is a probing amendment. It seems eminently sensible when creating Acts of Parliament that in some senses overlap, particularly in the digital and online world, that the left hand should know what the right hand is doing and how two Acts may be having an effect on one another, perhaps not in ways that had been understood or foreseen when the legislation was put forward. We are looking for consistency, clarity, future-proofing and a concentration on outputs, not processes. First and foremost, we are looking for the recognition, which we fought for so hard and finally got, that children are children and need to be recognised and treated as children.
My Lords, I think we sometimes forget, because the results are often so spectacular, the hard work that has had to happen over the years to get us to where we are, particularly in relation to the Online Safety Act. It is well exemplified by the previous speaker. He put his finger on the right spot in saying that we all owe considerable respect for the work of the noble Baroness, Lady Kidron, and others. I helped a little along the way. It is extraordinary to feel that so much of this could be washed away if the Bill goes forward in its present form. I give notice that I intend to work with my colleagues on this issue because this Bill is in serious need of revision. These amendments are part of that and may need to be amplified in later stages.
I managed to sign only two of the amendments in this group. I am sorry that I did not sign the others, because they are also important. I apologise to the noble Lord, Lord Clement-Jones, for not spotting them early enough to be able to do so. I will speak to the ones I have signed, Amendments 88 and 135. I hope that the Minister will give us some hope that we will be able to see some movement on this.
The noble Lord, Lord Russell, mentioned the way in which the wording on page 113 seems not only to miss the point but to devalue the possibility of seeing protections for children well placed in the legislation. New Clause 120B(e), which talks of
“the fact that children may be less aware of the risks and consequences associated with processing of personal data and of their rights in relation to such processing”,
almost says it all for me. I do not understand how that could possibly have got through the process by which this came forward, but it seems to speak to a lack of communication between parts of government that I hoped this new Government, with their energy, would have been able to overcome. It speaks to the fact that we need to keep an eye on both sides of the equation: what is happening in the online safety world and how data that is under the control of others, not necessarily those same companies, will be processed in support or otherwise of those who might wish to behave in an improper or illegal way towards children.
At the very least, what is in these amendments needs to be brought into the Bill. In fact, other additions may need to be made. I shall certainly keep my eye on it.
My Lords, I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, for bringing forward amendments in what is a profoundly important group. For all that data is a cornerstone of innovation and development, as we have often argued in this Committee, we cannot lose sight of our responsibility to safeguard the rights and welfare of our children.
To be on the safe side, I will write to the noble Baroness. We feel that other provisions of the Bill cover the other aspects but, just to be clear on it, I will write to her. On Amendment 196 and the Online Safety Act—
I am sorry to interrupt but I am slightly puzzled by the way in which that exchange just happened. I take it from what the Minister is saying that there is no dissent, in her and the Bill team’s thinking, about children’s rights having to be given the correct priority, but she feels that the current drafting is better than what is now proposed because it does not deflect from the broader issues that she has adhered to. She has fallen into the trap, which I thought she never would do, of blaming unintended consequences; I am sure that she will want to rethink that before she comes back to the Dispatch Box.
Surely the point being made here is about the absolute need to make sure that children’s rights never get taken down because of the consideration of other requirements. They are on their own, separate and not to be mixed up with those considerations that are truly right for the commissioner—and the ICO, in its new form—to take but which should never deflect from the way children are protected. If the Minister agrees with that, could she not see some way of reaching out to be a bit closer to where the noble Baroness, Lady Kidron, is?
I absolutely recognise the importance of the issues being raised here, which is why I think I really should write: I want to make sure that whatever I say is properly recorded and that we can all go on to debate it further. I am not trying to duck the issue; this issue is just too important for me to give an off-the-cuff response on it. I am sure that we will have further discussions on this. As I say, let me put it in writing, and we can pick that up. Certainly, as I said at the beginning, our intention was to enhance children’s protection rather than deflect from it.
Moving on to Amendment 196, I thank the noble Lord, Lord Clement-Jones, and other noble Lords for raising this important issue and seeking clarity on how the provision relates to the categorisation of services in the Online Safety Act. These categories are, however, not directly related to Clause 122 of this Bill as a data preservation notice can be issued to any service provider regulated in the Online Safety Act, regardless of categorisation. A list of the relevant persons is provided in paragraphs (a) to (e) of Section 100(5) of the Act; it includes any user-to-user service, search service and ancillary service.
I absolutely understand noble Lords saying that these things should cross-reference in some way but, as far as we are concerned, they complement each other, and that protection is currently in the Online Safety Act. As I said, I will write to noble Lords and am happy to meet if that would be helpful. In the meantime, I hope that the explanations I have given are sufficient grounds for noble Lords not to press their amendments at this stage.
Data (Use and Access) Bill [HL] Debate
(1 month, 2 weeks ago)
Grand Committee
My Lords, I rise briefly to support my noble friend Lady Kidron on Amendment 137. The final comments from the noble and learned Lord, Lord Thomas, in our debate on the previous group were very apposite. We are dealing with a rapidly evolving and complex landscape, which AI is driving at warp speed. It seems absolutely fundamental that, given the panoply of different responsibilities and the level of detail that the different regulators are being asked to cover, there is on the face of what they have to do with children absolute clarity in terms of a code of practice, a code of conduct, a description of the types of outcomes that will be acceptable and a description of the types of outcomes that will be not only unacceptable but illegal. The clearer that is in the Bill, the more it will do something to future-proof the direction in which regulators will have to travel. If we are clear about what the outcomes need to be in terms of the welfare, well-being and mental health of children, that will give us some guidelines to work within as the world evolves so quickly.
My Lords, I have co-signed Amendment 137. I do not need to repeat the arguments that have already been made by those who have spoken before me on it; they were well made, as usual. Again, it seems to expose a gap in where the Government are coming from in this area of activity, which should be at the forefront of all that they do but does not appear to be so.
As has just been said, this may be as simple as putting in an initial clause right up at the front of the Bill. Of course, that reminds me of the battle royal we had with the then Online Safety Bill in trying to get up front anything that made more sense of the Bill. It was another beast that was difficult to digest, let alone understand, when we came to make amendments and bring forward discussions about it.
My frustration is that we are again talking about stuff that should have been well inside the thinking of those responsible for drafting the Bill. I do not understand why a lot of what has been said today has not already appeared in the planning for the Bill, and I do not think we will get very far by sending amendments back and forth that say the same thing again and again: we will only get the response that this is all dealt with and we should not be so trivial about it. Could we please have a meeting where we get around the table and try to hammer out exactly what it is that we see as deficient in the Bill, to set out very clearly for Ministers where we have red lines—that will make it very easy for them to understand whether they are going to meet them or not—and do it quickly?
My Lords, the debate on this group emphasises how far behind the curve we are, whether it is by including new provisions in this Bill or by bringing forward an AI Bill—which, after all, was promised in the Government’s manifesto. It emphasises that we are not moving nearly fast enough in thinking about the implications of AI. While we are doing so, I need to declare an interest as co-chair of the All-Party Parliamentary Group on AI and a consultant to DLA Piper on AI policy and regulation.
I have followed the progress of AI since 2016 in the capacity of co-chair of the all-party group and chair of the AI Select Committee. We need to move much faster on a whole range of different issues. I very much hope that the noble Lord, Lord Vallance, will be here on Wednesday, when we discuss our crawler amendments, because although the noble Lord, Lord Holmes, has tabled Amendment 211A, which deals with personality rights, there is also extreme concern about the whole area of copyright. I was tipped off by the noble Lord, Lord Stevenson, so I was slightly surprised that he did not bring it to our attention: we are clearly due the consultation on intellectual property at any moment, but there seems to be some proposal within it for personality rights themselves. Whether that is a quid pro quo for a much-weakened situation on text and data mining, I do not know, but something appears to be moving out there which may become clear later this week. It seems a strange time to issue a consultation, but I recognise that it has been somewhat delayed.
In the meantime, we are forced to put forward amendments to this Bill trying to anticipate some of the issues that artificial intelligence is increasingly giving rise to. I strongly support Amendments 92, 93, 101 and 105 put forward by the noble Viscount, Lord Colville, to prevent misuse of Clause 77 by generative AI developers; I very much support the noble Lord, Lord Holmes, in wanting to see protection for image, likeness and personality; and I very much hope that we will get a positive response from the Minister in that respect.
We have heard from the noble Baronesses, Lady Kidron and Lady Harding, and the noble Lords, Lord Russell and Lord Stevenson, all of whom have made powerful speeches on previous Bills—the then Online Safety Bill and the Data Protection and Digital Information Bill—to say that children should have special protection in data protection law. As the noble Baroness, Lady Kidron, says, we need to move on from the AADC. That was a triumph she gained during the passage of the Data Protection Act 2018, but six years later the world looks very different and young people need protection from AI models of the kind she has set out in Amendment 137. I agree with the noble Lord, Lord Stevenson, that we need to talk these things through. If it produces an amendment to this Bill that is agreed, all well and good, but it could mean an amendment or part of a new AI Bill when that comes forward. Either way, we need to think constructively in this area because protection of children in the face of generative AI models, in particular, is extremely important.
This group, looking forward to further harms that could be caused by AI and how we can mitigate them in a number of different ways, is extremely important, despite the fact that these amendments appear to deal with quite a disparate group of issues.
My Lords, given the hour, I will try to be as brief as possible. I will start by speaking to the amendments tabled in my name.
Amendment 142 seeks to prevent the Information Commissioner’s Office sending official notices via email. Official notices from the ICO will not be trivial: they relate to serious matters of data protection, such as monetary penalty notices or enforcement notices. My concern is that it is all too easy for an email to be missed. An email may be filtered into a spam folder, where it sits for weeks before being picked up. It is also possible that an email may be sent to a compromised email address, meaning one that the holder has lost control of due to a hacker. These concerns led me also to table Amendment 143, which removes the assumption that a notice sent by email had been received within 48 hours of being sent.
Additionally, I suspect I am right in saying that a great many people expect official correspondence to arrive via the post. I wonder, therefore, whether there might be a risk that people ignore an unexpected email from the ICO, concerned that it might well be a scam or a hack of some description. I, for one, am certainly deeply suspicious of unexpected but official-looking messages that arrive. I believe that official correspondence which may have legal ramifications should really be sent by post.
On some of the other amendments tabled, Amendment 135A, which seeks to introduce a measure from the DPDI Bill, makes provision for the introduction of a statement of strategic priorities by the Secretary of State that sets out the Government’s data protection priorities, to which the commissioner must have regard, and the commissioner’s duties in relation to the statement. Although I absolutely accept that this measure would create more alignment and efficiency in the way that data protection is managed, I understand the concerns that it would undermine the independence of the Information Commissioner’s Office. That in itself, of course, would tend to bear on the adequacy risk.
I do not support the stand part notices on Clauses 91 and 92. Clause 91 requires the Information Commissioner to prepare codes of practice for the processing of data, which seems a positive measure. It provides guidance to controllers, helping them to follow best practice when processing data, and is good for data subjects, as it is more likely that their data will be processed in an appropriate manner. As for Clause 92, which would effectively increase expert oversight of codes of practice, surely that would lead to more effective codes, which will benefit both controllers and data subjects.
I have some concerns about Amendment 144, which limits the Information Commissioner to sending only one reprimand to a given controller during a fixed period. If a controller or processor conducts activities that infringe the provisions of the GDPR and does so repeatedly, why should the commissioner be prevented from issuing reprimands? Indeed, what incentive does that give people to commit a minor sin and then a major one later?
I welcome Amendment 145, in the name of the noble Baroness, Lady Kidron, which would ensure that the ICO’s annual report records activities and action taken by the ICO in relation to children. This would clearly give the commissioner, parliamentarians and the data and tech industry as a whole a better understanding of how policies are affecting children and what changes may be necessary.
Finally, I turn my attention to many of the amendments tabled by the noble Lord, Lord Clement-Jones, which seek to remove the involvement of the Secretary of State from the functions of the commissioner and transfer the responsibility from government to Parliament. I absolutely understand the arguments the noble Lord advances, as persuasively as ever, but I am concerned even so that the Secretary of State for the relevant department is the best person to work with the commissioner to ensure both clarity of purpose and rapidity of decision-making.
I wanted to rise to my feet in time to stop the noble Viscount leaping forward as he gets more and more excited as we reach—I hope—possibly the last few minutes of this debate. I am freezing to death here.
I wish only to add my support to the points of the noble Baroness, Lady Kidron, on Amendment 145. It is a much overused saw, but if it is not measured, it will not get reported.
My Lords, I thank noble Lords for their consideration of the issues before us in this group. I begin with Amendment 134 from the noble Lord, Lord Clement-Jones. I can confirm that the primary duty of the commissioner will be to uphold the principal objective: securing an appropriate level of data protection, carrying out the crucial balancing test between the interests of data subjects, controllers and wider public interests, and promoting public trust and confidence in the use of personal data.
The other duties sit below this objective and do not compete with it—they do not come at the expense of upholding data protection standards. The commissioner will have to consider these duties in his work but will have discretion as to their application. Moreover, the new objectives inserted by the amendment concerning monitoring, enforcement and complaints are already covered by legislation.
I thank the noble Lord, Lord Lucas, for Amendment 135A. The amendment was a feature of the previous DPDI Bill, but the Government have decided that a statement of strategic priorities for the ICO is not necessary in this Bill. The Government will of course continue to set out their priorities in relation to data protection and other related areas and discuss them with the Information Commissioner as appropriate.
Amendment 142 from the noble Viscount, Lord Camrose, would remove the ICO’s ability to serve notices by email. We would argue that email is a fast, accessible and inexpensive method for issuing notices. I can reassure noble Lords that the ICO can serve a notice via email only if it is sent to an email address published by the recipient or where the ICO has reasonable grounds to believe that the notice will come to the attention of the person, significantly reducing the risk that emails may be missed or sent to the wrong address.
Regarding the noble Viscount’s Amendment 143, the assumption that an email notice will be received in 48 hours is reasonable and equivalent to the respective legislation of other regulators, such as the CMA and Ofcom.
I thank the noble Lord, Lord Clement-Jones, for Amendment 144 concerning the ICO’s use of reprimands. The regulator does not commonly issue multiple reprimands to the same organisation. But it is important that the ICO, as an independent regulator, has the discretion and flexibility in instances where there may be a legitimate need to issue multiple reprimands within a particular period without placing arbitrary limits on that.
Turning to Amendment 144A, the new requirements in Clause 101 will already lead to the publication of an annual report, which will include the regulator’s investigation and enforcement activity. Reporting will be categorised to ensure that where the detail of cases is not public, commercially sensitive investigations are not inadvertently shared. Splitting out reporting by country or locality would make it more difficult to protect sensitive data.
Turning to Amendment 145, with thanks to the noble Baroness, Lady Kidron, I agree on the importance of ensuring that the regulator can be held to account on this issue effectively. The new annual report in Clause 101 will cover all the ICO's regulatory activity, including that taken to uphold the rights of children. Clause 90 also requires the ICO to publish a strategy and report on how it has complied with its new statutory duties. Both of these will cover the new duty relating to children's awareness and rights, and this should include the ICO's activity to support and uphold its important age-appropriate design code.
I thank the noble Lord, Lord Clement-Jones, for Amendments 163 to 192 to Schedule 14, which establishes the governance structure of the information commission. The approach, including the responsibilities conferred on the Secretary of State, at the core of the amendments follows standard corporate governance best practice and reflects the Government’s commitment to safeguarding the independence of the regulator. This includes requiring the Secretary of State to consult the chair of the information commission before making appointments of non-executive members.
Amendments 165 and 167A would require members of the commission to be appointed to oversee specific tasks and to be from prescribed fields of expertise. Due to the commission’s broad regulatory remit, the Government consider that it would not be appropriate or helpful for the legislation to set out specific areas that should receive prominence over others. The Government are confident that the Bill will ensure that the commission has the right expertise on its board. Our approach safeguards the integrity and independence of the regulator, draws clearly on established precedent and provides appropriate oversight of its activities.
Finally, Clauses 91 and 92 were designed to ensure that the ICO’s statutory codes are consistent in their development, informed by relevant expertise and take account of their impact on those likely to be affected by them. They also ensure that codes required by the Secretary of State have the same legal effect as pre-existing codes published under the Data Protection Act.
Considering the explanations I have offered, I hope that the noble Lords, Lord Clement-Jones and Lord Lucas, the noble Viscount, Lord Camrose, and the noble Baroness, Lady Kidron, will agree not to press their amendments.
Data (Use and Access) Bill [HL] Debate
(1 week, 3 days ago)
Lords Chamber
My Lords, I have also put my name to most of the amendments. As with the noble Baroness, Lady Harding, the reason some of them do not have my name on them is that I arrived too late. Between her and my noble friend Lady Kidron, they have said everything that needs to be said very powerfully. As one who has more recently become involved in a variety of Bills—the Policing and Crime Bill, the Online Safety Bill, and the Victims and Prisoners Bill—in every case trying to fight for and clarify children's rights, I can say that it has been an uphill battle. But the reason we have been fighting for this is that we have lamentably failed to protect the interests of children for the past two decades as the world has changed around us. All of us who have children or grandchildren, nephews or nieces, or, like me, take part in the Learn with the Lords programme and go into schools, or who deal with mental health charities, are aware of the failure of government and regulators to take account, as the world changed around us, of the effect it would have on children.
In our attempts to codify and clarify in law what the dangers are and what needs to be put in place to try to prevent them, we have had an uphill struggle, regardless of the colour of government. In principle, everyone agrees. In practice, there is always a reason why it is too difficult—or, the easy way out is to say, “We will tell the regulator what our intent is, but we will leave it up to the regulator to decide”.
Our experience to date is that what was very clearly the will of Parliament when the Bill became an Act is not being made flesh by the regulator when it comes to setting out the regulation. Unless it is in an Act and it is made manifestly clear what the desired outcomes are in terms of the safety of children, the regulator—because it is difficult to do this well—will, not unreasonably, decide that if it is too difficult to do, it will settle for something that is not as good as it could be.
What we are trying to do with this set of amendments is to say to the Government up front, “We want this to be as effective as it possibly could be now”. We do not want to come back and rue the consequences of not being completely clear and of putting clear onus of responsibility on the regulators in two or three years’ time, because in another two or three years children will have important parts of their childhood deteriorating quite rapidly, with consequences that will stay with them for the rest of their lives.
My Lords, I was one of those who was up even earlier than the noble Baroness, Lady Harding, and managed to get my name down on these amendments. It puts me in a rather difficult position to be part of the government party but to seek to change what the Government have arrived at as their sticking position in relation to this issue in particular—and indeed one or two others, but I have learned to live with those.
This one caught my eye in Committee. I felt suddenly, almost exactly as the noble Lord, Lord Russell, said, a sense of discontinuity in relation to what we thought was in the Government's DNA—that is, to bring forward the right solution to the problems that we have been seeking to change in other Bills. With the then Online Safety Bill, we seemed to have an agreement around the House about what we wanted, but every time we put it back to the officials and people went away with it and came back with other versions, it got worse and not better. How children are dealt with and how important it is to make sure that they are prioritised appears to be one of those problems.
The amendments before us—and I have signed many of them, because I felt that we wanted to have a good and open debate about what we wanted here—do not need to be passed today. It seems to me that the two sides are, again, very close in what we want to achieve. I sensed from the excellent speech of the noble Baroness, Lady Kidron, that she has a very clear idea of what needs to go into this Bill to ensure that, at the very least, we do not diminish the sensible way in which we drafted the 2018 Bill. I was part of that process as well; I remember those debates very well. We got there because we hammered away at it until we found a way of finding the right words that bridged the two sides. We got closer and closer together, but sometimes we had to go even beyond what the clerks would feel comfortable with in terms of government procedure to do that. We may be here again.
When he comes to respond, can the Minister commit to us today in this House that he will bring back at Third Reading a version of what he has put forward—which I think we all would say does not quite go far enough; it needs a bit more, but not that much more—to make it meet with where we currently are and where, guided by the noble Baroness, Lady Kidron, we should be in relation to the changing circumstances in both the external world and indeed in our regulator, which of course is going to go through a huge change as it reformulates itself? We have an opportunity, but there is also a danger that we do not take it. If we weaken ourselves now, we will not be in the right position in a few years’ time. I appeal to my noble friend to think carefully about how he might manage this process for the best benefit of all of us. The House, I am sure, is united about where we want to get to. The Bill does not get us there. Government Amendment 18 is too modest in its approach, but it does not need a lot to get it there. I think there is a way forward that we do not need to divide on. I hope the Minister will take the advice that has been given.
My Lords, we have heard some of the really consistent advocates for children’s online protection today. I must say that I had not realised that the opportunity of signing the amendments of the noble Baroness, Lady Kidron, was rather like getting hold of Taylor Swift tickets—clearly, there was massive competition and rightly so. I pay tribute not only to the speakers today but in particular to the noble Baroness for all her campaigning, particularly with 5Rights, on online child protection.
All these amendments are important for protecting children’s data, because they address concerns about data misuse and the need for heightened protection for children in the digital environment, with enhanced oversight and accountability in the processing of children’s data. I shall not say very much. If the noble Baroness pushes Amendment 20 to a vote, I want to make sure that we have time before the dinner hour to do so, which means going through the next group very quickly. I very much hope that we will get a satisfactory answer from the Minister. The sage advice from the noble Lord, Lord Stevenson, hit the button exactly.
Amendment 20 is particularly important in this context. It seeks to exclude children from the new provisions on purpose limitation for further processing under Article 8A. As the noble Baroness explains, that means that personal data originally collected from a child with consent for a specific purpose could not be reused for a different, incompatible purpose without obtaining fresh consent, even if the child is now an adult. In my view, that is core. I hope the Minister will come back in the way that has been requested by the noble Lord, Lord Stevenson, so we do not have to have a vote. However, we will support the noble Baroness if she wishes to test the opinion of the House.
I am presuming a little here that the Minister’s lack of experience in the procedures of the House is holding him back, but I know he is getting some advice from his left. The key thing is that we will not be able to discuss this again in this House unless he agrees that he will bring forward an amendment. We do not have to specify today what that amendment will be. It might not be satisfactory, and we might have to vote against it anyway. But the key is that he has to say this now, and the clerk has to nod in agreement that he has covered the ground properly.
We have done this before on a number of other Bills, so we know the rules. If the Minister can do that, we can have the conversations he is talking about. We have just heard the noble Baroness, Lady Kidron, explain in a very graceful way that this will be from a blank sheet of paper so that we can build something that will command the consensus of the House. We did it on the Online Safety Bill; we can do it here. Please will he say those words?
I am advised that I should say that I am happy for the amendment to be brought forward, but not as a government amendment. We are happy to hear an amendment from the noble Baroness at Third Reading.
Let us be quite clear about this. It does not have to be a government amendment, but the Government Minister has to agree that it can be brought forward.
I thank the Minister for that very generous offer. I also thank the noble Lord, Lord Stevenson, for his incredible support. I note that, coming from the Government Benches, that is a very difficult thing to do, and I really appreciate it. On the basis that we are to have an amendment at Third Reading, whether written by me with government and opposition help or by the Government, that will address these fundamental concerns set out by noble Lords, I will not press this amendment today.
These are not small matters. The implementation of the age-appropriate design code depends on some of the things being resolved in the Bill. There is no equality of arms here. A child, whether five or 15, is no match for the billions of dollars spent hijacking their attention, their self-esteem and their body. We have to, in these moments as a House, choose David over Goliath. I thank the Minister and all the supporters in this House —the “Lords tech team”, as we have been called in the press. With that, I beg leave to withdraw the amendment.
Data (Use and Access) Bill [HL] Debate
(1 week, 3 days ago)
Lords Chamber
My Lords, I do not think the noble Baroness, Lady Harding, lost the audience at all; she made an excellent case. Before speaking in support of the noble Baroness, I should say, "Blink, and you lose a whole group of amendments". We seem to have completely lost sight of the group starting with Amendment 19—I know the noble Lord, Lord Holmes, is not here—and including Amendments 23, 74 and government Amendment 76, which seems to have been overlooked. I suggest that we degroup next week and come back to Amendments 74 and 76. I do not know what will happen to Amendment 23; I am sure there is a cunning plan on the Opposition Front Bench to reinstate that in some shape or form. I just thought I would gently point that out, since we are speeding along and forgetting some of the very valuable amendments that have been tabled.
I very much support, as I did in Committee, what the noble Baroness, Lady Harding, said about Amendment 24, which aims to clarify the use of open electoral register data for direct marketing. The core issue is the interpretation of Article 14 of the GDPR, specifically regarding the disproportionate effort exemption. The current interpretation, influenced by recent tribunal rulings, suggests that companies using open electoral register—OER—data would need to notify every individual whose data is used, even if they have not opted out. As the noble Baroness, Lady Harding, implied, notifying millions of individuals who have not opted out is unnecessary and burdensome. Citizens are generally aware of the OER system, and those who do not opt out reasonably expect to receive direct marketing materials. The current interpretation leads to excessive, unhelpful notifications.
There are issues about financial viability. Requiring individual notifications for the entire OER would be financially prohibitive for companies, potentially leading them to cease using the register altogether. On respect for citizens’ choice, around 37% of voters choose not to opt out of OER use for direct marketing, indicating their consent to such use. The amendment upholds this choice by exempting companies from notifying those individuals, which aligns with the GDPR’s principle of respecting data subject consent.
On clarity and certainty, Amendment 24 provides clear exemptions for OER data use, offering legal certainty for companies while maintaining data privacy and adequacy. This addresses the concerns about those very important tribunal rulings creating ambiguity and potentially disrupting legitimate data use. In essence, Amendment 24 seeks to reconcile the use of OER data for direct marketing with the principles of transparency and data subject rights. On that basis, we on these Benches support it.
I turn to my amendment, which seeks a soft opt-in for charities. As we discussed in Committee, a soft opt-in in Regulation 22 of the Privacy and Electronic Communications (EC Directive) Regulations 2003 allows organisations to send electronic mail marketing to existing customers without their consent, provided that the communication is for similar products and services and the messages include an “unsubscribe” link. The soft opt-in currently does not apply to non-commercial organisations such as charities and membership organisations. The Data & Marketing Association estimates that extending the soft opt-in to charities would
“increase … annual donations in the UK by £290 million”.
Extending the soft opt-in as proposed in both the Minister’s and my amendment would provide charities with a level playing field, as businesses have enjoyed this benefit since the introduction of the Privacy and Electronic Communications Regulations. Charities across the UK support this change. For example, the CEO of Mind stated:
“Mind’s ability to reach people who care about mental health is vital. We cannot deliver life changing mental health services without the financial support we receive from the public”.
Oxfam’s individual engagement director noted:
“It’s now time to finally level the playing field for charities too and to allow them to similarly engage their passionate and committed audiences”.
Topically, too, this amendment is crucial to help charities overcome the financial challenges they face due to the cost of living crisis and the recent increase in employer national insurance contributions. So I am delighted, as I know many other charities will be, that the Government have proposed Amendment 49, which achieves the same effect as my Amendment 50.
My Lords, I declare an interest that my younger daughter works for a charity which will rely heavily on the amendments that have just been discussed by the noble Lord, Lord Clement-Jones.
I want to explain that my support for the amendment moved by the noble Baroness, Lady Harding, was not inspired by any quid pro quo for earlier support elsewhere—certainly not. Looking through the information she had provided, and thinking about the issue and what she said in her speech today, it seemed there was an obvious injustice happening. It seemed wrong, in a period when we were trying to support growth, that we could not see our way through it. It was in that spirit that I suggested we should push on with it and bring it back on Report, and I am very happy to support it.
I do not want to try the patience of the House at this late hour. I am unhappy about Clause 77 as a whole. Had I had the opportunity, we could have debated it in Committee; unfortunately, I was double-booked, so I was unable to do so. Now we are on Report, which does not really provide a platform for discussing the exclusion of the clause.
However, the noble Baroness has provided an opportunity for me to make the point that combining data is the weak point, the point at which we lose control. For that reason, I am unhappy about this amendment. We need to keep high levels of vigilance with regard to the ability to take data from one area and apply it in another, because that is when personal privacy disappears.
Data (Use and Access) Bill [HL] Debate
(3 days, 1 hour ago)
Lords Chamber
My Lords, I strongly support Amendments 44A and 61 to 65 in the name of the noble Baroness, Lady Kidron, who is to be congratulated on raising this incredibly important and timely subject, on her doughty leadership on these issues, and on an absolutely first-class speech. I regret that I was unable to take part in Committee.
I will talk about the profound significance of these amendments for the media, although they are equally important across all the creative industries, which I know we will hear about. I declare my interest as deputy chairman of the Telegraph Media Group and note my other interests in the register.
The key point is that an effective, enforceable and comprehensive copyright regime is absolutely fundamental to the sustainability of a free, independent media. Without it, the media cannot survive. Publishers have to invest huge amounts of money in high-quality journalism, investigative reporting, world-class comment and content. They can do so because copyright laws protect this content, ensuring the commercial viability of publishers—print and broadcast—as well as the livelihoods of individual journalists and freelancers.
We talk a lot in this House about the threats to the free media resulting from digital, which smashed to pieces the business model that once sustained publishing and quality journalism. Publishers from across the spectrum have found innovative ways to adapt to that and produce new paths to commercial success to maintain their investment in independent investigation and reporting, which is the very lifeblood of a democracy. Parliament, with cross-party support, has assisted through the Digital Markets, Competition and Consumers Act, which establishes a tough competition regime to control the untrammelled power of vast, unaccountable platforms. But just when the media has been successfully adapting to the new world, along comes a far graver threat—AI—and government proposals flying in the face of the DMCC Act to weaken, through a sweeping text and data-mining exception, the UK’s gold-standard copyright regime, which is the absolute bedrock of quality, independent, regulated media.
I know how strongly noble Lords opposite and from across the House value the fundamental role our free media plays in our democratic society, because without it, all of our freedom is in peril. The Bill and the connected government consultation will either help it or kill it; I am afraid it is as stark as that. Of course I welcome the Government’s apparent aim to provide transparency and facilitate licensing, but their preferred option of an exception—on which there has been no impact assessment, as the noble Baroness, Lady Kidron, said—is fundamentally flawed and wholly impractical.
Instead, with these amendments, we need to ensure that three things happen to make investment in journalism possible through an effective legal regime protecting copyright, creativity and innovation: transparency, the power of control over how news content is used, and fair remuneration. Only that will drive the dynamic licensing market that is necessary to ensure both the media and AI sectors flourish and grow. These imaginative amendments will achieve that by expanding UK copyright law to cover any AI model linked to the UK, compelling, in a strikingly simple way, AI firms to provide information about how they scrape content and what they scrape, and ensuring we have the enforcement powers necessary to make big tech—which is so adept at arrogantly ignoring what it does not like and what this House says—take notice. That is why I will support these amendments, and I am proud to do so.
I must add that I am deeply disappointed that the long-standing commitment of my party to upholding the values of a free press and supporting the sustainability of the British media has not extended to formal support for these amendments. It is incredibly short-sighted.
If these amendments pass, as I hope they will, this legislation can complete a landmark trio of laws—with the Online Safety Act and the DMCC Act—to make the giant platforms regulated and accountable. Like others in this debate, I want to make it clear that I support the noble Baroness’s absolutely vital amendments not because I am anti-AI but because I am pro free independent media, pro the creativity which fuels it, and pro the commercial foundations that support it.
If these amendments are successful, we can create a situation where the tech and AI sectors can flourish alongside the creative industries, thereby powering economic growth between them. Because of the vital role the media plays in our democracy, I genuinely believe that this is one of the most crucial debates that we will have in this Parliament. I have this stark warning: without adequate transparency, control and reward, publishers will no longer be able to invest as they have in the creation of the original, high-quality investigative content on which our democracy and the accountability of those in power are based. Without that, our democracy will die in the dark at the hands of Silicon Valley, as we become dependent on the morass of fake news and social media clickbait. I strongly urge all noble Lords to support the amendments.
I am grateful to the noble Lord, Lord Black, for daring to respond to the wonderful speech that opened the debate; I thought I might come in immediately afterwards, but I was terrified by it, so I decided that I would shelter on these Benches and gather my strength before I could begin to respond.
I feel that I have to speak because I am a member of the governing party, which is against these amendments. However, I have signed up to them because I have interests in the media—which I declare; I suppose I should also declare that I have a minor copyright, but that is very small compared with the ones we have already heard about—and because I feel very strongly that we will get ourselves into even more trouble unless action is taken quickly. I have a very clear view of the Government’s proposals, thanks to a meeting with my noble friend the Minister yesterday, where he went through, in detail, some of the issues and revealed some of the thinking behind them; I hope that he will come back to the points he made to me when he comes to respond.
There is no doubt that the use of a copyright work without the consent of the copyright owner in the United Kingdom is an infringement, unless it is “fair dealing” under UK copyright law. However, because of developments in technology—the crawlers, scrapers and GAI that we have been hearing about—there is now a new use of a huge number of copyright works for the training of algorithms. That has raised questions about whether, and if so how, such usage should be legislated for: as “fair dealing”, if it is to be so, or in some other way, if there is indeed one.
It is right, therefore, for the Government to have required the IPO to carry out a consultation on copyright and AI, which we have been talking about. However, given the alarm and concern evident in the creative sector, we certainly regret the delay in bringing forward this consultation and we are very concerned about its limited scope. Looking at it from a long way away, it seems that this is as much a competition issue as it is a copyright issue. It seems to me and to many others, as we have heard, that the IPO, by including in the consultation document a proposed approach described as an “exception with rights reservation”, has made a very substantial mistake.
This may just be a straw-person device designed to generate more responses, but, if so, it was a bad misjudgement. Does it not make the whole consultation exercise completely wasteful and completely pointless to respond to? When my noble friend the Minister comes to respond, I hope that he, notwithstanding that proposed approach, will confirm that, as far as the Government are concerned, this is a genuine consultation and that all the possible options outlined by the IPO—and any other solutions brought forward during the consultation—will be properly considered on their merits and in the light of the responses to the consultation.
What the creative industries are telling us—they have been united and vehement about this issue, as has already been described, in a way that I have never seen before—is that they must have transparency about what material is being scraped, the right to opt in to the TDMs taking place and a proper licensing system with fair remuneration for the copyright material used. The question of whether the GAI developers should be allowed to use copyright content, with or without the permission of the copyright owner, is a nuanced one, as a decision either way will have very wide-ranging ramifications. However, as we have heard, this issue is already affecting the livelihood of our creative sector—the one that, also as we have heard, we desperately need if we are to support a sustainable creative economy and provide the unbiased information, quality education and British-based entertainment that we all value and want to see flourish.
We understand the need to ensure that the companies that want access to high-quality data and copyright material to train their AI models respect, and will be happy to abide by, any new copyright or competition regulations that may be required. However, the proposals we have heard about today—the ones that would come from the consultation, if we have to delay—will probably be very similar to the amendments before the House, which are modest and fair. We should surely not want to work with companies that will not abide by such simple requirements.
My Lords, I support Amendment 44A and the consequential amendments in this group in the name of my noble friend Lady Kidron, whose speech has, I think, moved the whole Committee across all Benches.
Data (Use and Access) Bill [HL] Debate
Lord Stevenson of Balmacara (Labour - Life peer), with the Department for Science, Innovation & Technology
(3 days, 1 hour ago)
Lords Chamber
My Lords, I am pleased to follow the noble Baroness, Lady Morgan, who did so much during the Online Safety Bill—now Act—to champion the issues that are now before us. She should get full credit for the first steps she took. I think I have said it before, and I will say it again in her presence: we thought we had achieved much of what we are talking about today in the final wind-up of that Bill, but we had to swap it for a slightly bigger prize and it fell down slightly on the list, so I feel very guilty about this and want to help somehow to redress the deficit that was created.
I do not want to get, in this House, any reputation for being a person who asks geeky questions about Third Reading issues, but the Minister will know that getting access to debates at Third Reading is tricky. It often requires the graven head of the clerk to nod very slowly at an appropriate moment, and I wonder if we could just rehearse that slightly so that we are quite clear exactly what the noble Baroness, Lady Morgan, was saying.
Am I right in saying that the intention—and good intentions are great—is that there will be a government amendment at Third Reading? Since it is being produced by the Government, there is not an issue for the clerk to nod at, because that is allowed. If there is a government amendment dealing with all the issues we raised today, then we are all in a good place. It is right that this House, which has done so much to come together to create it, gets the credit for this Bill going down to the Commons. That is appropriate and something that we should get right.
In the absence of such an amendment—and I recognise that there are difficulties about drafting, and it may well be that we have a very short time between Report and Third Reading—would it not be appropriate for the Minister to say to the clerk that it is his intention that, if necessary, the noble Baroness, Lady Owen, may bring forward an amendment on these issues so that at least we get, if not all of the package, the parts that are relevant and most important to it in the Bill as it leaves this House? That would be helpful all round, and it would be in accordance with the sentiment of the House.
My Lords, I share in the congratulations of my noble friend Lady Owen. It has taken me about 10 years to begin to understand how this House works and it has taken her about 10 minutes.
I want to pursue something which bewilders me about this set of amendments: the amendment tabled by the noble Baroness, Lady Gohir. Audio fakery has been with us for many years, whereas video deepfakes are relatively new, so I do not understand why we are talking about a different Bill in relation to audio deepfakes.
My Lords, the opening amendment in this group is a government amendment that we are withdrawing, so we are setting up the debate. There could be a similar mechanism at Third Reading. I do not know how it will actually be worked out, but there is an undertaking on behalf of the Government to say how far we have got on the solicitation issue, with a view to moving amendments in the other place.
Before the Minister sits down, that was exactly the point I was trying to make, and I am very grateful to the noble Lord, Lord Pannick, for raising it again. It does need the Minister to say to the clerk that it is possible for the noble Baroness, Lady Owen, to bring an amendment, if necessary, at Third Reading. If the Minister could repeat that at the Dispatch Box, I think we would both be happy.
Yes. If the noble Baroness wants to bring back a similar amendment on this issue, that indeed can be debated at Third Reading.
My Lords, I speak in support of the noble Baroness, Lady Kidron, on Amendment 58, to which I have also put my name. Given the time, I will speak only about NHS datasets.
There have been three important developments since the Committee stage of this Bill in mid-December: the 43rd annual JP Morgan healthcare conference in San Francisco in mid-January, the launch of the AI Opportunities Action Plan by the Prime Minister on Monday 13 January and the announcement of the Stargate project in the White House the day after President Trump’s inauguration.
Taking these in reverse chronological order, it is not clear exactly how the Stargate project will be funded, but several US big tech companies and SoftBank have pledged tens of billions of dollars. At least $100 billion will be available to build the infrastructure for next-generation AI, and that may even rise to $500 billion over the next four years.
The UK cannot match these sums. The AI Opportunities Action Plan instead lays out how the UK can compete by using its own advantages: a long track record of world-leading AI research in our universities and some unique, hugely valuable datasets.
At the JP Morgan conference in San Francisco, senior NHS management had more than 40 meetings with AI companies. These companies all wanted to know one thing: how and when they could access NHS datasets.
It is not surprising, therefore, that it was reported in November that the national federated data platform would soon be used to train different types of AI models. The two models mentioned were OpenAI’s proprietary ChatGPT and Google’s medical AI, Med-Gemini, based on Google’s proprietary large language model, Gemini. Presumably, these models will be fine-tuned using the data stored in the federated data platform.
Amendment 58 is not about restricting access to UK datasets by OpenAI, Google or any other US big tech company. Instead, it seeks to maximise their long-term value, driven by strategic goals rather than short-term, opportunistic gains. By classifying valuable public sector datasets as sovereign data assets, we can ensure that the data is made available under controlled conditions, not only to public sector employees and researchers but to industry, including US big tech companies.
We should expect a financial return when industry is given access to a sovereign dataset. A first condition is a business model such that income is generated for the relevant public body, in this case the NHS, from the access fees paid by the companies that will be the authorised licence holders.
A second condition is signposted in the AI Opportunities Action Plan, whose recommendations have all been accepted by the Government. In the third section of the action plan, “Secure our future with homegrown AI”, Matt Clifford, the author of the plan, writes that
“we must be an AI maker, not just an AI taker: we need companies … that will be our UK national champions … Generating national champions will require a more activist approach”.
Part of this activist approach should be to give companies and organisations headquartered in the UK preferential terms of access to our sovereign data assets.
These datasets already exist in the NHS as minimum viable products, so we cannot afford to delay. AI companies are keen to access data in the federated data platform, which is NHS England’s responsibility, or in the secure data environments set up by the National Institute for Health and Care Research, NIHR.
I urge the Government to accept the principles of this amendment as they will provide the framework needed now to support NHS England and NIHR in their negotiations with AI companies.
I have signed Amendment 58. I also support the other amendment spoken to by the noble Baroness, although I did not get around to signing it. They both speak to the same questions, some of which have been touched on by both previous speakers.
My route into this was perhaps a little less analytic. I used to worry about the comment lots of people used to make, wittily, that data was the new oil, without really thinking about what that meant or what it could mean. It began to settle in my mind that, if indeed data is an asset, why is it not carried on people’s balance sheets? Why does data held by companies, or even by the Government, not feature in some sort of valuation? Just like oil held by a company or privately, it will eventually be used in some way; that releases revenue which has to be accounted for, and there will be an accounting treatment. Yet, as an accountant, I have never seen a company’s accounts put a value on data. That is where I came from.
A sovereign data approach, which labels assets of value to the economy held by the country rather than a company, seems to be a way of trying to get into language what is more of an accounting approach than perhaps we need to spend time on in this debate. The noble Baroness, Lady Kidron, has gone through the amendment in a way that explains the process, the protection and the idea that it should be valued regularly and able to account for any returns it makes. We have also heard about the way it features in other publications.
I want to take a slightly different part of the AI Opportunities Action Plan, which talks about data and states:
“We should seek to responsibly unlock both public and private data sets to enable innovation by UK startups and researchers and to attract international talent and capital. As part of this, government needs to develop a more sophisticated understanding of the value of the data it holds, how this value can be responsibly realised, and how to ensure the preservation of public trust across all its work to unlock its data assets”.
These are very wise words.
I end by saying that I was very struck by the figures released recently about the number of people who opted out of the NHS’s data collection. I think there are Members present who may well be guilty of such a process. I of course am happy to have my data used in a way that will provide benefit, but I do recognise the risks if it is not properly documented and if people are not aware of what they are giving up or offering in return for the value that will be extracted from it.
I am sure we all want more research and better research. We want research that will yield results. We also want value and to be sure that the data we have given up, which is held on our behalf by various agencies, is properly managed. These amendments seem to provide a way forward and I recommend them.
My Lords, I support Amendments 58 and 71, which address what I consider to be a fundamental oversight in our nation’s stewardship of public data assets.
While these amendments embrace intentionally broad definitions of sovereign data assets and a national data library, their purpose is precise: to recognise, protect and optimise the public value of these critical national resources for generations to come. The amendments’ dual emphasis on robust consent mechanisms and a transparent licensing framework—one that provides preferential access to UK entities—strikes a careful balance between fostering public trust and safeguarding our national interests.
Central to these amendments is the requirement for the Secretary of State to provide comprehensive reporting on both the current value and projected returns from these assets. This addresses a striking accountability gap in our governance framework. While the National Audit Office maintains rigorous oversight of our physical infrastructure, previous Administrations have failed to adequately account for the taxpayers’ substantial investment in public data infrastructure and intangible or knowledge assets.
Consider this striking disparity: Ernst & Young’s 2019 analysis projected that a curated NHS dataset could generate £5 billion annually for the UK, while delivering £4.6 billion in patient benefits through enhanced infrastructure. Yet we lack robust mechanisms to track whether these substantial benefits materialise or are captured and flow back into our healthcare system. This speaks directly to the Tony Blair Institute’s prescient call last year, endorsed by none other than the Minister, the noble Lord, Lord Vallance, for the establishment of an NHS data trust or comparable stewardship vehicle.
As we navigate an AI revolution, we must shift our focus from simply managing risks to proactively harnessing opportunities for social impact and economic growth. This raises two fundamental questions. How can we leverage this technological transformation to maximise public benefit, and how will Parliament effectively scrutinise future trade agreements, particularly with nations such as the United States, without established evaluation methodologies or transparent licensing systems for our valuable data assets?
The British public, already bearing a significant tax burden to fund public services, deserves assurance that our valuable digital assets will not be transferred today, only to be transformed into expensive treatments tomorrow, benefiting companies that pay tax overseas. Amendments 58 and 71 provide essential safeguards against the inadvertent undervaluation or transfer of these critical national assets. They ensure proper stewardship of our digital resources for the public good, and I therefore support the intentions behind these amendments.