My Lords, unusually, I rise to move an amendment, Amendment 138. For the second time in Committee, I find myself heading a group when I know that the noble Baroness, Lady Kidron, will be much better qualified to introduce the subject. Indeed, she has an amendment, Amendment 141, which is far preferable in many ways to mine.
Amendment 138 is designed to ensure that the Information Commissioner produces a code of practice specific to children up to the age of 18 for the purposes of UK law and Convention 108, and pupils as defined by the Education Act 1996, who may be up to the age of 19 or, with special educational needs, up to 25 in the education sector. The charity Data, Tech & Black Communities put it this way in a recent letter to the noble Baroness, Lady Jones:
“We recently completed a community research project examining the use of EdTech in Birmingham schools. This project brought us into contact with over 100 people … including parents, school staff and community members. A key finding was the need to make it easier for those with stewardship responsibility for children’s data, to fulfil this duty. Even with current data protection rights, parents and guardians struggle to make inquiries (of schools, EdTech companies and even DfE) about the purpose behind the collection of some of their children’s data, clarity about how it is used (or re-used) or how long data will be retained for. ‘Opting out’ on behalf of their children can be just as challenging. All of which militates against nuanced decision-making about how best to protect children’s short and long-term interests … This is why we are in support of an ICO Code of Practice for Educational Settings that would enable school staff, parents and learners, the EdTech industry and researchers to responsibly collect, share and make use of children’s data in ways that support the latter’s agency over their ‘digital selves’ and more importantly, will support their flourishing”.
The duties of settings and data processors, and the rights appropriate to the stage of education and to children’s capacity, need clarity and consistency. Staff need confidence to access and use data appropriately within the law. As the UNCRC’s General Comment No. 16 (2013) on State Obligations Regarding the Impact of the Business Sector on Children’s Rights set out over a decade ago,
“the realization of children’s rights is not an automatic consequence of economic growth and business enterprises can also negatively impact children’s rights”.
The educational setting differs from purely commercial interactions, and not only because the data subjects are children. It is more complex because of the disempowered environment and the imbalance of power between the authority, the parents and the child. A further complication is that parents’ and children’s rights are interlinked, as exemplified in the right to education described in UDHR Article 26(3), which states:
“Parents have a prior right to choose the kind of education that shall be given to their children.”
A code is needed because explicit safeguards that the GDPR requires in several places were left out of the drafting of the UK Data Protection Act 2018. Clause 80 of the Bill—“Automated decision-making”—does not address the necessary safeguards of GDPR Article 23(1) for children. Furthermore, removing the protections of the balancing test under the recognised legitimate interest condition will create new risks. Clauses on further processing and changes to purpose limitation are inappropriately wide without child-specific safeguards. The volume, sensitivity and intrusiveness of identifying personal data collected in educational settings only increase, while the protections are only ever reduced.
Obligations specific to children’s data, especially
“solely automated decision-making and profiling”
and exceptions, need to be consistent, with clear safeguards by design where they restrict fundamental freedoms. What does that mean in practice for children, where teachers are assumed to be the rights bearers in loco parentis? The need for compliance with human rights, security, and health and safety standards, among others proportionate to the risks of data processing, while respecting the UK Government’s accessibility requirements, should be self-evident and adopted in a code of practice, as recommended by the 5Rights Foundation’s Digital Futures Commission in its blueprint for education data governance.
The Council of Europe Strategy for the Rights of the Child (2022-2027) and the UNCRC General Comment No. 25 on Children’s Rights and the Digital Environment make it clear that
“children have the right to be heard and participate in decisions affecting them”.
They recognise that
“capacity matters, in accordance with their age and maturity. In particular attention should be paid to empowering children in vulnerable situations, such as children with disabilities.”
Paragraph 75 recognises that surveillance in educational settings should not take place without the right to object and that teachers need training to keep up with technological developments.
Young people themselves were not invited to participate in the development of this Bill, and their views have not been considered. However, a small sample of parent and pupil voices was captured in the Responsible Technology Adoption Unit’s public engagement work with the DfE in 2024. The findings back those of Defend Digital Me’s Survation poll in 2018 and show that parents do not know that the DfE already holds named pupil records, collected without their knowledge or permission, or that the data is given away to be reused by hundreds of commercial companies, the DWP, the Home Office and the police. It stated:
“There was widespread consensus that work and data should not be used without parents’ and/or pupils’ explicit agreement. Parents, in particular, stressed the need for clear and comprehensive information about pupil work and data use and any potential risks relating to data security and privacy breaches.”
A code of practice is needed to explain the law and make it work as intended for everyone. The aim of a code of practice for educational settings would be to create a mechanism by which controllers and processors can demonstrate compliance with the legislation, or by which certification methods can be approved. It would give providers confidence in consistent and clear standards and would be good for the edtech sector. It would allow children, parents, school staff and systems administrators to build trust in safe, fair and transparent practice so that their rights are fully met by design and default.
Further, schools give children’s personal data to many commercial companies during a child’s education—not based on consent but assumed for the performance of a task carried out in the public interest. A code should clarify any boundaries of this lawful basis for commercial purposes, where it is an obligation on parents to provide the data and what this means for the child on reaching maturity or after leaving the educational setting.
Again, a code should help companies understand “data protection by design and default” in practice, what amounts to a “significant legal effect”, the edges of “public interest” in data transfers to a third country, and how special categories of data affect children in schools. A code should also support children and families in understanding the responsibilities of controllers and processors for the execution or limitation of their own rights. It would set out the responsibilities of software platforms that profile users’ metadata to share with third parties, or of commercial apps signed up for in schools that serve adverts during use.
I hope that I have explained exactly why we believe that a code of practice is required in educational settings. I beg to move.
My Lords, I support and have added my name to Amendment 138 in the name of the noble Lord, Lord Clement-Jones. I will also speak to Amendment 141 in my name and those of the noble Lords, Lord Knight and Lord Russell, and the noble Baroness, Lady Harding.
Both these amendments propose a code of practice to address the use of children’s data in the context of education. Indeed, they have much in common. Having heard the noble Lord, Lord Clement-Jones, I find that I have much in common with what he said; I associate myself entirely with his remarks and hope that mine will build on them. Both amendments point to the same problem: children’s data is scandalously treated in our schools, and educators need support. This is a persistent and known failure that both the DfE and the ICO have failed to confront over a period of some years.
Amendment 141 seeks to give a sense of exactly what an education code should cover. In doing so, it builds on the work of the aforementioned Digital Futures for Children centre at the LSE, which I chair, the work of Defend Digital Me, the excellent work of academics at UCL, and much of the work relating to education presented to the UN tech envoy in the course of drafting the UN global digital compact.
Subsection (1) of the proposed new clause would require the ICO to prepare a code of practice in connection with the provision of education. Subsection (2) sets out what the ICO would have to take into account, such as that education provision includes school management and safeguarding as well as learning; the different settings in which it takes place; the need for transparency and evidence of efficacy; and all the issues already mentioned, including profiling, transparency, safety, security, parental involvement and the provision of counselling services.
Subsection (3) would require the ICO to have regard to children’s entitlement to a higher standard of protection—which we are working so hard in Committee to protect—their rights under the UNCRC and their different ages and stages of development. Importantly, it also refers to the need and desire to support innovation in education and the need to ensure that the benefits derived from the use of UK children’s data accrue to the UK.
Subsection (4) lists those whom the commissioner would have to consult, and subsection (5) sets out when data processors and controllers would be subject to the code. Subsection (6) proposes a certification scheme for edtech services to demonstrate compliance with UK GDPR and the code. Subsection (7) would require edtech service and product providers to evidence compliance—importantly, transferring that responsibility from schools to providers. Subsection (8) simply defines the terms.
A code of practice is an enabler. It levels the playing field, sets terms for innovators, creates sandbox or research environments, protects children and supports schools. It offers a particularly attractive environment for developing the better digital world that we would all like to see, since schools are identifiable communities in which changes and outcomes could be measured.
My Lords, I shall also speak to Amendment 198 in my name and register my support for the amendments in the name of the noble Lord, Lord Bethell, to which I have added my name. Independent research access is a very welcome addition to the Bill by the Government. It was a key recommendation of the pre-legislative scrutiny committee on the Online Safety Bill in 2021 and I know that I speak for many colleagues in the academic field, as well as many civil society organisations, who are delighted by its swift and definitive inclusion in the Bill.
The objective of these amendments is not to derail the Government’s plans, but rather to ensure that they happen and to make the regime work for children and the UK’s world-class academic institutions and stellar civil society organisations, ensuring that we can all do high-quality research about emergent threats to children and society more broadly.
Amendment 197 would ensure that the provisions in Clause 123 are acted on by removing the Government’s discretion as to whether or not they introduce regulations. It would also impose a deadline of 12 months for the Government to do so. I have said this before, but I have learnt the hard way that good intentions and warm words from the Dispatch Box are a poor substitute for clear provisions in law. A quick search of the Bill reveals that there are 119 uses of the word “must” and 262 uses of the word “may”. Clearly, they are being used to create different obligations or expectations. The Minister may say that this amendment is not needed and that, for all intents and purposes, we can take the word “may” as a “must” or a “will”, but I would prefer to see it in black and white. In fact, if the Government have reserved discretion on this point, I would like to understand exactly what that means for research.
Amendment 198 seeks to ensure that the regulations will enable independent researchers to research how online risks and harms impact different groups, especially vulnerable users, including children. We have already discussed the fact that online harms are not experienced equally by users: those who are most vulnerable offline are often the most vulnerable online. In an earlier debate, I talked about the frustrations experienced when tech companies do not report data according to age groups. Failing to do so makes it possible to hide the reality that children are disproportionately impacted by certain risks and harms. This amendment would ensure that children and other vulnerable groups can be studied in isolation, rather than leaving independent researchers to pick through generalised datasets to uncover where harm is amplified and for whom.
I will leave the noble Lord, Lord Bethell, to explain his amendments, but I will just say why it is so important that we have a clear path to researcher access. It is fundamental to the success of the online safety regime.
Many will remember Frances Haugen, the Facebook whistleblower, who revealed the extent to which Meta knew, through its own detailed internal research, how harmful its platforms actually are to young people. Meta’s own research showed that:
“We make body image issues worse for one in three girls”.
Some 32% of teen girls said that, when they have felt bad about their bodies, Instagram has made them feel worse. Were it not for a whistleblower, this research would never have been made public.
After a series of evidence disclosures to US courts as a result of legal action by attorneys-general at state level, we have heard whistleblowers suggest, in evidence given to the EU, that there will be a new culture in some Silicon Valley firms—no research and no emails. If you have something to say, you will have to say it in person, so that it cannot be used against the company in court. The irony of that is palpable given the struggle that we are having about user privacy, but it points to the need for our research regime to be watertight. If the companies are not looking at the impact of their own services, we must. I hope that the Government continue their leadership on this issue and accept the amendments in the spirit in which they are put forward.
I have another point that I want the Minister to clarify. I apologise, because I raised this in a private meeting but I have forgotten the answer. Given the number of regulatory investigations, proceedings and civil litigations in which tech companies are engaged, I would like some comfort about the legal exemption in these clauses. I want to understand whether it applies only to advice from and between lawyers or exempts data that may negatively impact companies’ defence or surface evidence of safety failures or deficiencies. The best way that I have of explaining my concern is: if it is habitual for tech companies to cc a lawyer in all their communications on product safety, trust and safety, and so on, would that give them legal privilege?
Finally, I support the noble Lord, Lord Clement-Jones, in his desire for a definition of independent researchers. I would be interested to hear what the Minister has to say on that. I beg to move.
My Lords, I will speak to my Amendments 198A and 198C to 198F. I also support Amendments 197, 198 and 198B, to which I have added my name, all of which address the issue of data for researchers.
As was put very thoughtfully by the noble Baroness, Lady Kidron, platforms are not making decisions about their services with due regard to product safety or with independent oversight. Ofcom’s work enforcing the Online Safety Act will shift matters some way towards accountability, but it currently makes no provision for researchers’ access to data, despite civil society and academic researchers having been at the forefront of highlighting online harms for a decade. The anecdotes that the noble Baroness just gave were very powerful testimony to the importance of that access. We are, in fact, flying completely blind: making policy and, in this Room, legislation without data, facts or insight about the performance of the platforms and algorithms that we seek to address. Were it not for the whistleblowers, we would not have anything to go on, and we cannot rely on whistleblowers to guide our hands.
Rectifying this omission is in the Bill, and I am enormously grateful to the Minister, and to my noble friend Lord Camrose for his role, in putting it there. It is particularly important because the situation with data for researchers has deteriorated considerably, even in the last 18 months—with Meta shutting CrowdTangle and X restricting researchers’ access to its API. The noble Baroness, Lady Kidron, spoke about what the whistleblowers think, and they think that this is going to get a lot worse in the future.
I welcome the inclusion of these provisions in the Bill. They will be totally transformational to this sector, bringing a level of access to serious analysts and academics, so we can better understand the impact of the digital world, for both good and bad. A good example of the importance of robust research to inform policy-making was the Secretary of State’s recent announcement that the Government were launching a
“research project to explore the impact of social media on young people’s wellbeing and mental health”.—[Official Report, Commons, 20/11/24; col. 250.]
That project will not be very effective if the researchers cannot access the data, so I very much hope that these provisions will be in force before the Government start spending money on it.
To have the desired effect, we need to ensure that the data for researchers regime, as described in the Bill, is truly effective and cannot be easily brushed off. That is why the Government need to accept the amendments in this group: to bring some clarity and to close loopholes in the scheme as it is outlined in the Bill.
I will briefly summarise the provisions in the amendments in my name. First, we need to make researcher access regulations enforceable in the same way as other requirements in the Online Safety Act. The enforcement provisions in that Act were strengthened considerably as it passed through this House, and I believe that the measures for data for researchers need to be given the same rocket boosters. Amendment 198D will mean that regulated services are required to adhere to the regime, and it will give Ofcom the power to take proper remedial action if regulated services are obfuscating or non-compliant.
Secondly, we need to ensure that any contractual provision of use, such as a platform’s terms of service, is unenforceable if it would prevent
“research into online safety matters”,
as defined in the regulations. This is an important loophole that needs to be closed. It will protect UK researchers carrying out public interest research from nefarious litigation over terms of service violations as platforms seek to obfuscate access to data. We have seen this practice in other areas.
Thirdly, we need to clarify that researchers carrying out applicable research into online safety matters in the UK will be able to access information under the regime, regardless of where they are located. This is a basic point. Amendment 198E would bring the regime in line with the Digital Services Act of the EU and allow the world’s best researchers to study potential harm to UK users.
Ensuring robust researcher access to data contributes to a broad ecosystem of investigation and scrutiny that will help to ensure the effective application of the law, while also guarding against overreach in moderating speech. It is time to back UK civil society and academic researchers to ensure that policy-making and regulatory enforcement are as informed as possible. That is why I ask the Minister to support these measures.
My Lords, I thank noble Lords who have welcomed the provisions in the Bill. I very much appreciate that we have taken on board the concerns that were raised in the debates on the previous legislation. I thank the noble Baroness, Lady Kidron, and the noble Lords, Lord Bethell and Lord Clement-Jones, for their amendments.
I will speak first to Amendment 197, tabled by the noble Baroness, Lady Kidron, which would compel the Secretary of State to create a framework and to do so within 12 months of passage. I understand and share her desire to ensure that a framework allowing researcher access is put in place promptly. This is precisely why we brought forward this provision. I reassure her that the department will consult on the framework as soon as possible after the publication of Ofcom’s report.
Turning to Amendments 198 and 198B, tabled by the noble Baroness, Lady Kidron, and the noble Lord, Lord Clement-Jones, respectively, Clause 123 provides the Secretary of State with the power to make regulations relating to researchers’ access to data. I can reassure noble Lords that it does not limit the regulations to the non-exhaustive list of examples provided. I agree that fair and proportionate criteria for who is considered a researcher are critical to the success of the future framework. I reassure noble Lords that in the provision as currently written the Secretary of State can include in the design of the framework the specific requirements that a person must meet to be considered a researcher.
Turning to Amendments 198A and 198D, tabled by the noble Lord, Lord Bethell, while I am sympathetic to his desire to provide a future framework with the robust enforcement powers of the OSA, I assure him that as the provision is written, the Secretary of State can already use the existing enforcement powers of the OSA to support a future framework. Furthermore, should the evidence suggest that additional or different measures would be more effective and appropriate, this provision allows the Secretary of State the flexibility to introduce them.
Turning next to Amendments 198C and 198E, tabled by the noble Lord, Lord Bethell, I understand the spirit of these amendments and note the importance of this issue, given the global nature of the online world. It is entirely reasonable to allow researchers who are not based in the UK to utilise our researcher access framework, as long as the subject of their research is the experience of UK users online. I reassure him that the provisions as drafted already allow the Secretary of State to make regulations permitting non-UK-based researchers to use the framework where appropriate. We plan to use the evidence gathered through our own means and through Ofcom’s report to set out who will be eligible to use the framework in the secondary legislation.
Finally, turning to Amendment 198F, I am aware of the concern that researchers have encountered blockages to conducting research and I am sympathetic to the intentions behind the amendment. We must ensure that researchers can use the future framework without fear of legal action or other consequences. I am conscious that the noble Baroness, Lady Kidron, asked me a specific question about legal exemptions and I will write to her to make that answer much clearer. I reassure noble Lords that the Government are considering the specific issues that the noble Lord raises. For these reasons, I ask that the amendments not be pressed while the Government consider these issues further and I am of course happy to engage with noble Lords in the meantime.
My Lords, I thank the Minister and everyone who spoke. I do not think I heard an answer to the may/must issue, and I need to say that just relying on Ofcom’s report to set the framework for the regime is not adequate, for two reasons. First, it is no news to the Committee that there is a considerable amount of disquiet about how the Online Safety Act has been reinterpreted in ways that Parliament did not intend. During the passage of this Bill, we are trying to be really clear on the face of the Bill—we will win some and we will lose some—about what Parliament’s intention is, so that the regulator really does what we agree, because that subject is currently quite contentious.
This is a new area and a lot of the issues that the Minister and, indeed, the noble Viscount, Lord Camrose, raised are here to be sorted out to make sure that we understand collectively what it will look like. Having said that, I would like the Government to have heard that we do not wish to rest on the actions of whistleblowers but we will be increasingly forced to do so if we do not have a good regime. We must understand the capacity of this sector to go to court. We are in court everywhere, all over the world; the sector has deep pockets.
Finally, I welcome the nitpicking of the noble Lord, Lord Arbuthnot. Long may he nitpick. We will make sure that he is content before Report. With that, I beg leave to withdraw the amendment.
My Lords, Amendment 203 is in my name and the names of the noble Lords, Lord Bethell, Lord Stevenson and Lord Clement-Jones. I thank noble Lords wholeheartedly for their support for this measure through two versions of this Bill. I believe that I speak for all signatories in recognising the support of a huge number of colleagues in both Houses and all parties who have expressed their support for this amendment.
It is my understanding that we are going to hear good news from the Dispatch Box. In the event that I am wrong, I shall have more to say once we have heard from the Minister. In the meantime, I want to explain what the problem is that the amendment seeks to solve.
It is illegal in the UK to possess or distribute child sexual abuse material, including AI-generated or computer-generated child sexual abuse material. The laws that the police use to enforce against CSAM are Section 1 of the Protection of Children Act 1978 and Section 160 of the Criminal Justice Act 1988, both of which create offences in respect of indecent photographs or pseudo-photographs of a child. AI content depicting child sexual abuse is illegal under these laws, but creating and distributing the software models needed to generate it is not, which means that those building and distributing software that allows paedophiles to generate bespoke child sexual abuse material have operated with impunity.
There are many services that allow anyone to take any public image and put it in a false situation. Although I have argued elsewhere that AI images should carry a mark of provenance, these services are not the subject of this amendment. This amendment is laser focused on criminalising AI models that are trained on or trained to create child sexual abuse material. They are specific, specialist and currently beyond the reach of the police. The models blend images of children—known children, stock photos, images scraped from social media, school websites or synthetic, fabricated AI depictions of children—with existing CSAM or pornography, and they allow paedophiles to generate bespoke CSAM scenarios of unimaginable depravity, as they are unmitigated by any restrictions that organise the reality of the world. If someone can think, type or say it, they can make it so.
Many of the generative models are distributed for free, but more specialist models are provided on subscription for less than £50 per month. This payment provides any child sexual offender with the ability to generate limitless—and I do mean limitless—child sexual abuse images. But while the police can take action against those who possess those images, they are unable to take action against those who make it possible to do so: the means of production.
A surprising number of people think that AI abuse is a victimless crime, and I want to make it clear that it is not. First, who would be comfortable with the image of their child or grandchild or their neighbour’s child being used in this way? Anyone, adult or child, can appear in AI-generated CSAM. I am not going to say how it can be done, because I do not want my words to be a set of instructions on the public record—but the reality is, any one of us, woman or man, though 99% are women, boy or girl, though it is mostly girls, is a potential victim. If your image is used in this way, you are a victim; if you are forced to watch or copy such imagery, you are a victim; and if you are a child whose real-life abuse is not caught because you are lost in a sea of AI-generated material, you are a victim. Then there is the normalisation of sexual violence against children, which poisons relationships—intimate, familial, across generations, genders and sexes. This is not a victimless crime.
My Lords, first, I thank the speakers for what were really powerful and largely unequivocal contributions.
I am grateful to the Minister. I was expecting something a tiny bit more expansive, but I will take, on the record, that we are going to make it a new offence for a person to make, adapt, possess, supply or offer to supply a CSA image generator, including any service, program or information in electronic form that is made, or adapted for use, to create or facilitate the creation of CSA material. I am expecting something that covers all of that, and I am expecting it shortly, as the Minister said. I again thank the Safeguarding Minister, Jess Phillips, for her tremendous efforts, as well as some of the civil servants who helped make this measure leap from one Government to the next. We can be content with that.
I feel less comfortable about the Minister’s answer to the noble Baroness, Lady Owen. We, women victims, experience the gaps in the law. If there are gaps in the law, it is our job, in this Committee and in the other place, to fix them. We all want the same thing; I know the Minister well enough to know that she wants the same thing. So I am going to push back and say that I will support the noble Baroness, Lady Owen, in trying to bring this measure back through this Bill. I believe that the mood of the Committee is with her so whatever mistakes there are on her patch will be fixed before Report, because this is not something that can wait. Kids and women are being hurt.
We all want to celebrate the digital world. I was an early adopter. I had one of those cameras on my computer before anyone else I knew did, so I could not speak to anyone; there was no one to talk to. We want this world to be good. We are not saying something different. On behalf of the noble Baroness, Lady Owen, who is nodding, let me just say that we will come back to this issue. I thank the Minister for her assurance on Amendment 203 and beg leave to withdraw.
My Lords, I am beginning to feel like the noble Lord, Lord Clement-Jones, but I reassure everyone that this is the last day of Committee.
I shall speak to the amendments in this group in my name and that of the noble Lords, Lord Stevenson—he is very sorry not to be in his place today—and Lord Clement-Jones, and my noble friend Lord Freyberg. I thank the News Media Association for its briefing and support. I also thank, for their wonderful and unlikely support, Sir Paul McCartney, Kate Mosse, Margaret Drabble and Richard Osman, alongside the many creative artists who have spoken, written and tweeted and are among the 37,000 people who signed a petition calling for swift action to protect their livelihoods.
I have already declared my interests for the Committee but I add, to be clear, that my husband is a writer of film, theatre and opera; and that, before I came to your Lordships’ House, I spent 30 years as a movie director. As such, I come from and live alongside a community for whom the unlicensed and illegal use of copyrighted content by generative AI developers is an existential issue. I am therefore proud to move and speak to amendments that would protect one of our most financially significant economic sectors, which contributes £126 billion in gross value added to UK GDP; employs 2.4 million people; and brings so much joy and understanding to the world.
Text and data mining without licence or permission is illegal in the UK, unless it is done specifically for research. This means that what we have witnessed over the past few years is intellectual property theft on a vast scale. Like many of the issues we have discussed in Committee, this wrongdoing has happened in plain sight of regulators and successive Governments. I am afraid that yesterday’s announcement of a consultation did not bring the relief the industry needs. As Saturday’s Times said,
“senior figures in the creative sector are scathing about the government plans”,
suggesting that the Secretary of State has drunk Silicon Valley’s “Kool-Aid” and that rights reservation is nonsense. An official at the technical briefing for the consultation said that
“rights reservation is a synonym for opt out”.
Should shopkeepers have to opt out of shoplifters? Should victims of violence have to opt out of attacks? Should those who use the internet for banking have to opt out of fraud? I could go on. I struggle to think of another situation where someone protected by law must proactively wrap it around themselves on an individual basis.
The value of our creative industries is not in question; nor is the devastation that they are experiencing as a result of non-payment for their IP. A recent report from the International Confederation of Societies of Authors and Composers, which represents more than 5 million creators worldwide, said that AI developers and providers anticipate the market for generative AI music and audiovisual content increasing from €3 billion to €64 billion by 2028—much of it derived from the unlicensed reproduction of creators’ works, representing a transfer of economic value from creators to AI companies. Let there be no misunderstanding of the scale of the theft: we already know that the entire internet has been downloaded several times without the consent or financial participation of millions of copyright holders.
This transfer of economic value from writers, visual artists and composers across all formats and all genres to AI companies is not theoretical. It is straightforward: if you cannot get properly paid for your work, you cannot pay the rent or build a career. Nor should we be taken in by the “manufactured uncertainty” that Silicon Valley-funded gen AI firms and think tanks have sought to create around UK copyright law. Lobbyists and their mouthpieces, such as TechUK, speak of a lack of clarity—a narrative that may have led the Minister, Chris Bryant, to claim that the Government’s consultation was a “win-win”. However, I would like the Minister to explain where the uncertainty on who owns these copyrighted works lies. Also, where is the win for the creative industries in the government proposal, which in one fell swoop deprives artists of control and payment for their work—unless they actively wrap the law around them and say “no”—leaving them at the mercy of pirates and scrapers?
Last week, at a meeting in this House attended by a wide range of people, from individual artists to companies representing some of the biggest creative brands in the world, a judge from the copyright court said categorically that copyright lies with the creator. AI does not create alone; it depends on data and material in order to create something else. A technological system that uses that material without permission is engaged in theft. The call for a new copyright law is a tactic that delays the application of existing law while the stealing continues. Unlike the physical world, where the pursuit of a stolen masterpiece may eventually result in something of value being returned to its owner, in the digital world, once your IP is stolen, the value is absorbed and fragmented, hidden amid an infinite number of other data points and onward uses. If we continue to delay, much of the value of the creative industries’ rich dataset will already have been absorbed.
The government consultation has been greeted with glee overnight by the CCIA, which lobbies for the biggest tech firms. After congratulating the Government at some length, it says that
“it will be critical to ensure that the transparency requirements are realistic and do not ask AI developers to compromise their work by giving away trade secrets and highly sensitive information that could jeopardise the safety and security of their models”.
In plain English, that means, “We have persuaded the Government to give up creatives’ copyright, and now the campaign begins to protect our own ‘sensitive business information’”. If that is not sufficiently clear to the Committee: they are claiming their own IP while stealing others’, and simultaneously pushing back against transparency, because they do not want an effective opt-out.
The government consultation does not even contain an option of retaining the current copyright framework and making it workable with transparency provisions—the provisions of the amendments in front of us. The Government have sold the creative industries down the river. Neither these amendments nor the creative community are anti-tech; on the contrary, they simply secure a path by which creatives participate in the world that they create. They ensure the continuous sustainable production of human-generated content into the future, for today’s artists and those of tomorrow. The amendments do not extend the fundamentals of the Copyright, Designs and Patents Act 1988, but they ensure that the law can be enforced on both AI developers and third parties that scrape on their behalf. They force transparency into the clandestine black box.
Amendment 204 requires the Secretary of State to set out the steps by which copyright law must be observed by web crawlers and others, making it clear that it applies during the entire lifecycle, from pretraining onwards, regardless of jurisdiction—and that any use of copyrighted works must take place only with a licence or express permission.
Amendment 205 requires the Secretary of State to set out the steps by which web crawlers and general-purpose AI models are made transparent. This includes, but is not limited to, providing a name for each crawler, identifying the legal entity responsible for it, listing the purposes for which it is engaged and disclosing what data it has passed on. It creates a transparent supply chain. Crucially, it requires operators of crawlers to disclose the businesses to which they sell the data they have scraped, making it more difficult for AI developers that purchase illegally scraped content to avoid compliance with UK copyright law, and overturning current practice in which the operators of crawlers can obscure their own identity or ownership, making it difficult and time-consuming—potentially impossible—to combat illegal scraping.
Amendment 206 requires the Secretary of State to set out by regulation what information web crawlers and general-purpose models must disclose regarding copyrighted works—information such as the URL, the time and the type of data collected, and a requirement to inform the copyright holder. This level of granularity, which the tech companies are already pushing against, provides a route by which IP holders can choose or contest the ways in which their work is used, as well as providing a route for payment.
In sum, the amendments create a clear and simple process for identifying which copyright works are scraped, by whom, for what purpose and from which datasets. They provide a process by which existing law can be implemented.
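To give the Committee a concrete sense of what such a disclosure might look like, here is a purely hypothetical sketch of my own—none of these field names or values is prescribed by the amendments—showing the kind of record that Amendments 205 and 206 would require a crawler operator to publish:

```python
# Hypothetical illustration only: a disclosure record of the kind Amendments
# 205 and 206 would require. The field names and values are invented for
# illustration; the regulations would set the actual requirements.
crawl_disclosure = {
    "crawler_name": "ExampleCrawler/1.0",        # a named crawler (Amendment 205)
    "responsible_entity": "Example AI Ltd",      # the legal entity behind it
    "purposes": ["pretraining", "fine-tuning"],  # purposes for which it is engaged
    "data_sold_to": ["Example Model Co"],        # downstream purchasers of scraped data
    "works_collected": [
        {
            "url": "https://example-news.co.uk/article-123",  # URL (Amendment 206)
            "timestamp": "2025-01-01T09:30:00Z",              # time of collection
            "data_type": "text",                              # type of data collected
            "rights_holder_notified": True,                   # duty to inform the holder
        },
    ],
}
```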
I shall just mention a few more points before I finish. First, there is widespread concern that mashing up huge downloads of the internet—including toxic material, falsehoods and an increasing proportion of artificially generated or synthetic data—will cause models to degenerate or collapse, putting a block on the innovation that the Government and all of us want to see, as well as raising serious safety concerns about the information ecosystem. A dynamic licensing market would provide a continuous flow of identified human-created content from which AI can learn.
Secondly, the concept of a voluntary opt-out regime—or, as the Government prefer, rights reservation—is already dead. In the DPDI Bill, I and others put forward an amendment to make robots.txt, part of the robots exclusion protocol, opt-in. In plain English, the voluntary scheme in which any rights holder can put a note on their digital door saying “Don’t scrape” would have been reversed to become mandatory. Over the last few months, we have seen scrapers ignoring the agreed protocol even when it is activated. I hope the Minister will explain why he thinks that creators should bear the burden while the scrapers reap the benefit, and whether the Government have done an impact assessment on how many rights holders would manage to opt out versus how many would opt in, given the choice.
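For noble Lords unfamiliar with the mechanism under discussion: robots.txt is simply a text file that a website publishes to tell crawlers which pages they may fetch, and honouring it is entirely voluntary. The following minimal sketch—written in Python using its standard library, with a hypothetical site and crawler name of my own invention—shows how a well-behaved crawler consults the file:

```python
# A minimal sketch, assuming a hypothetical publisher site and crawler name.
# Python's standard library includes a parser for the robots exclusion protocol.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://example-publisher.co.uk/robots.txt")
parser.read()  # fetches and parses the site's robots.txt

# A well-behaved crawler asks permission before fetching each page.
allowed = parser.can_fetch("ExampleAICrawler",
                           "https://example-publisher.co.uk/articles/")
print("May crawl:", allowed)

# Nothing in the protocol compels that check: a non-compliant scraper can
# simply ignore the file, which is why a purely voluntary opt-out scheme
# leaves the burden on rights holders.
```

The point of the sketch is that the protocol has no teeth; compliance is a courtesy, not an obligation.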
As someone who has spent my life creating IP, protecting IP and sometimes giving IP away, I welcome this debate. I am extremely grateful to the noble Baroness, Lady Kidron, for a very thoughtful set of proposals. The fact that many noble Lords have spoken in this debate shows that the rapid development of AI has clearly raised concerns about how to protect the creative industries. The Government take this very seriously. As the noble Lord, Lord Lucas, pointed out, we need to get it right, which is why we have launched a very wide-ranging consultation on a package of interventions to address copyright and AI issues. It is an important first step in an area where the existing situation is clearly not working and we run the risk of many long-lasting court cases, which will not help the situation in which we find ourselves.
We are committed both to supporting human-centred creativity and to the potential of AI to unlock new horizons. Many in the creative industries use AI very widely already. Our goal is to support AI innovation in the UK while maintaining robust protection for creators and our vibrant creative industry. In response to a point that the noble Baroness, Lady Kidron, raised earlier, option 1 in the consultation refers to existing copyright law and asks for views about maintaining and increasing it. The consultation sets out the Government’s objectives for this area and proposes a range of measures on which we are seeking views. Specifically, it aims to support rights-holders to continue to exercise control over the use of their content and their ability to seek remuneration for this. As many noble Lords have pointed out, that has to be made easy and technically feasible. It also promotes greater trust and transparency and proposes mechanisms by which you can see who is looking at the data and what they are doing with it.
Finally, it aims to support the development of world-leading AI models in the UK by ensuring that access can be appropriately wide but, of course, lawful and with the approval of those from whom the data is obtained. This includes the subjects of the noble Baroness’s amendments. The consultation seeks views on technological measures that can provide greater control over access to and use of online material, as well as transparency measures that help copyright owners understand whether their work is being used by AI developers. Again, this needs to be made easy. Various technologies are coming along which can do that, including, as has been said, the watermarking approach.
Much of this needs to be wrapped into an approach to standards. It is important that this is done in a way that is reproducible and reliable. Through this consultation, we will address some of these issues and seek to continue to get input from stakeholders on all of them. We will also work towards internationally interoperable solutions, as raised by many noble Lords, including the noble Lord, Lord Freyberg, and the noble Earl, Lord Effingham.
I agree with the noble Baroness, Lady Kidron, that a vibrant and effective licensing approach—a system that works well and provides access and rights—is important. She asked about an impact assessment. I do not have the information with me now, but I will write. I look forward to updating her on this work in due course and, in the meantime, hope that she is content to withdraw her amendment.
Does the Minister recognise the characterisation by noble Lords who have said that this is theft? Currently, we have a law, and copyrighted material is being taken without consent or remuneration. Does he agree with them that this is what the creative industries and, I presume, some of his community are experiencing?
At the moment we have a system where it is unclear what the rights are and how they are being protected, and therefore things are being done which people are unable to get compensation for. We can see that in the court cases going on at the moment. There is uncertainty which needs to be resolved.
I thank the Minister for his answer and welcome him very much to the Dispatch Box—I have not yet had the pleasure of speaking with him in a debate. I hope he saw the shaking heads when he answered my question about theft and this lack of clarity. If you say “Write me the opening chapter of a Stephen King novel”, and the AI can do it, you can bet your bottom dollar that it has absorbed a Stephen King novel. We know that a lot of this material is in there and that it is not being paid for. That goes for issues big and small.
I understand that it is late and we have more to do—I have more to say on other issues—but I want to reiterate three points. First, creative people are not anti-tech; they just want control over the things they create. AI is a creation on top of a creation, and creative people want to be paid for their efforts and to be in control of them. I am not sure whether I can mention it, because it was in a private meeting, but a brand that many people in most countries will have heard of said: “We need to protect our brand. We mean something. An approximation of us is not us. It is not just the money; it is also the control”.
I also make the point that, earlier this week, Canal+ had its IPO on the London Stock Exchange. I heard the CEO answer the question, “Why is it that Canal+ decided to come and do its IPO in the UK when everybody else is scarpering elsewhere?”, by saying a lot of very warm-hearted things about Paddington Bear, then, “Because you have very good copyright laws”. That is what they said. I just want to mention that.
Finally, I am grateful to the Minister for saying that there is the option of staying with the status quo; I will look at that and try to understand it clearly. However, when he writes about the issue that I raised in terms of opting in or opting out—I am grateful to him for doing so—I would also like an answer about where the Government think the money is going to go. What is the secondary value to the AI companies, which are largely headquartered in the US? Where will the IP go—IP which those companies have already said they want to protect; they did so in their response to the Government’s consultation, as I said in my speech, for anyone who was not listening? I would like the Government to say what their plans are, if we lose the £126 billion and the 2.4 million jobs, to replace that money and those jobs, as well as their incredible soft power.
With that, I beg leave to withdraw the amendment.
My Lords, it is a privilege to introduce Amendment 207. I thank the noble Lords, Lord Arbuthnot and Lord Clement-Jones, and the right reverend Prelate the Bishop of St Albans, who is unfortunately unwell but wanted me to express his support.
I make it clear that, although I may use the Horizon scandal as an example, this amendment is neither focused on nor exclusive to the miscarriage of justice, malevolence and incompetence related to that scandal. It is far broader than that so, when the Minister replies, I really hope that he or she—I am not sure which yet—will not talk about the criminality of the Post Office, as previously, but rather concentrate on the law that contributed to allowing a miscarriage of justice at that scale. That is what this amendment seeks to address.
I explained during debates on the DPDI Bill that, since 1999, courts have applied
“a common law presumption, in both criminal and civil proceedings, of the proper functioning of machines—that is to say”,
the information from the computer can be presumed to be reliable. I went on to say:
“In principle, there is a low threshold for rebutting this presumption but, in practice … a person challenging evidence derived from a computer will typically have no”.—[Official Report, 24/4/24; col. GC 573.]
knowledge of the circumstances in which the system in question was operated, so cannot possibly demonstrate that it failed. As Paul Marshall, the barrister who represented some of the postmasters, explains, this puts the onus on the defendant to explain to the jury the problems they encountered when all they could actually do was point to the shortfalls they had experienced—in the Horizon case, that the cash received did not match the balancing figure on the computer screen. They did not have access to the system or the record of its past failures, and they had no knowledge of what the vulnerabilities were. They only knew that it did not work.
The reality is that anyone who knows the first thing about programming or computer science knows that there are bugs in the system. Indeed, any one of us who has agreed to an update for an app or computer software understands that bug fixing is a common aspect of program maintenance. When I discussed this amendment with a computer scientist of some standing, he offered the opinion that there are likely to be 50 bugs per 1,000 lines of code; many complex systems run to tens of millions of lines of code.
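To put those two figures together—a back-of-the-envelope illustration using only the numbers just quoted, not a claim about any particular system—a codebase of 10 million lines would be expected to contain around half a million bugs:

\[
\frac{50 \text{ bugs}}{1{,}000 \text{ lines}} \times 10{,}000{,}000 \text{ lines} = 500{,}000 \text{ bugs}
\]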
Perhaps the most convincing evidence of all is found in software contracts. For the past 20 years at least, contracts have been likely to contain words to this effect: “No warranty is provided that the operation of the software will be uninterrupted or error free, or that all software errors will be corrected”. The same kind of clause applies when we say yes to a new Apple upgrade and sign a EULA—an end-user licence agreement. In plain English, for two decades at least, those who provide software have insisted that computer information is not to be considered reliable. That is written into their commercial agreements, so the fact that computer information is not reliable is agreed by those who know about computer information.
My Lords, I thank the noble Baroness and the noble Lord, Lord Arbuthnot, for Amendment 207 and for raising this important topic. The noble Baroness and other noble Lords are right that this issue goes far wider than Horizon. We could debate what went wrong with Horizon, but the issues before us today are much wider than that.
The Government are agreed that we must prevent future miscarriages of justice. We fully understand the intention behind the amendment and the significance of the issue. We are actively considering this matter and will announce next steps in the new year. I reassure noble Lords that we are on the case with this issue.
In the meantime, as this amendment brings into scope evidence presented in every type of court proceeding and would have a detrimental effect on the courts and prosecution—potentially leading to unnecessary delays and, more importantly, further distress to victims—I must ask the noble Baroness whether she is content to withdraw it at this stage. I ask that on the basis that this is an ongoing discussion that we are happy to have with her.
I thank the Minister, in particular for understanding that this goes way beyond Horizon. I would be very interested to be involved in those conversations, not because I have the great truth but because I have access to people with the great truth on this issue. In the conversations I have had, there has been so much pushing back. A bit like with our previous group, it would have been better to have been in the conversation before the consultation was announced than after. On that basis, I beg leave to withdraw the amendment.
My Lords, the good news is that this is the last time I shall speak this evening. Amendment 211 seeks to ensure that the value of our publicly held large datasets is realised for the benefit of UK citizens.
Proposed new subsection (1) gives the Secretary of State the power to designate datasets held by public bodies or arm’s-length institutions, or other datasets held in the public interest, as sovereign data assets.
Proposed new subsection (2) lists a number of requirements that the Secretary of State must have regard to when making a designation. Factors include the security and privacy of UK citizens; the ongoing value of the data assets; the rights of IP holders; the values, laws and international obligations of the UK; the requirement to give preferential access to UK-headquartered companies, organisations and the public sector; the requirement for data to be stored in the UK; and the design of application programming interfaces facilitating access to the assets by authorised licence holders. It also sets out stakeholders whom the Secretary of State must consult when considering which datasets to designate as sovereign data assets. We heard in a previous debate that education data might be a good candidate.
Proposed new subsection (3) requires the setting up of a transparent licensing system. Proposed new subsection (4) requires those managing sovereign data assets to report annually on their value and anticipated return to UK subjects. This would include, for example, licence payments, profit share agreements and “in kind” returns, such as access to products or services built using sovereign data assets. Proposed new subsection (5) gives an oversight role to the National Audit Office, proposed new subsection (6) provides a definition, and proposed new subsections (7) and (8) specify that regulations made under the clause are subject to parliamentary approval.
When I raised this issue at Second Reading, the Minister answered positively, in that she felt that what I was suggesting was embodied in the plans for a national data library:
“The national data library will unlock the value of public data assets. It will provide simple, secure and ethical access to our key public data assets for researchers, policymakers and businesses, including those at the frontier of AI development, and make it easier to find, discover and make connections across those … databases. It will sit at the heart of an ambitious programme of reform that delivers the incentives, investment and leadership needed to secure the full benefits for people and the economy”.—[Official Report, 19/11/24; col. 196.]
That is a very positive picture. My comments build on it because, since Second Reading, I have sought details about the national data library. It seems that plans are nascent and that the level of funding, as I understand it, matches neither the ambition set out by the Minister nor what many experts think is necessary. One of my concerns—it will not surprise the Committee to hear this, as it has come up a couple of times on previous groups—is that the library appears to be a mechanism for facilitating access, rather than for understanding, realising and protecting the value of these public data assets.
In the meantime, announcements of access to public data keep coming. We have worries about Palantir and the drip-feed of deals with OpenAI and Google, the latest of which was reported in the Health Service Journal, which said:
“The national Federated Data Platform will be used to train AI models for future use by the NHS, according to NHS England’s chief data and analytics officer”.
That sounds great, but the article went on to question the basis on which this would be done and what protections would be wrapped around it. That is the question.
We in this House already understand the implications of an “adopt now, ask questions later” approach. For example, as reported recently in Computer Weekly, Microsoft has now admitted to Scottish policing bodies that it cannot guarantee the sovereignty of UK policing data hosted on its hyperscale public cloud infrastructure. That is a huge problem for Police Scotland and one that is very likely to be mirrored across all sorts of government departments, given a technology that is central to so many of them. The proposed amendments offer a route to ask questions as you adopt technology, not after you have lost control.
My Lords, I am grateful to the noble Baroness, Lady Kidron, for her amendment. I agree with her that the public sector has a wealth of data assets that could be used to help our society achieve our missions and contribute to economic growth.
As well as my previous comments on the national data library, the Government’s recent Green Paper, Invest 2035: The UK’s Modern Industrial Strategy, makes it clear that we consider data access part of the modern business environment, so improving data access is integral to the UK’s approach to growth. However, we also recognise the value of our data assets as part of this approach. At the same time, it is critical that we use our data assets in a trustworthy and ethical way, as the noble Baroness, Lady Kidron, and the noble Lord, Lord Tarassenko, said, so we must tackle these issues carefully.
This is an active area of policy development for the Government, and we need to get it right. I must therefore ask the noble Baroness to withdraw her amendment. However, she has started a debate that will, I hope, carry on; we would be happy to engage in it going forward.
I thank all speakers, in particular my noble friend Lord Tarassenko for his perspective. I am very happy to discuss this matter and to let the Official Opposition know that this is a route to something more substantive to which they can agree. I beg leave to withdraw my amendment.
(6 days, 6 hours ago)
Grand Committee
I reassure the noble Lord that, as he knows, we are very hopeful that we will have data adequacy, so that issue will not arise. I will write to him to set out in more detail when those powers would be used.
I thank the Minister for her offer of a meeting. I could tell from the nods of my co-signatories that that would indeed be very welcome and we would all like to come. I was interested in the quote from the ICO about scraping. I doubt the Minister has it to hand, but perhaps she could write to say what volume of enforcement action has been taken by the ICO on behalf of data rights holders against scraping on that basis.
Yes, it would be helpful if we could write and set that out in more detail. Obviously the ICO’s report is fairly recent, but I am sure he has considered how enforcement would follow on from it. I am sure we can write and give more details.
My Lords, in speaking to Amendment 137 in my name I thank the noble Baroness, Lady Harding, the noble Lord, Lord Stevenson, and my noble friend Lord Russell for their support. I also add my enthusiastic support to the amendments in the name of my noble friend Lord Colville.
This is the same amendment that I laid to the DPDI Bill, which at the time had the support of the Labour Party. I will not labour that point, but it is consistently disappointing that these things have gone into the “too difficult” box.
Amendment 137 would introduce a code of practice on children and AI. AI drives the recommender systems that determine all aspects of a child’s digital experience, including the videos they watch, their learning opportunities, the people they follow and the products they buy—and, as reported last weekend, AI is even helping farmers pick the ripest tomatoes for baked beans. But it no longer concerns simply the elective parts of life where, arguably, a child or a parent on their behalf can choose to avoid certain products and services. AI is invisibly and ubiquitously present in all areas of their lives, and its advances and impact are particularly evident in the education and health sectors, the first of which is compulsory for children and the second of which is necessary for all of us.
The amendment has three parts. The first requires the ICO to create a code and sets out the expectations of its scope; the second considers who and what should be consulted and considered, including experts, children, and the frameworks that codify children’s existing rights; and the third part defines elements of the process, including risk assessment definitions, and sets out the principles to which the code must adhere.
When we debated this before, I anticipated that the Minister would say that the ICO had already published guidance, that we do not want to exclude children from the benefits of AI, and that we must not get in the way of innovation. Given that the new Government have taken so many cues from the previous one, I am afraid I anticipate a similar response.
I first point out, therefore, that the ICO’s non-binding guidance on AI and data protection is insufficient. It has only a single mention of a child in its 140 pages, which is a case study about child benefits. In the hundreds of pages of guidance, toolkits and sector information, nowhere are the specific needs and rights, or developmental vulnerabilities, of children comprehensively addressed in relation to AI. This absence of children is also mirrored in government publications on AI. Of course, we all want children to enjoy the benefits of AI, but consideration of their needs would increase the likelihood of those benefits. Moreover, it seems reckless and unprincipled not to protect them from known harms. Surely the last three decades of tech development have shown us that the experiment of a “build first, worry about the kids later—or never” approach has cost our children dearly.
Innovation is welcome, but not all innovation is equal. We have bots offering 13-year-olds advice on how to seduce grown men or encouraging them to take their own lives; edtech products that profile children, producing unfair and biased outcomes that limit their education and life chances; and gen AI that perpetuates negative, racist, misogynist and homophobic stereotypes. Earlier this month, the Guardian reported a deep bias in the AI used by the Department for Work and Pensions. This “hurt first, fix later” approach creates a lack of trust, increases unfairness and has real-world consequences. Is it too much to insist that we ask better questions of systems that may result in children going hungry?
Why children? I am saddened that I must explain this, but from our deeply upsetting debate last week on the child protection amendments, in which the Government asserted that children are already catered for while deliberately downgrading their protections, it seems that the Government or their advisers have forgotten.
Children are different for three reasons. First, as has been established over decades, children are on a development journey. There are ages and stages at which children are developmentally able to do certain things, such as walk, talk, understand risk and irony and learn different social skills. There are equally ages and stages at which they cannot do those things. The long-established consensus is that families, social groups and society more broadly, including government, step in to support them on this journey. Secondly, children have less voice and less choice about how and where they spend their time, so the places and spaces they inhabit have to be designed to be fit for childhood. Thirdly, we have a responsibility towards children that extends even beyond our responsibility to each other. This means that we cannot legitimise profit at their expense. Allowing systems to play in the wild in the name of growth and innovation, leaving kids to pay the price, is a low bar.
It is worth noting that, since we last debated it, a proposal for this AI code for children that follows the full life cycle of development, deployment, use and retirement of AI systems has been drafted and has the support of multiple expert organisations and individuals around the globe. I am sure that all nations and intergovernmental organisations will have additional inputs and requirements, but it is worth saying that the proposed code, which was written with input from academics, computer scientists, lawyers, engineers and children’s rights activists, is mindful of and compatible with the EU AI Act, the White House Blueprint for an AI Bill of Rights, the Executive Order on the Safe, Secure and Trustworthy Development and Use of Artificial Intelligence, the Council of Europe’s Framework Convention on Artificial Intelligence and, of course, the UNCRC General Comment No. 25.
This proposal will be launched early next year as an indication of what could and should be done. Unless the Government find their compass vis-à-vis children and tech, I suspect that another jurisdiction will adopt it ahead of the UK, making that jurisdiction the go-to destination for trusted development of child-safe tech products. It is perhaps worth reminding the Committee that one in three connected people is under 18, which is roughly 1 billion children. As the demographics change, the proportion and number of children will rise. It is a huge financial market.
Before I sit down, I shall briefly talk about the AADC because sometimes Ministers say that we already have a children’s code. The age-appropriate design code covers only ISS, which automatically limits it, and even the ICO now agrees that its enforcement record is neither extensive nor impressive. It does not clearly cover the urgent area of edtech, which is the subject of another amendment, and, most pertinently to this amendment, it addresses only AI profiling, which means that it is limited in how it can address the new and emerging challenges of generative AI. A revamp of the AADC to tackle the barriers to enforcement, account for technological advances, cover all products and services likely to be accessed by children and make our data regime AI-sensitive would be welcome, but rather than calling for a strengthening of the AADC, the ICO agreed to the downgrading of children’s data protection in the DPDI Bill and, again, has agreed to the downgrading of protections in the current Bill on ADM, scientific research, onward processing and so on. A stand-alone code for AI development is required because, in this way, we could be sure that children are in the minds of developers at the outset.
It is disappointing that the UK is failing to claim its place as the centre of regulated and trusted innovation. Although we are promised an AI Bill, the Government repeatedly talk of large frontier companies. AI is in every part of a child’s life, from the news they read to the prices they pay for travel and goods. It is clear from previous groups that many colleagues feel that a data Bill with no AI provisions is dangerous, both commercially and for the communities of the UK. An AI Bill with no consideration of the daily impact on children may be a very poor next choice. Will the Minister say why a Labour Government are willing to abandon children to technology rather than building technology that anticipates children’s rights and needs?
My Lords, it is a pleasure to follow my friend the noble Baroness, Lady Kidron, and to give full-throated support to my friend the noble Viscount, Lord Colville, on all his amendments. Given that the noble Baroness mentioned it and that another week has passed since we asked the Minister the question, will we see an AI Bill or a consultation before Santa comes or at some stage in the new year? I support all the amendments in this group and in doing so, as it is the first time I have spoken today in Committee, I declare my technology interests as set out in the register, not least as an adviser to Socially Recruited, an AI business.
I will speak particularly to my Amendment 211A. I have put down “image, likeness and personality” not because I believe that they are the most important rights being transgressed, or the most important rights we should consider, but to give them specific focus because, right now, they are largely being cut across and ignored: all of our creatives find their works, and also their image, likeness and personality, disappearing into these large foundation AI models with no potential for redress.
Once parts of you, such as your name, face or voice, have been ingested, as the noble Lord, Lord Clement-Jones, said in the previous group, it is difficult to have them extracted from the model. There is no sense, for example, of seeking an equitable remedy to put one back in the position one would have been in had the breach not occurred. It is almost “once in, forever in”; works then start to be created based on those factors, features and likenesses, which compete directly with the creatives. This is already particularly prevalent in the music industry.
What plans do the Government have in terms of personality rights, image and likeness? Are they content with the current situation where there is no protection for our great creatives, not least in the music industry? What does the Bill do for our creatives? I go back to the point made by the noble Baroness, Lady Kidron. How can we have all these debates on a data Bill which is silent when it comes to AI, and a product regulation Bill where AI is specifically excluded, and yet have no AI Bill on the near horizon—unless the Minister can give us some up-to-date information this afternoon? I look forward to hearing from her.
My Lords, I thank the noble Viscount, Lord Colville, and the noble Baroness, Lady Kidron, for their amendments and consideration of this policy area. I hope noble Lords will bear with me if I save some of the points I shall make on web crawling and intellectual property for the later group, which is specifically on that topic.
Amendments 92 and 93 from the noble Viscount are about the new disproportionate effort exemption in Article 13. I can reassure noble Lords that this exemption applies only when data is collected directly from the data subject, so it cannot be used for web crawling, which is, if you like, a secondary activity. I think that answers that concern.
Amendments 101 and 105, also from the noble Viscount, are about the changes to the existing exemption in Article 14, where data is collected from other sources. Noble Lords debated this issue in the previous group, where Amendments 97 and 99 sought to remove this exemption. The reassurances I provided to noble Lords in that debate about the proportionality test being a case-by-case exercise also apply here. Disproportionate effort cannot be used as an excuse; developers must consider the rights of the data subject on each occasion.
I also draw noble Lords’ attention to another quote from the ICO itself, made when publishing its recent outcome reports. I know I have already said that I will share more information on this. It says:
“Generative AI developers, it’s time to tell people how you’re using their information”.
The ICO is on the case and is pursuing this issue.
On Amendment 137 from the noble Baronesses, Lady Kidron and Lady Harding, and other noble Lords, I fully recognise the importance of organisations receiving clear guidance from regulators, especially on complex and technical issues. AI is one such issue. I know that noble Lords are particularly conscious of how it might affect children, and I am hearing the messages about that today.
As the noble Baroness will know, the Secretary of State already has the power to request statutory codes such as this from the regulator. The existing power will allow us to ensure the correct scope of any future codes, working closely with the ICO and stakeholders, including noble Lords here today; I am happy to meet them to discuss this further. The Government are, naturally, open to evidence about whether new statutory codes should be provided for by regulations in future. Although I appreciate the signal this can send, at the moment I do not believe that a requirement for codes on this issue is needed in this legislation. I hope noble Lords are reassured that the Government are taking this issue seriously.
Amendment 211A from the noble Lord, Lord Holmes, is about prohibiting the processing of people’s names, facial images, voices or any physical characteristics for AI training without their consent. Facial images and other physical characteristics that can be used to identify a person are already protected by the data protection legislation. An AI developer processing such data would have to identify a lawful ground for this. Consent is not the only option available, but I can reassure the noble Lord that there are firm safeguards in place for all the lawful grounds. These include, among many other things, making sure that the processing is fair and transparent. Noble Lords will know that even more stringent conditions apply to special category data, such as data revealing race or sexual orientation and any biometric data that can be used to identify someone.
Noble Lords tried to tempt me once again on the timetable for the AI legislation. I said as much as I could on that when we debated this in the last session, so I cannot add any more at this stage.
I hope that reassures noble Lords that the Bill has strong protections in place to ensure responsible data use and reuse, and, as such, that they feel content not to press their amendments.
I understand the point that the Secretary of State has the power, but does he have the intention? We are seeking an instruction to the ICO to do exactly this thing. A statement of the Secretary of State’s intention to activate such a code would be an excellent compromise all round, and seeing that in the Bill is the point here.
Discussions with the ICO are taking place at the moment about the scope and intention of a number of issues around AI, and this issue would be included in that. However, I cannot say at the moment that that intention is specifically spelled out in the way that the noble Baroness is asking.
My Lords, I speak to Amendment 114, to which I have added my name. It is a very simple amendment that prevents controllers circumventing the duties for automated decision-making by adding trivial human elements to avoid the designation. As such, it is a very straightforward—and, I would have thought, uncontroversial—amendment. I really hope that the Government will find something in all our amendments to accept, and perhaps this is one such thing.
I am struck that previous speeches have referred to questions that I raised last week: what is the Bill for, who is it for, and why is it not dealing with a host of overlapping issues that cannot really be disentangled one from another? In general, a bit like the noble Lord, Lord Holmes, I am very much with the spirit of all these amendments. They reflect the view of the Committee and the huge feeling of civil society—and many lawyers—that this sort of attack on Article 22 by Clause 80 downgrades UK data rights at a time when we do not understand the Government’s future plans and hear very little about protections. We hear about the excitements of AI, which I feel bound to say we all share, but not at the expense of individuals.
I raise one last point in this group. I had hoped that the Minister would have indicated the Government’s openness to Amendment 88 last week, which proposed an overarching duty on controllers and processors to provide children with heightened protections. That seemed to me the most straightforward mechanism for ensuring that current standards were maintained and then threaded through new situations and technologies as they emerged. I put those two overarching amendments down on the understanding that Labour, when in opposition, was very much for this approach to children. We may need to bring back specific amendments, as we did throughout the Data Protection and Digital Information Bill, including Amendment 46 to that Bill, which sought to ensure
“that significant decisions that impact children cannot be made using automated processes unless they are in a child’s best interest”.
If the Minister does not support an overarching provision, can she indicate whether the Government would be more open to clause-specific carve-outs to protect children and uphold their rights?
My Lords, I rise briefly, first, to thank everyone who has spoken so eloquently about the importance of automated decision-making, in particular its importance to public trust and the importance of human intervention. The retrograde step of watering down Article 22 is to be deplored. I am therefore grateful to the noble Lord, Lord Clement-Jones, for putting forward that this part of the Bill should not stand part. Secondly, the specific amendment that I have laid seeks to retain the broader application of human intervention for automated decision-making where it is important. I can see no justification for that watering down, particularly when there is such uncertainty about the scope that AI may bring to what can be done by automated decision-making.
My Lords, I have Amendment 201 in this group. At the moment, Action Fraud does not record attempted fraud; it has to have been successful for the website to agree to record it. I think that results in the Government taking decisions based on distorted and incomplete data. Collecting full data must be the right thing to do.
My Lords, I had expected the noble Baroness, Lady Owen of Alderley Edge, to be in the Room at this point. She is not, so I wish to draw the Committee’s attention to her Amendment 210. On Friday, many of us were in the Chamber when she made a fantastic case for her Private Member’s Bill. It obviously dealt with a much broader set of issues but, as we have just heard, the overwhelming feeling of the House was to support her. I think we would all like to see the Government wrap it up, put a bow on it and give it to us all for Christmas. But, given that that was not the indication we got, I believe that the noble Baroness’s intention here is to deal with the fact that the police are giving phones and devices back to perpetrators with the images remaining on them. That is an extraordinary revictimisation of people who have been through enough. So, whether or not this is the exact wording or way to do it, I urge the Government to look on this carefully and positively to find a way of allowing the police the legal right to delete data in those circumstances.
My Lords, none of us can be under any illusion about the growing threats of cyberattacks, whether from state actors, state-affiliated actors or criminal gangs. It is pretty unusual nowadays to find someone who has not received a phishing email, had hackers target an account or been promised untold riches by a prince from a faraway country. But, while technology has empowered these criminals, it is also the most powerful tool we have against them. To that end, we must do all we can do to assist the police, the NCA, the CPS, the SIS and their overseas counterparts in countries much like our own. That said, we must also balance this assistance with the right of individuals to privacy.
Regarding the Clause 81 stand part notice from the noble Lord, Lord Clement-Jones, I respectfully disagree with this suggestion. If someone within the police were to access police records in an unauthorised capacity or for malign reasons, I simply doubt that they would be foolish enough to enter their true intentions into an access log. They would lie, of course, rendering the log pointless, so I struggle to see—we had this debate on the DPDI Bill—how this logging system would help the police to identify unauthorised access to sensitive data. It would simply eat up hours of valuable police time. I remember from our time working on the DPDI Bill that the police supported this view.
As for Amendment 124, which allows for greater collaboration between the police and the CPS when making charging decisions, there is certainly something to be said for this principle. If being able to share more detailed information would help the police and the CPS come to the best decision for victims, society and justice, then I absolutely support it.
Amendments 126, 128 and 129 seek to keep the UK in close alignment with the EU regarding data sharing. EU alignment or non-alignment is surely a decision for the Government of the day alone. We should not look to bind a future Administration to the EU.
I understand that Amendment 127 looks to allow data transfers to competent authorities—that is, law enforcement bodies in other countries—that may have a legitimate operating need. Is this not already the case? Are there existing provisions in the Bill to facilitate such transfers and, if so, does this not therefore duplicate them? I would very much welcome the thoughts of both the Minister and the noble Lord, Lord Clement-Jones, when he sums up at the end.
Amendment 156A would add to the definition of “unauthorised access” so that it includes instances where a person accesses data in the reasonable knowledge that the controller would not consent if they knew about the access or the reason for the access, and the person is not empowered to access it by an enactment. Given the amount of valuable personal data held by controllers as our lives continue to move online, there is real merit to this idea from my noble friend Lord Holmes, and I look forward to hearing the views of the Minister.
Finally, I feel Amendment 210 from my noble friend Lady Owen—ably supported in her unfortunate absence by the noble Baroness, Lady Kidron—is an excellent amendment as it prevents a person convicted of a sexual offence from retaining the images that breached the law. This will prevent them from continuing to use the images for their own ends and from sharing them further. It would help the victims of these crimes regain control of these images which, I hope, would be of great value to those affected. I hope that the Minister will give this serious consideration, particularly in light of noble Lords’ very positive response to my noble friend’s Private Member’s Bill at the end of last week.
I have Amendment 135A in this group. The Bill provides a new set of duties for the Information Commissioner but no strategic framework, which the DPDI Bill did provide. The Information Commissioner is a whole-economy regulator, and to my mind the Government’s strategic priorities should bear on it. This amendment would provide an enabling power, such as that already held by the Competition and Markets Authority, which is in an equivalent economic position.
My Lords, I have huge sympathy for, and experience of, many of the issues raised by the noble Lord, Lord Clement-Jones, but, given the hour, I will speak only to Amendment 145 in my name and those of the noble Baroness, Lady Harding, my noble friend Lord Russell and the noble Lord, Lord Stevenson. Given that I am so critical, I want to say how pleased I am to see the ICO reporting requirements included in the Bill.
Amendment 145 is very narrow. It would require the ICO to report specifically and separately on children. It is fair to say that one of the many frustrations for those of us who spend our time advocating for children’s privacy and safety is trying to extrapolate child-specific data from generalised reporting. Often it is not reported because that serves to hide some of the inadequacies in the level of protection afforded to children. For example, none of the community guidelines enforcement reports published for Instagram, YouTube, TikTok or Snapchat provides a breakdown of the violation rate by age group, even though that would provide valuable information for academics, Governments, legislators, NGOs and, of course, regulators. It was a point of contention between many civil society organisations and Ofcom that there was no evidence that children of different ages react in different ways, which, for anyone who has had children, is clearly not the case.
Similarly, for many years we struggled to understand Ofcom’s reporting because older children were included in a group that went up to 24, and it took over 10 years for that to change. It seems to me—I hope the Government agree—that since children are entitled to specific data privacy benefits, it follows that the application and enforcement of those benefits should be reported separately. I hope that the Government can give a quick yes on this small but important amendment.
(1 week, 5 days ago)
Grand Committee
My Lords, I will speak to Amendments 59, 62, 63 and 65 in the name of my noble friend Lord Colville, and Amendment 64 in the name of the noble Lord, Lord Clement-Jones, to which I added my name. I am also very much in sympathy with the other amendments in this group more broadly.
My noble friend Lord Colville set out how he is seeking to understand what the Government intend by “scientific research” and to make sure that the Bill does not offer a loophole so big that any commercial company can avoid data protections of UK citizens in the name of science.
At Second Reading, I read out a dictionary definition of science:
“The systematic study of the structure and behaviour of the physical and natural world through observation, experimentation, and the testing of theories against the evidence obtained”—
i.e. everything. I also ask the Minister whether the following scenarios could reasonably be considered scientific. Is updating or improving a new tracking app for fitness, or a bot for an airline, scientific? Is the behavioural science of testing children’s responses to persuasive design strategies, in order to extend the stickiness of commercial products, scientific? These are practical scenarios, and I would be grateful for an answer in order to understand what is in and out of the scope of the Bill.
When I raised Clause 67 at a briefing meeting, it was said that it was, as my noble friend Lord Colville suggested, just housekeeping. The law firm Taylor Wessing suggests that what can
“‘reasonably be described as scientific’ is arguably very wide and fairly vague, so it will be interesting to see how this is interpreted, but the assumption is that it is intended to be a very broad definition”.
Each of the 14 law firm blogs and briefings that I read over the weekend described it variously as loosening, expanding or broadening. Not one suggested that it was a tightening and not one said that it was a no-change change. As we have heard, the European Data Protection Supervisor published an opinion stating that
“scientific research is understood to apply where … the research is carried out with the aim of growing society’s collective knowledge and wellbeing, as opposed to serving primarily one or several private interests”.
When the Minister responds, perhaps she could say whether the particular scenarios I have set out fall within the definition of scientific and why the Government have failed to reflect the critical clarification of the European Data Protection Supervisor in transferring the recital into the Bill.
I turn briefly to Amendment 64, which would limit the use of children’s personal data for the purposes of research and education by making it subject to a public interest requirement and opt-in from the child or a parent. I will speak in our debate on a later grouping to amendments that would enshrine children’s right to higher protection and propose a comprehensive code of practice on the use of children’s data in education, which is an issue of increasing scandal and concern. For now, it would be good to understand whether the Government agree that education is an area of research where a public interest requirement is necessary and appropriate and that children’s data should always be used to support their right to learn, rather than to commoditise them.
During debate on the DPDI Bill, a code of practice on children’s data and scientific research was proposed; the Minister added her name to it. It is by accident rather than by design that I have failed to lay it here, but I will listen carefully to the Minister’s reply to see whether children need additional protections from scientific research as the Government now define it.
My Lords, I have in subsequent groups a number of amendments that touch on many of the issues raised here, so I will not detain the Committee by going through them at this stage and repeating them later. However, I feel that, although the Government have had the best intentions in bringing forward a set of proposals in this area intended to update and bring together the rather conflicting and difficult pieces of legislation left over from the Brexit arrangements, they have managed to open up a gap between where we want to be and where we will be if the Bill goes forward in its present form. I say that in relation to AI, a subject requiring a lot more attention and a lot more detail than we have before us. I doubt very much whether the Government will have the appetite for dealing with that in time for this Bill, but I hope that at the very least—it would be a minor concession at this stage—they will commit at the Dispatch Box to seeking to resolve these issues in legislation within a very short period because, as we have heard from the arguments made today, it is desperately needed.
More importantly, if, by bringing together documentation that is thought to represent the current situation, either inadvertently or otherwise, the Government have managed to open up a loophole that will devalue the way in which we currently treat personal data—I will come on to this when I get to my groups in relation to the NHS in particular—that would be a grievous situation. I hope that, going forward, the points that have been made here can be accommodated in a statement that will resolve them, because they need to be resolved.
My Lords, I have to admit that I am slightly confused by the groupings at this point. It is very easy to have this debate in the medical space, to talk about the future of disease, fixing diseases and longevity, but my rather mundane questions have now gone unanswered twice. Perhaps the Minister will write to me about where the Government see scientific research on product development in some of these other spaces.
We will come back to the question of scraping and copyright, but I want to add my support to my noble friend Lord Freyberg’s amendment. I also want to add my voice to the question of the AI Bill that is coming. Data is fundamental to AI infrastructure; data is infrastructure. I do not understand how we can have a data Bill that does not have one eye on AI, or how we are supposed to understand the intersection between the AI Bill and the data Bill if the Government are not more forthcoming about their intentions. At the moment, we are seeing a reduction in data protection that looks as though it is anticipating, or creating a runway for, certain sorts of companies.
Finally, I am sorry that the noble Lord is no longer in his place, but later amendments look at creating sovereign data assets around the NHS and so on, and I do not think that those of us who are arguing to make sure that it is not a free-for-all are unwilling to create, or are not interested in creating, ways in which the huge investment in the NHS and other datasets can be realised for UK plc. I do not want that to appear to be where we are starting just because we are unhappy about the roadway that Clause 67 appears to create.
Many thanks to the noble Lords who have spoken in this debate and to the noble Lord, Lord Freyberg, for his Amendment 60. Before I start, let me endorse and add my name to the request for something of a briefing about the AI Bill. I am concerned that we will put a lot of weight of expectation on that Bill. When it comes, if I understand this right, it will focus on the very largest AI labs and may not necessarily get to all the risks that we are talking about here.
Amendment 60 seeks to ensure that the Bill does not allow privately funded or commercial activities to be considered scientific research in order
“to avert the possibility that such ventures might benefit from exemptions in copyright law relating to data mining”.
This is a sensible, proportionate measure to achieve an important end, but I have some concerns about the underlying assumption as it strikes me: the filtering criterion of whether or not the research is taxpayer funded feels like a slightly crude means of predicting the propensity to infringe copyright. I do not know where to take that, so I shall leave it there for the moment.
Amendment 61 in my name would ensure that data companies cannot justify data scraping for AI training as scientific research. As many of us said in our debate on the previous group, as well as in our debate on this group, the definition of “scientific research” in the Bill is extremely broad. I very much take on board the Minister’s helpful response on that but, I must say, I continue to have some concerns about the breadth of the definition. The development of AI programs, funded privately and as part of a commercial enterprise, could be considered scientific, so I believe that this definition is far too broad, given that Article 8A(3), to be inserted by Clause 71(5), states:
“Processing of personal data for a new purpose is to be treated as processing in a manner compatible with the original purpose where … the processing is carried out … for the purposes of scientific research”.
Tightening up the definition of “scientific research” to exclude activities that are primarily commercial would prevent companies from creating a scientific pretence for research that is wholly driven by commercial gain rather than by furthering our collective knowledge. I would argue that, if we wish to allow these companies to build and train AI—and we must, or others will—we must put in place proper safeguards for people’s data. Data subjects should have the right to consent to their data being used in such a manner.
Amendment 65A in the name of my noble friend Lord Holmes would also take steps to remedy this concern. I believe that this amendment would work well in tandem with Amendment 61. It makes it absolutely clear that we expect AI developers to obtain consent from data subjects before they use or reuse their data for training purposes. For now, though, I shall not press my amendment.
My Lords, I feel we are getting slightly repetitive, but before I, too, repeat myself, I should like to say something that I did not get the chance to say to the noble Viscount, Lord Colville, the noble Baroness, Lady Kidron, and others: I will write, we will meet—all the things that you have asked for, you can take it for granted that they will happen, because we want to get this right.
I say briefly to the noble Baroness: we are in danger of thinking that the only good research is health research. If you go to any university up and down the country, you find that the most fantastic research is taking place in the most obscure subjects, be it physics, mechanical engineering, fabrics or, as I mentioned earlier, quantum. A lot of great research is going on. We are in danger of thinking that life sciences are the only thing that we do well. We need to open our minds a bit to create the space for those original thinkers in other sectors.
Perhaps I did not make myself clear. I was saying that the defence always goes to space or to medicine, and we are trying to ascertain the product development that is not textiles, and so on. I have two positions in two different universities; they are marvellous places; research is very important.
I am glad we are on the same page on all that.
I now turn to the specifics of the amendments. I thank the noble Lords, Lord Freyberg and Lord Holmes, and the noble Viscount, Lord Camrose, for their amendments, and the noble Lord, Lord Lucas, for his contribution. As I said in the previous debate, I can reassure all noble Lords that if an area of research does not count as scientific research at the moment, it will not under the Bill. These provisions do not expand the meaning of scientific research. If noble Lords still feel unsure about that, I am happy to offer a technical briefing to those who are interested in this issue to clarify that as far as possible.
Moreover, the Bill’s requirement for a reasonableness test will help limit the misuse of this definition more than the current UK GDPR, which says that scientific research should be interpreted broadly. We are tightening up the regulations. This is best assessed on a case-by-case basis, alongside the ICO guidance, rather than by automatically disqualifying or approving whole sectors of activity.
Scientific research that is privately funded or conducted by commercial organisations can also have a life-changing impact. The noble Lord, Lord Markham, was talking earlier about health; the development of Covid vaccines is just one example. That was commercial research that was absolutely life-saving, at the end of the day.
My Lords, I rise briefly to support the amendments in the name of the noble Lord, Lord Stevenson of Balmacara. I must say that the noble Lord, Lord Clement-Jones, made a very persuasive speech; I shall be rereading it and thinking about it more carefully.
In many ways, purpose limitation is the jewel in the crown of GDPR. It does what it says on the tin: data should be used for the original purpose, and if the purpose is then extended, we should go back to the person and ask whether it can be used again. While I agree with and associate myself with the technical arguments made by the noble Lord, Lord Stevenson, that is the fundamental point.
The issue here is, what are the Government trying to do? What are we clearing a pathway for? In a later group, we will speak to a proposal to create a UK data sovereign fund to make sure that the value of UK publicly held data is realised. The value is not simply economic or financial, but societal. There are ways of arranging all this that would satisfy everyone.
I have been sitting here wondering whether to say it, but here I go: I am one of the 3.3 million.
So is the noble Lord, Lord Clement-Jones. I withdrew my consent because I did not trust the system. I think that what both noble Lords have said about trust could be spread across the Bill as a whole.
We want to use our data well. We want it to benefit our public services. We want it to benefit UK plc and we want to make the world a better place, but not at the cost of individual data subjects and not at too great a cost. I add my voice to that. On the whole, I prefer systems that offer protections by design and default, as consent is a somewhat difficult concept. But, in as much as consent is a fundamental part of the current regulatory system and nothing in the Bill gets rid of it wholesale for some better system, it must be applied meaningfully. Amendments 79, 81 and 131 make clear what we mean by the term, ensure that the definition is consistent and clarify that it is not the intention of the Government to lessen the opportunity for meaningful consent. I, too, ask the Minister to confirm that it is not the Government’s intention to downgrade the concept of meaningful consent in the way that the noble Lord, Lord Stevenson, has set out.
My Lords, I support Amendment 71 and others in this group from the noble Lords, Lord Clement-Jones and Lord Stevenson. I apologise for not being able to speak at Second Reading. The noble Lord, Lord Clement-Jones, will remember that we took a deep interest in this issue when I was a Health Minister and the conversations that we had.
I had a concern at the time. We all know that the NHS needs to be digitised and that relevant health professionals need to be able to access relevant data when they need to, so that there is no need to be stuck with one doctor when you go to another part of the country. There are so many efficiencies that we could have in the system, as long as they are accessed by relevant and appropriate health professionals at the right time. But it is also important that patients have confidence in the system and that their personal data cannot be shared with commercial organisations without them knowing. As other noble Lords have said, this is an issue of trust.
For that reason, when I was in that position, I reached out to civil liberties organisations to understand their concerns. For example, medConfidential was very helpful and had conversations with DHSC and NHS officials. In fact, after those conversations, officials told me that its demands were reasonable and that some of the things being asked for were common sense and not that difficult to give.
I asked a Written Question of the noble Baroness’s ministerial colleague, the noble Baroness, Lady Merron, about whether patients will be informed of who has had access to their patient record, because that is important for confidence. The Answer I got back was that the Government were proposing a single unified health record. We all know that. She said:
“Ensuring that patients’ confidential information remains protected and is seen only by those who need to see it will be a priority. Public engagement next month will help us understand what safeguards patients would want to see”.
Surely the fact that patients have opted out shows that they already have concerns and have raised them.
The NHS can build the best data system—or the federated data platform, as it is called—but without patient confidence it is simply a castle made of sand. As one of my heroes, Jimi Hendrix, once said, castles made of sand fall into the sea eventually. We do not want to see that with the federated data platform. We want to see a modernised system of healthcare digital records, allowing joined-up thinking on health and care right across a patient’s life. We should be able to use machine learning to analyse those valuable datasets to improve preventive care. But, for that to happen, the key has to be trust and patients being confident that their data is secure and used in the appropriate way. I look forward to the Minister’s response.
I cannot compete with that tour de force. I shall speak to Amendments 73 and 75 in the name of the noble Lord, Lord Clement-Jones, to which I have added my name; to Amendments 76, 83 and 90 on the Secretary of State’s powers; and to Amendments 85 and 86, to which I wish I had added my name, but it is hard to keep up with the noble Lord. I am in sympathy with the other amendments in the group.
The issue of recognised legitimate interest has made a frequent appearance in the many briefings I have received and despite reading the Explanatory Notes for the Bill several times, I have struggled to understand in plain English the Government’s intent and purpose. I went to the ICO website to remind myself of the definition of legitimate interest to try to understand why recognised legitimate interest was necessary. It states:
“Legitimate interests is the most flexible lawful basis for processing, but you cannot assume it will always be the most appropriate.”
and then goes on:
“If you choose to rely on legitimate interests, you are taking on extra responsibility for considering and protecting people’s rights and interests.”
That seems to strike a balance between compelling justifications for processing and the need to consider and protect individual data rights and interests. I would be very interested to hear from the Minister why the new category of “recognised legitimate interest” is necessary. Specifically, why do the Government believe that, when processing may have far-reaching consequences, such as for national security, crime prevention and safeguarding, there is no need to undertake a legitimate interest assessment? What is the justification for allowing any public body to demand data from private companies for any purpose? I ask those questions to be precise about the context and purpose.
I am not suggesting that there is no legitimate interest for processing personal data without consent, but the legitimate interest assessment is a check and balance that ensures oversight and reduces the risk of overreach. It is a test, not a blocker, and does not in itself prevent processing if the balancing test determines that processing should go ahead. Amendment 85 illustrates this point in relation to vulnerable users. Given that a determination that a person is at risk would have far-reaching consequences for that person, the principles of fairness and accountability demand that those making the decision must follow a due process and that those subject to the decision are aware—if not in an emergency, certainly at some point in the proceedings.
In laying Amendment 86, the noble Lord, Lord Clement-Jones, raises an important question that I am keen to hear from Ministers on: namely, what is the Government’s plan for ensuring that a designation that an individual is vulnerable is monitored and removed when it is no longer appropriate? If a company or organisation has a legitimate interest in processing someone’s data, having considered the balancing interests of data subjects, it is free to do so. I ask the Minister again to give concrete examples of circumstances in which the current legitimate interest basis is insufficient, so that we understand the problem the Government are trying to solve.
At Second Reading, the Government’s curious defence of this new measure was the idea that organisations had concerns about whether they were doing the balancing test correctly, so the new measure is there to help, but perhaps the Minister can explain what benefits accrue from introducing the new measure that could not have been better achieved by the ICO providing more concrete guidance on the balancing test. Given that the measure is focused on the provision of public interest areas, such as national security and the detection of crime, how does the creation of the recognised legitimate interest help the majority of data controllers, rather than simply serving the interests of incumbents and/or government departments by removing an important check or balance?
Amendments 76, 83 and 90 seek to curb the power of the Secretary of State to override primary legislation and to modify key aspects of UK data protection law via statutory instrument. The proposed provisions in Clauses 70, 71 and 74 put one person in control, rather than Parliament. Elon Musk’s new role in the upcoming US Administration gives him legitimacy as an incoming officeholder in the Executive, but that role is complicated by the fact that he is also CEO and majority shareholder of X. At OpenAI, Google, Amazon, Palantir or any other tech behemoth, tech execs are not elected and are not bound to fulfil social goods or commitments other than making a profit for their shareholders. They also fund many of the think tanks, reports and events in the political ecosystem, and there is a well-worn path of employment between industry, government and regulators.
No single person should be the carrier of that incredible burden. For now, Parliament is the only barrier in the increasingly confused picture of regulatory and political capture by the tech sector. We should fight to keep it that way.
My Lords, I support Amendment 74 from the noble Lords, Lord Scriven and Lord Clement-Jones, on excluding personal health data from being a recognised legitimate interest. I also support Amendment 78 on having a statement by the Secretary of State to recognise that legitimate interest and Amendments 83 and 90, which would remove powers from the Secretary of State to override primary legislation to modify data protection via an SI. There is not much to add to what I said on the previous group, so I will not repeat all the arguments made then. In simple terms, I repeat the necessity for trust—in health, particularly for patient trust. You do not gain trust simply by defining personal health data as a legitimate interest or by overriding primary legislation on the say-so of a Secretary of State, even if it is laid as a statutory instrument.
My Lords, I thought I had no speech; that would have been terrible. In moving my amendment, I thank the noble Baronesses, Lady Kidron and Lady Harding of Winscombe, and the noble Lord, Lord Russell of Liverpool, for their support. I shall speak also to Amendments 94, 135 and 196.
Additional safeguards are required for the protection of children’s data. This amendment
“seeks to exclude children from the new provisions on purpose limitation for further processing under Article 8A”.
The change to the purpose limitation in Clause 71 raises questions about the lifelong implications of the proposed change for children, given the expectation that they are less aware of the risks of data processing and may not have made their own preferences or choices known at the time of data collection.
For most children’s data processing, adults give permission on their behalf. The extension of this to additional purposes may be incompatible with what a data subject later wishes as an adult. The only protection they may have is purpose limitation, which ensures that they are asked for consent again or informed of changes to processing. Data reuse and access must not mean abandoning the first principles of data protection. Purpose limitation rests on the essential principles of “specified” and “explicit” purposes at the time of collection, which this change does away with.
There are some questions that I would like to put to the Minister. If further reuses, such as more research, are compatible, they are already permitted under current law. If further reuses are not permitted under current law, why should data subjects’ current rights be undermined in childhood and, through this change, never be capable of being reclaimed at any time in the future? How does the new provision align with the principle of acting in the best interests of the child, as outlined in the UK GDPR, the UNCRC in Scotland and the Rights of Children and Young Persons (Wales) Measure 2011? What are the specific risks to children’s data privacy and security under the revised rules for purpose limitation that may have an unforeseeable lifelong effect? In summary, a blanket exclusion for children’s data processing conforms more with the status quo of data protection principles. Children should be asked again about data processing once they reach maturity and should not find that their data rights have been given away by their parents on their behalf.
Amendment 196 is more of a probing amendment. Ofcom has set out its approach to the categorisation of category 1 services under the Online Safety Act. Ofcom’s advice and research, submitted to the Secretary of State, outlines the criteria for determining whether a service falls into category 1. These services are characterised by having the highest reach and risk functionalities among user-to-user services. The categorisation is based on certain threshold conditions, which include user numbers and functionalities such as content recommender systems and the ability for users to forward or reshare content. Ofcom has recommended that category 1 services should meet either of two sets of conditions: having more than 34 million UK users with a content recommender system or having more than 7 million UK users with a content recommender system and the ability for users to forward or reshare user-generated content. The categorisation process is part of Ofcom’s phased approach to implementing codes and guidance for online safety, with additional obligations for category 1 services due to their potential as sources of harm.
The Secretary of State recently issued the Draft Statement of Strategic Priorities for Online Safety, under Section 172 of the Online Safety Act. It says:
“Large technology companies have a key role in helping the UK to achieve this potential, but any company afforded the privilege of access to the UK’s vibrant technology and skills ecosystem must also accept their responsibility to keep people safe on their platforms and foster a safer online world … The government appreciates that Ofcom has set out to government its approach to tackling small but risky services. The government would like to see Ofcom keep this approach under continual review and to keep abreast of new and emerging small but risky services, which are posing harm to users online.
As the online safety regulator, we expect Ofcom to continue focusing its efforts on safety improvements among services that pose the highest risk of harm to users, including small but risky services. All search services in scope of the Act have duties to minimise the presentation of search results which include or lead directly to illegal content or content that is harmful to children. This should lead to a significant reduction in these services being accessible via search results”.
During the parliamentary debates on the Online Safety Bill and in Joint Committee, there was significant concern about the categorisation of services, particularly about the emphasis on size over risk. Initially, the categorisation was based largely on user numbers and functionalities, which led to concerns that smaller platforms with high-risk content might not be adequately addressed. In the Commons, Labour’s Alex Davies-Jones MP, now a Minister in the Ministry of Justice, argued that focusing on size rather than risk could fail to address extreme harms present on smaller sites.
The debates also revealed a push for a more risk-based approach to categorisation. The then Government eventually accepted an amendment allowing the Secretary of State discretion in setting thresholds based on user numbers, functionalities or both. This change aimed to provide flexibility in addressing high-risk smaller platforms. However, despite the strategy statement and the amendment to the original Online Safety Bill, concerns remain that smaller platforms with significant potential for harm might not be sufficiently covered by the category 1 designation. Overall, while the final approach allows some flexibility, there is considerable debate about whether Ofcom will place enough emphasis in its categorisation on the risks posed by smaller players. My colleagues on these Benches and in the Commons have emphasised to me that we should be rigorously addressing these issues. I beg to move.
My Lords, I shall speak to all the amendments in this group, and I thank noble Lords who have added their names to Amendments 88 and 135 in my name.
Amendment 88 creates a duty for data controllers and processors to consider children’s needs and rights. Proposed new subsection (1) simply sets out children’s existing rights and acknowledges that children of different ages have different capacities and therefore may require different responses. Proposed new subsection (2) addresses the concern expressed during the passage of the Bill and its predecessor that children should be shielded from the reduction in privacy protections that adults will experience under the proposals. Proposed new subsection (3) simply confirms that a child is anyone under the age of 18.
This amendment leans on a bit of history. Section 123 of the Data Protection Act 2018 enshrined the age-appropriate design code into our data regime. The AADC’s journey from amendment to fully articulated code, since mirrored and copied around the world, has provided two useful lessons.
First, if the intent of Parliament is clear in the Bill, it is fixed. After Royal Assent to the Data Protection Act 2018, the tech lobby came calling on both the Government and the regulator, arguing that the proposed age of adulthood in the AADC should be reduced from 18 to 13, where it had been for more than two decades. Both the department and the regulator held up their hands and pointed at the text, which cited the UNCRC, which defines a child as a person under 18. That age remains, not only in the UK but in all the other jurisdictions that have since copied the legislation.
In contrast, on several other issues both in the AADC and, more recently, in the Online Safety Act, the intentions of Parliament were not spelled out and have been reinterpreted. Happily, the promised coroner provisions are now enshrined in this Bill, but promises from the Dispatch Box about the scope and form of the coroner provisions were initially diluted and had to be refought for a second time by bereaved parents. Other ministerial promises that reflected Parliament’s intention, such as promises of a mixed economy, age-assurance requirements and a focus on contact harm, features and functionalities as well as content, do not form part of the final regulatory standards, in large part because they were not sufficiently spelled out in the Bill. What is in the Bill really matters.
Secondly, our legislation over the past decade is guilty of solving the problems of yesterday. There is departmental resistance to having outcomes rather than processes enshrined in legislation. Overarching principles, such as a duty of care, or rights, such as children’s rights to privacy, are abandoned in favour of process measures, tools that even the tech companies admit are seldom used, and narrow definitions of what must and may not be taken down.
Tech is various, its contexts infinite, its rate of change giddy and the skills of government and regulator are necessarily limited. At some point we are going to have to start saying what the outcome should be, what the principles are, and not what the process is. My argument for this amendment is that we need to fix our intention that in the Bill children have an established set of needs according to their evolving capacity. Similarly, they have a right to a higher bar of privacy, so that both these principles become unavoidable.
I thank all noble Lords who have raised this important topic. I say at the outset that I appreciate and pay tribute to those who have worked on this for many years—in particular the noble Baroness, Lady Kidron, who has been a fantastic champion of these issues.
I also reassure noble Lords that these provisions are intended to build upon, and certainly not to undermine, the rights of children as they have previously been defined. We share noble Lords’ commitment to ensuring high standards of protection for children. That is why I am glad that the Bill, together with existing data protection principles, already provides robust protections for children. I hope that my response to these amendments shows that we take these issues seriously. The ICO also recognises in its guidance, following the UN Committee on the Rights of the Child, that the duties and responsibilities to respect the rights of children extend in practice to private actors and business enterprises.
Amendment 82, moved by the noble Lord, Lord Clement-Jones, would exclude children’s personal data from the exemptions to the purpose limitation principles in Schedule 5 to the Bill. The new purposes are for important public interests only, such as safeguarding vulnerable individuals or children. Broader existing safeguards in the data protection framework, such as the fairness and lawfulness principles, also apply. Prohibiting a change of purpose in processing could impede important activities, such as the safeguarding issues to which I have referred.
Amendment 88, tabled by the noble Baroness, Lady Kidron, would introduce a new duty requiring all data controllers to consider that children are entitled to higher protection than adults. We understand the noble Baroness’s intentions and, in many ways, share her aims, but we would prefer to focus on improving compliance with the current legislation, including through the way the ICO discharges its regulatory functions.
In addition, the proposed duty could have some unwelcome and unintended effects. For example, it could lead to questions about why other vulnerable people are not entitled to enhanced protections. It would also apply to organisations of all sizes, including micro-businesses and voluntary sector organisations, even if they process children’s data on only a small scale. It could also cause confusion about what they would need to do to verify age to comply with the new duty.
Amendment 94, also tabled by the noble Baroness, would ensure that the new notification exemptions under Article 13 would not apply to children. However, removing children’s data from this exemption could mean that some important research—for example, on the causes of childhood diseases—could not be undertaken if the data controller were unable to contact the individuals about the intended processing activity.
Amendment 135 would place new duties on the ICO to uphold the rights of children. The ICO’s new strategic framework, introduced by the Bill, has been carefully structured to achieve a similar effect. Its principal objective requires the regulator to
“secure an appropriate level of protection for personal data”.
This gives flexibility and nuance: the appropriate level of protection is not always the same for all data subjects, all the time.
Going beyond this, though, the strategic framework includes the new duty relating to children. This acknowledges that, as the noble Baroness, Lady Kidron, said, children may be less aware of the risks and consequences associated with the processing of their data, as well as of their rights. As she pointed out, this is drawn from recital 38 to the UK GDPR, but the Government’s view is that the Bill’s language gives sufficient effect to the recital. We recognise the importance of clarity on this issue and hope that we have achieved it but, obviously, we are happy to talk further to the noble Baroness on this matter.
This duty will also be a consideration for the ICO and one to which the commissioner must have regard across all data protection activities, where relevant. It will inform the regulator’s thinking on everything from enforcement to guidance, including how work might need to be tailored to suit children at all stages of childhood in order to ensure that the levels of protection are appropriate.
Finally, regarding Amendment 196—
I thank the Minister for giving way. I would like her to explain why only half of the recital is in the Bill and why the fact that children merit special protection is not. How can it possibly be that, in this Bill, we are giving children adequate protection? I may disagree with some of the other things that she said, but I would like her to answer that specific question.
To be on the safe side, I will write to the noble Baroness. We feel that other bits in the provisions of the Bill cover the other aspects but, just to be clear on it, I will write to her. On Amendment 196 and the Online Safety Act—
My Lords, although it is a late hour, I want to make two or three points. I hope that I will be able to finish what I wish to say relatively quickly. It is important that, in looking at the whole of this Bill, we keep two things in mind. One is equivalence; the other is the importance of the rights and protections in the Bill being anchored in something ordinary people can understand. Unfortunately, I could not be here on the first day, but having sat through most of today, I worry deeply about the unintelligibility of this whole legislative package. We are stuck with it for now, but I sincerely hope that this is the last Civil Service-produced Bill of this kind. We need radical new thinking, and I shall try to explore that when we look at automated decision-making—again, a part that is far too complicated.
Amendment 87 specifically relates to equivalence, and I want to touch on Amendment 125. In what I intend to suggest there is a fix to the problem, if it really exists, which will also have the benefit of underpinning this legislation with rights that people understand and that are applicable not merely to the state but to private companies. The problem that seems to have arisen—there are byproducts of Brexit that surface from time to time—lies in the whole history of the way in which we left the European Union. We left initially under the withdrawal Act, which preserved retained EU law. No doubt many of us remember the debates that took place. The then Government were wholly opposed to keeping the charter. In respect of the protection of people’s data being processed, that is probably acceptable, on the basis that the rights of the charter had merged into ordinary retained EU law through the decisions of the Court of Justice of the European Union. All was relatively well until the Retained EU Law (Revocation and Reform) Act, which deleted most general principles of retained EU law, including fundamental rights, from the UK statute book. What then happened, as I understand it, was that a fix to this problem was attempted by the Data Protection (Fundamental Rights and Freedoms) (Amendment) Regulations 2023, which tidied up the UK GDPR by making clear that any references to fundamental rights and freedoms were to be read as references to convention rights within the meaning of the Human Rights Act.
For good and understandable reasons, the Human Rights Act applies to public authorities and, in very limited circumstances, to private bodies, but not as a whole. That is generally accepted, and it is certainly accepted in the human rights memorandum in respect of this Bill. The difficulty with the Bill, therefore, is that the protections under the Human Rights Act apply only to public authorities and not to private bodies, whereas, generally speaking, the Charter of Fundamental Rights operated horizontally as well, protecting individuals in respect of the processing or use of data by private companies.
This seems to cause two problems. First, it is critical that there is no doubt about this, and I look forward to hearing from the Minister the view of the Government’s legal advisers as to whether there is any doubt. Secondly, the amendment goes to the second of the two objectives we are trying to achieve, which is to instil an understanding of the principles so that the ordinary member of the public can have trust. I defy anyone, even the experts who drafted this, to think that it is intelligible to any ordinary human being. It is simply not. I am sorry to be so rude about it, but this is the epitome of legislation that, because of its sheer complexity, is impossible to understand.
Of course, it could be made a lot better by a short series of principles introduced in the Bill, the kind of thing we have been talking about at times today, with a short introductory summary of what the rights under the Bill are. I hope consideration can be given to that, but that is not the purpose of my amendment. The fix that I suggest—to both the point of dealing with rights in a way that people can understand and the point on equivalence—is a very simple application, for the purposes of data processing, of the rights and remedies under the Human Rights Act, extending them to private bodies. One could then properly point, in going through the way the Bill operates, to fundamental rights that people understand and that are applicable not merely where a public authority processes data but where private bodies do so. That is what I wanted to say about Amendment 87.
Because it is closely allied to the equivalence point, I want to add a word of support for the amendment in the name of the noble Lord, Lord Clement-Jones, for whose support I am grateful in respect of Amendment 87. That amendment relates to the need for a thorough review of equivalence. Obviously, negotiations will take place, but it really is important that thorough attention is given to the adequacy of our legislation, to ensure that there is no incompatibility with the EU regime that could cost us adequacy. Those are the two amendments to which I wished to speak in this group. It would be wrong for me to go on and deal with the others: some are very narrow and some very broad, and it is probably easiest to listen to those who will speak to those amendments in due course. On that basis, therefore, I beg to move.
My Lords, I will speak to Amendments 139, 140 and 109A—which was a bit of a late entry this morning—in my name. I express my thanks to those who have co-signed them.
(2 weeks, 5 days ago)
Grand Committee
In an act that I hope he is going to repeat throughout, the noble Lord, Lord Clement-Jones, has fully explained all the amendments that I want to support, so I put on record that I agree fully with all the points he made. I want to add just one or two other points. They are mainly in the form of questions for the Minister.
Some users are more vulnerable to harms than others, so Amendment 33 would insert a new subsection (2B), which mentions redress. What do the Government envisage for those who may be more vulnerable, and how do they think such users might use this system? Obviously, I am thinking about children, but there could be other categories of users, certainly the elderly.
That led me to wonder what consideration has been given to vulnerable users more generally and how that is being worked through. It also led me to question exactly how this system will interact with the age-assurance work that the ICO is doing as a result of the Online Safety Act, and how to make sure that children are not forced into a position where they have to show their identity in order to prove their age or, indeed, cannot prove their identity because they are deemed to have been dealt with elsewhere, in another piece of legislation. Children do, after all, open bank accounts and have to have certain sorts of ID.
That led me to ask what in the framework prevents service providers from giving more information than is required. I have read the Bill; someone said earlier that it is skeletal. From what we know, you can separate pieces of information, or attributes, from each other, but what is to stop a service provider failing to do so? This is absolutely crucial to trust in, and the workings of, this system. It also leads me to the inverse, Amendment 46, which asks how we can prevent this system being forced upon people. As the noble Lord, Lord Clement-Jones, set out, we need to make sure that people have the right not to use the system as well as the right to use it.
Finally, I absolutely agree with the noble Viscount, Lord Colville, and the amendment in the name of the noble Viscount, Lord Camrose: something this fundamental must come back to Parliament. With that, I strongly associate myself with the words of the noble Lord, Lord Clement-Jones, on all his amendments.
I thank noble Lords for their comments and contributions in what has been an absolutely fascinating debate. I have a couple of points to make.
I agree with the noble Lord, Lord Clement-Jones, on his Amendment 33, on ongoing monitoring, and on his Amendment 50. Where we part company, I think, is on his Amendment 36. I feel that we will never agree about the effectiveness or otherwise of five-year strategies, particularly in the digital space. I simply do not buy that his amendment will have the effects that the noble Lord desires.
I do not necessarily agree with the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, that we should put extra burdens around the right to use non-digital methods. In my opinion, and I very much look forward to hearing from the Minister on this matter, the Act preserves that right quite well as it is. I look forward to the Government’s comments on that.
I strongly support the noble Viscount, Lord Colville, on his very important point about international standards. I had intended to sign his amendment but I am afraid that, for some administrative reason, that did not happen. I apologise for that, but I will sign it because it is so important. In my opinion, not much of the Bill works in the absence of effective international collaboration around these matters. We are particularly going to run up against this issue when we start talking about ADM, AI and copyright, since it is international standards that will allow us to enforce any of the provisions that we put in here. I am agnostic on whether this happens via the W3C, the ITU or other international standards bodies, but we really must go forward on the principle that international standards are what will get us over the line. I look forward to hearing the Minister’s confirmation of the importance, in the Government’s view, of such standards.
Let me turn to the amendments listed in my name. Amendment 37 would ensure parliamentary oversight of the DVS trust framework. Given the volume of sensitive data that these service providers will be handling, it is important that Parliament can keep an eye on how the framework operates. I thank noble Lords for supporting this amendment.
Amendment 40 is a probing amendment. To that end, I look forward to hearing the Minister’s response. Accredited conformity assessment bodies are charged with assessing whether a service complies with the DVS framework. As such, they are giving a stamp of approval from which customers will draw a sense of security. Therefore, the independence of these accreditation bodies must be guaranteed. Failing to do so would allow the industry to regulate itself. Can the Minister set out how the Government will guarantee the independence of these accreditation bodies?
Amendment 49 is also a probing amendment. It is designed to explore the cybersecurity measures that the Government expect of digital verification services. Given the large volume of data that these services will be handling, it is essential that the Government demand substantial cybersecurity measures. This is a theme that we are going to come back to again and again; we heard about it earlier, and I think that we will come on to more of this. As these services become more useful and more powerful, they present a bigger attack surface that we have to defend, and I look forward to hearing how we will do that.
I might need to write to the noble Viscount, but I am pretty sure that that is happening at an official level on a fairly regular basis. The noble Viscount raises an important point. I reassure him that those discussions are ongoing, and we have huge respect for those international organisations. I will put the detail of that in writing to him.
I turn to Amendment 37, tabled by the noble Viscount, Lord Camrose, which would require the DVS trust framework to be laid before Parliament. The trust framework contains auditable rules to be followed by registered providers of digital verification services. The rules, published in their third non-statutory iteration last week on GOV.UK, draw on and often signpost existing technical requirements, standards, best practice, guidance and legislation. It is a hugely technical document, and I am not sure that Parliament would make a great deal of sense of it if it was put forward in its current format. However, the Bill places consultation on a statutory footing, ensuring that it must take place when the trust framework is being prepared and reviewed.
Amendments 36 and 38, tabled by the noble Lord, Lord Clement-Jones, would create an obligation for the Secretary of State to reconsult and publish a five-year strategy on digital verification services. It is important to ensure that the Government have a coherent strategy for enabling the digital verification services market. That is why we have already consulted publicly on these measures, and we continue to work with experts. However, given the nascency of the digital identity market and the pace of those technological developments, as the noble Viscount, Lord Camrose, said, forecasting five years into the future is not practical at this stage. We will welcome scrutiny through the publication of the annual report, which we are committed to publishing, as required by Clause 53. This report will support transparency through the provision of information, including performance data regarding the operation of Part 2.
Amendment 39, also tabled by the noble Lord, Lord Clement-Jones, proposes to exclude certified public bodies from registering to provide digital verification services. We believe that such an exclusion could lead to unnecessary restrictions on the UK’s young digital verification market. The noble Lord mentioned the GOV.UK One Login programme, which is aligned with the standards of the trust framework but is a separate government programme giving people a single sign-on service to access public services. It operates under different legal powers from those proposed here. We do not accept that we need to exclude public bodies from the scrutiny that would otherwise take place.
Amendment 46 seeks to create a duty for organisations that require verification and use digital verification for that purpose to offer, where reasonably practicable, a non-digital route and ensure that individuals are made aware of both options for verification. I should stress here that the provision in the Bill relates to the provision of digital verification services, not requirements on businesses in general about how they conduct verification checks.
Ensuring digital inclusion is a priority for this Government, which is why we have set up the digital inclusion and skills unit within DSIT. Furthermore, there are already legislative protections in the Equality Act 2010 in respect of protected groups, and the Government will take action in the future if evidence emerges that people are being excluded from essential products and services by being unable to use digital routes for proving their identity or eligibility.
The Government will publish a code of practice for disclosure of information, subject to parliamentary review, highlighting best practice and relevant information to be considered when sharing information. As for Amendment 49, the Government intend to update this code only when required, so an annual review process would not be necessary. I stress to the Committee that digital verification services will not be mandatory. It is entirely voluntary for businesses to use them, and it is up to individuals whether they use such a service. There seems to be a feeling that this will be imposed on people, and I would push back against that suggestion.
If the regulation-making power proposed by the noble Lord, Lord Clement-Jones, in Amendment 50 were used, it would place obligations on the Information Commissioner to monitor the volume of verification checks made using the permissive powers to disclose information created in the clause. The role of the commissioner is to regulate data protection in the UK, which already includes monitoring and promoting responsible data-sharing by public authorities. For the reasons set out above, I hope that noble Lords will feel comfortable in not pressing their amendments.
Can I double-check that nothing was said about the interaction between the Bill and the OSA in all of that? I understood the Minister to say that she would perhaps write to me about vulnerable people, but my question about how this interacts was not answered. Perhaps she will write to me on that issue as well.
Yes, the ICO is undertaking work on age assurance under the OSA at the moment. My point was about how the two regimes intersect and how children get treated under each. Do they fall between?
I will, of course, write to the noble Baroness.
My Lords, I support the amendments in the name of the noble Lord, Lord Clement-Jones. I perhaps did not say it at the beginning of my remarks on this section, but I fully support the Government’s efforts to create a trust framework. I think I started with criticism rather than with the fact that this is really important. Trust is in the name and if we cannot trust it, it is not going to be a trust framework. It is important to anticipate and address the likelihood that some will seek to abuse it. If there are not sufficient consequences for abusing it, I do not understand quite how we can have the level of trust needed for this to have wide adoption.
I particularly want to say that good systems cannot rely on good people. We know that and we see it. We are going to discuss it later in Committee, but good systems need checks and balances. In relation to this set of amendments, we need a disincentive for bad actors to mislead or give false information to government or the public. I am not going to rehearse each amendment that the noble Lord, Lord Clement-Jones, explained so brilliantly. The briefing on the trust framework is a very important one for us all. The amount of support there is for the idea, and the number of questions about what it means and how it will work, mean that we will come back to this if we do not have a full enough explanation of the disincentives for a bad actor.
My Lords, I support these amendments and applaud the noble Lord, Lord Clement-Jones, for his temerity and for offering a variety of choices, making it even more difficult for my noble friend to resist it.
It has puzzled me for some time why the Government do not wish to see a firm line being taken about digital theft. Identity theft in any form must be the most heinous of crimes, particularly in today’s world. This question came up yesterday in an informal meeting about a Private Member’s Bill due up next Friday on the vexed question of the sharing of intimate images and how the Government are going to respond to it. We were sad to discover that there was no support among the Ministry of Justice officials who discussed the Bill with its promoter for seeing it progress any further.
At the heart of that Bill is the same question about what happens when one’s identity is taken and one’s whole career and personality are destroyed by those who take one’s private information and distort it in such a way that those who see it regard the subject as a different person, or as somehow involved in activities in which the original person would never have been involved. Yet we hear that the whole basis on which this digital network has been built is a voluntary one, the logic of which is that the sort of amendments before us now would not be necessary.
I urge the Government to think very hard about this. There must be a break point here. Maybe the meeting that has been promised will help us, but there is a fundamental point about whether in the digital world we can rely on the same protections that we have in the real world—and, if not, why not?
(1 month ago)
Lords Chamber
My Lords, I declare my interests as chair of the 5Rights Foundation and as an adviser to the Institute for Ethics in AI at Oxford.
I start by expressing my support for the removal of some of the egregious aspects of the last Bill that we battled over, and by welcoming the inclusion of access to data for researchers—although I believe there are some details to discuss. I am extremely pleased finally to see provisions for the coroner’s access to data in cases where a child has died. On that alone, I wish the Bill swift passage.
However, a Bill being less egregious is not sufficient on a subject fundamental to the future of UK society and its prosperity. I want to use my time this afternoon to ask how the Government will not simply make available, but secure the full value of, the UK’s unique datasets; why they do not fully make the UK AI-ready; and why proposals that they did not support in opposition have been included and amendments that they did support have been left out.
We have a unique opportunity, as the noble Lord, Lord Markham, just described, with unique publicly held datasets, such as the NHS’s. At a moment at which the LLMs and LMMs that will power our global future are being built and trained, these datasets hold significant value. Just as Britain’s coal reserves fuelled global industrial transformation, our data reserves could have a significant role to play in powering the AI transformation.
However, we are already giving away access to national data assets, primarily to a handful of US-based tech companies that will make billions selling the products and services built upon them. That creates the spectre of having to buy back drugs and medical innovations that simply would not have been possible without the incalculably valuable data. Reimagining and reframing publicly held data as a sovereign asset accessed under licence, protected and managed by the Government acting as custodian on behalf of UK citizens, could provide direct financial participation for the UK in the products and services built and trained on its data. It could give UK-headquartered innovators and researchers privileged access to nationally held datasets, or to investing in small and medium-sized specialist LLMs, which we will debate later in the week. Importantly, it would not simply monetise UK data but give the UK a seat at the table when setting the conditions for use of that data. What plans do the Government have to protect and value publicly held data in a way that maximises its long-term value and the values of the UK?
Similarly, the smart data schemes in the Bill do not appear to extend the rights of individual data holders to use their data in productive and creative ways. The Minister will recall an amendment to the previous data Bill, based on the work of associate professor Reuben Binns, that sought to give individuals the ability to assign their data rights to a third party for agreed purposes. The power of data is fully realised only when it is combined. Creating communal rights for UK data subjects could create social and economic opportunities for communities and smaller challenger businesses. Again, this is a missed opportunity to support the Government’s growth agenda.
My second point is that the Bill fails to tackle present-day or anticipated uses of data by AI. My understanding is that the AI Bill is to be delayed until the Government understand the requirements of the new American Administration. That is concerning on many levels, so perhaps the Minister can say something about that when she winds up. Whatever the timing, since data is, as the Minister said, in the DNA of AI infrastructure, why does the Bill so spectacularly fail to ensure that our data laws are AI-ready? As the News Media Association says, the Bill is silent on the most pressing data policy issue of our time: namely, that the unlicensed use of data created by the media and broader creative industries by AI developers represents IP theft on a mass scale.
Meanwhile, a single-sentence petition that says,
“The unlicensed use of creative works for training generative AI is a major, unjust threat to the livelihoods of the people behind those works, and must not be permitted”,
has been signed by nearly 36,000 organisations and individuals from the creative community. This issue was the subject of a cross-party amendment to which Labour put its name, which would have put the voluntary web standards represented by the robots.txt protocol on a mandatory opt-in basis—likely only one of several amendments needed to ensure that web indexing does not become a proxy for theft. In 2022, it was estimated that the UK creative industries generated £126 billion in gross value added to the economy and employed 2.4 million people. Given their importance to our economy, our sense of identity and our soft power, why do we have a data Bill that is silent on data scraping?
In my own area of particular concern, the Bill does not address the impact of generative AI on the lives and rights of children. For example, instead of continuing to allow tech companies to use pupil data to build unproven edtech products based on drill-and-practice learning models—which in any other form is a return to Victorian rote learning but with better graphics—the Bill could and should introduce a requirement for evidence-based, pedagogically sound paradigms that support teachers and pupils. In the recently announced scheme to give edtech companies access to pupil data, I could not see details about privacy, quality assurance or how the DfE intends to benefit from these commercial ventures which could, as in my previous NHS example, end with schools or the DfE having to buy back access to products built on UK pupil data. There is a quality issue, a safety issue and an ongoing privacy issue in our schools, and yet nothing in the Bill.
The noble Baroness and I met to discuss the need to tackle AI-generated sexual abuse, so I will say only that each day that it is legal to train AI models to create child sexual abuse material brings incalculable harm. On 22 May, specialist enforcement officers and I, along with the noble Viscount, Lord Camrose, were promised that the ink was almost dry on a new criminal offence. It cannot be that what was possible on that day now needs many months of further drafting. The Government must bring forward in this Bill the offence of possessing, sharing, creating or distributing an AI file that is trained on or trained to create CSAM, because this Bill is the first possible vehicle to do so. Getting this on the books is a question of conscience.
My third and final point is that the Bill retains some of the deregulatory aspects of its predecessor, while simultaneously missing the opportunity of updating data law to be fit for today. For example, the Bill extends research exemptions in the GDPR to
“any research that can reasonably be described as scientific”,
including commercial research. The Oxford English Dictionary says that “science” is
“The systematic study of the structure and behaviour of the physical and natural world through observation, experimentation, and the testing of theories against the evidence obtained”.
Could the Minister tell the House what is excluded? If a company instructs its data scientists and computing engineers to develop a new AI system of any kind, whether a tracking app for sport or a bot for an airline, is that scientific research? If their behavioural scientists are testing children’s response to persuasive design strategies to extend the stickiness of their products, is that scientific research? If the answer to these questions is yes, then this is simply an invitation to tech companies to circumvent privacy protections at scale.
I hope the noble Baroness will forgive me for saying that it will be insufficient to suggest that this is just tidying up the recitals of the GDPR. Recital 159 was deemed so inadequate that the European Data Protection Supervisor formally published the following opinion:
“the special data protection regime for scientific research is understood to apply where … the research is carried out with the aim of growing society’s collective knowledge and wellbeing, as opposed to serving primarily one or several private interests”.
I have yet to see that the Government’s proposal reflects this critical clarification, so I ask for some reassurance and query how the Government intend to account for the fact that, by putting a recital on the face of the Bill, it changes its status.
In the interests of time, I will put on the record that I have a similar set of issues about secondary processing, recognised legitimate interests, the weakening of purpose limitation, automated decision-making protections and the Secretary of State’s power to add to the list of special category data per Clause 74. These concerns are shared variously by the ODI, the Ada Lovelace Institute, the Law Society, Big Brother Watch, Defend Digital Me, 5Rights, Connected by Data and others. Collectively, these measures look like the Government are paving a runway for tech access to the private data of UK citizens or, as the Secretary of State for DSIT suggested in his interview in the Times last Tuesday, that the Government no longer think it is possible to regulate tech giants at all.
I note the inclusion of a general duty on the ICO to consider the needs of children, but it is a poor substitute for giving children wholesale protection from any downgrading of their existing data rights and protections, especially given the unprecedented obligations on the ICO to support innovation and stimulate growth. As the Ada Lovelace Institute said,
“writing additional pro-innovation duties into the face of the law … places them on an almost equivalent footing to protecting data subjects”.
I am not sure who thinks that tech needs protection from individual data rights holders, particularly children, but unlike my earlier suggestion that we protect our sovereign data assets for the benefit of UK plc, the potential riches of these deregulation measures disproportionately accrue to Silicon Valley. Why not use the Bill to identify and fix the barriers the ICO faces in enforcing the AADC? Why not use it to extend existing children’s privacy rights into educational settings, as many have campaigned for? Why not allow data subjects more freedom to share their data in creative ways? The Data (Use and Access) Bill has little in it for citizens and children.
Finally, but by no means least importantly, is the question of the reliability of computers. At col. GC 576 of Hansard on 24 April 2024, the full tragedy of the postmasters was set out by the noble Lord, Lord Arbuthnot, who is in his place and will say more. The notion that computers are reliable has devastated the lives of postmasters wrongly accused of fraud. The Minister yesterday, in answer to a question from the noble Lord, Lord Holmes, suggested that we should all be “more sceptical” in the face of computer evidence, but scepticism is not legally binding. The previous Government agreed to find a solution, albeit not a return to 1999. If the current Government fail to accept that challenge, they must shoulder responsibility for the further miscarriages of justice which will inevitably follow. I hope the noble Baroness will not simply say that the reliability of computers and the other issues raised are not for this Bill. If they are not, why not? Labour supported them in opposition. If not, then where and how will these urgent issues be addressed?
As I said at the outset, a better Bill is not a good Bill. I question why the Government did not wait a little longer to bring forward a Bill that made the UK AI ready, understood data as critical infrastructure and valued the UK’s sovereign data assets. It could have been a Bill that did more work in reaching out to the public to get their consent and understanding of positive use cases for publicly held data, while protecting their interests—whether as IP holders, communities that want to share data for their own public good or children who continue to suffer at the hands of corporate greed. My hope is that, as we go to Committee, the Government will come forward with the missing pieces. I believe there is a much more positive and productive piece of legislation to be had.
With respect, it is the narrow question that a number of us have raised. Training the new AI systems is entirely dependent on their being fed vast amounts of material, which they can absorb, process and reshape in order to answer the questions asked of them. That information is, to all intents and purposes, somebody else’s property. What will happen to resolve this? At the moment, they are not paying for it but just taking it—scraping it.
Perhaps I may come in too. Specifically, how does the data protection framework change this? We have had the ICO suggesting that the current framework works perfectly well and that it is the responsibility of the scrapers to let the IP holders know, while the IP holders have not a clue that their content is being scraped. It has already been scraped, and there is no mechanism. I think we are a little confused about what the plan is.
I can certainly write to noble Lords setting out more details on this. I said in response to an Oral Question a few days ago that my honourable friend Minister Clark in DSIT and Chris Bryant, whom the noble Lord, Lord Russell, mentioned, are working jointly on this. They are looking at a proposal that can come forward on intellectual property in more detail. I hope that I can write to noble Lords and set out more detail on that.
On the question of the Horizon scandal and the validity of computers, raised, quite rightly, by the noble Lords, Lord Arbuthnot and Lord Holmes, and the noble Baroness, Lady Kidron, I think we all understand that the Horizon scandal was a terrible miscarriage of justice, and the convictions of postmasters who were wrongly convicted have been rightly overturned. Those Post Office prosecutions relied on assertions that the Horizon system was accurate and reliable, which the Post Office knew to be wrong. This was supported by expert evidence, which it knew to be misleading. The issue was not, therefore, purely about the reliability of the computer-generated evidence. Almost all criminal cases rely to some extent on computer evidence, so the implications of amending the law in this area are far-reaching, a point made by several noble Lords. The Government are aware that this is an issue, are considering this matter very carefully and will announce next steps in due course.
Many noble Lords, including the noble Lords, Lord Clement-Jones, Lord Vaux and Lord Holmes of Richmond, and the noble and learned Lord, Lord Thomas, raised automated decision-making. I noted in my opening speech how the restored accountability framework gives us greater confidence in ADM, so I will not go over that again in detail. But to explain the Bill’s drafting, I want to reassure and clarify for noble Lords that the Bill means that the organisation must first inform individuals if a legal or significant decision has been taken in relation to them based solely on automated processing, and then they must give individuals the opportunity to challenge such decisions, obtain human intervention for them and make representations about them to the controller.
The regulation-making powers will future-proof the ADM reforms in the Bill, ensuring that the Government will have the powers to bring greater legal certainty, where necessary and proportionate, in the light of constantly evolving technology. I reiterate that there will be the right to human intervention, and it will be on a personal basis.
The noble Baroness, Lady Kidron, and the noble Lords, Lord Russell of Liverpool and Lord Clement-Jones, raised concerns about edtech. The Government recognise that concerns have been raised about the amount of personal data collected by education technology used in schools, and whether this is fully transparent to children and parents. The Department for Education is committed to improving guidance and support for schools to help them better navigate this market. For example, its Get Help with Data Protection in Schools project has been established to help schools develop guidance and tools to help them both understand and comply with data protection legislation. Separately, the ICO has carried out a series of audits on edtech service providers, assessing privacy risks and potential non-compliance with data protection regulations in the development, deployment and use of edtech solutions in schools.
The creation of child sexual abuse material, CSAM, through all mediums including AI—offline or online—is and continues to be illegal. This is a forefront priority for this Government and we are considering all levers that can be utilised to fight child sexual abuse. Responsibility for the law in this area rests with the Home Office; I know it is actively and sympathetically looking at this matter and I understand that my colleague the Safeguarding Minister will be in touch with the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, ahead of Committee.
I can see that I am running out of time so, rather than testing noble Lords’ patience, will draw my comments to a close. I have not picked up all the comments that colleagues made, but I thank everybody for their excellent contributions. This is the beginning of a much longer conversation, which I am very much looking forward to, as I am to hearing all those who promised to participate in Committee. I am sure we will have a rich and interesting discussion then.
I hope I have persuaded some noble Lords that the Bill is not only wide ranging but has a clear and simple focus, which is about growing the economy, creating a modern, digital government and, most importantly, improving people’s lives, which will be underpinned by robust personal data protection. I will not say any more at this stage. We will follow up but, in the meantime, I beg to move.
(3 months, 1 week ago)
Lords Chamber
My Lords, it is an absolute pleasure to follow the noble Lord, Lord Holmes, not just because I am going to speak thematically, alongside him, but because he speaks so wonderfully.
I thank the committee for its excellent report, and the excellent introduction by its chair, the noble Lord, Lord Hollick. I will restrict my remarks to two issues: the AI skills shortage across government and regulators, and the recommendation for a Joint Committee of both Houses to oversee digital regulation. I refer the House to my interests, in particular as adviser to the Institute for Ethics in AI at Oxford and as chair of the LSE’s Digital Futures research centre.
AI technology is not new, and nor is competition for digital expertise to support public policy. However, in recent years, we have seen a raft of digital regulation across data, competition, safety, consumer products and so on, as well as a step change in the scale at which AI is being deployed across business, public services and direct to citizens. Together, these have created an eye-watering competition for AI skills. The US and China dominate the charts of AI talent, together currently employing 75% of highly skilled workers; that is up from 58% in 2019. One sector analysis found that there are only 10,000 people in the world with the kinds of skills that new applications of AI need. I ask the House to do the maths: if the US and China have 7,500 of those people, that leaves very few for the rest of us.
What was once an issue concentrated in the tech sector, or in businesses with complex delivery or service functions, is now an issue for everyone, including the regulators. Increasingly, we hear government Ministers suggest that AI is the tool by which they will reform the NHS, justice and welfare, and that it is central to their growth agenda. This requires regulators and government itself to enter an intensely competitive market for skills in which they either pay eye-watering sums to attract talent or outsource to companies, most often headquartered outside the UK, with which they frequently make data sharing and processing arrangements that accrue long-term value disproportionately away from the UK.
There are a number of actions that government might consider, from funding graduate programmes and retraining professionals in associated fields to adding digital skills to those with domain expertise, compulsory training for civil servants and government lawyers, attractive packages for foreign nationals, and so on. But without a concerted and urgent effort, our hopes to be a centre of innovation, and for the transformation of public services and functions of government, such as drafting legislation or fulfilling oversight functions, will be blighted by a lack of access to adequate skills.
This leads neatly to my second point. The pre-legislative committee on the online harms Bill, on which I was privileged to serve, recommended a Joint Committee of both Houses to oversee digital regulation, setting out five primary functions for that committee: scrutinising digital regulators; scrutinising government drafting of legislation about digital technologies; reviewing relevant codes of practice; monitoring new developments such as the creation of emerging technologies; and publishing independent research or whistleblower testimonies. During the passage of the Online Safety Bill, the data Bill and the competition Bill, the creation of a Joint Committee was supported by Members of both Houses and from all parties, most notably championed by the noble Baroness, Lady Stowell, but including the Minister herself.
There is not time in this debate to go into detail about emerging gaps between the intentions of Parliament and digital regulators, the speed of regulatory codes versus the speed of technological development, the twin evils of hacking and scraping of intellectual property, commercial access to publicly held data, or the ferocious lobbying of government and regulator by the most powerful companies in the world. However, along with the issues raised by the report of conflicting objectives, inadequate expertise in government and regulator, and the habit of information overload instead of transparency, each of these things would be well served by Parliament having oversight and expertise from dedicated committee members and staff.
This is a time in which digital solutions, particularly those driven by AI, come before the House in ever greater numbers, with unprecedented impact on every area of public and private life. If we do not ourselves grasp the problem of skills, we will squander our sovereign resources and find ourselves renters of services and products that should be built in the UK. If we do not improve our oversight of digital regulation, we will squander our chance to be a rule-maker and not a rule-taker of the new world.
(3 months, 2 weeks ago)
Lords Chamber
My Lords, Ofcom has a very wide-ranging and serious set of responsibilities. There is no suggestion that it is not carrying out its responsibilities in the run-up to the implementation of the Online Safety Act. We are working very closely with Ofcom and believe that it will carry out those additional functions that we have given it with proper scrutiny and to high standards. Yes, there is a case for looking at all regulators; we have a debate on this on Monday in the House, and I am looking forward to that, but that is a wider issue. For the moment, we have to give Ofcom all the support that we can in implementing a very difficult set of regulations.
My Lords, the crafting of the Online Safety Act was fraught with exceptions, exclusions and loopholes, the most egregious of which is that regulated companies get safe harbour if they comply with Ofcom’s codes, but Ofcom has provided us with codes that have huge gaps in known harms. What plans do the Government have to fulfil their election promise to strengthen the OSA by ensuring that it protects all children effectively, even the very young, and that it has adequate mechanisms to act swiftly in a crisis, or with an evolving new risk, to stop abuse being whipped up algorithmically and directed at minority groups?
My Lords, I think that we are in danger of downplaying the significance of the Online Safety Act. It is a trail-blazing Act; the noble Baroness was very much involved in it. Our priority has to be to get that Act implemented. Under it, all user-to-user services and search providers, regardless of size, will have to take swift and effective action against illegal content, including criminal online abuse and posts of a sexual nature. We should get behind Ofcom and the Online Safety Act, and we will then obviously have to keep that Act under review, but we have the tools to do that.
(9 months, 1 week ago)
Lords Chamber
My Lords, I declare my interest as deputy chair of the Telegraph Media Group and my other interests as set out in the register. I will focus briefly on three crucial amendments in this group—on proportionality, the appeals standard, and the Secretary of State’s powers—echoing points that have already been made strongly in this debate.
I fully support Amendments 13 and 35 in the name of the noble Lord, Lord Faulks. The amendment made to the Bill in the Commons replacing “appropriate” with “proportionate” will significantly expand the scope for SMS firms to appeal the CMA’s decisions to create conduct requirements and initiate pro-competitive interventions.
As we have already heard, the Government have sought to argue that, even absent the “proportionality” wording, in most cases the SMS firms will be able to argue that their ECHR rights will be engaged, therefore allowing them to appeal on the basis of proportionality. The question arises: why then introduce the “proportionality” standard for intervention at all, particularly when the CMA has never had the scope to act disproportionately at law?
In this context, it is clear that the main potential impact of the Bill as it now stands is that a court may believe that Parliament was seeking to create a new, heightened standard of judicial review. As the Government have rightly chosen to retain judicial review as the standard of appeal for regulatory decisions in Part 1, they should ensure that this decision is not undermined by giving big tech the scope to launch expensive, lengthy legal cases arguing that the Government have sought to create a new, expansive iteration of JR; all experience suggests that that is exactly what would happen. I fear that, if the amendments from the noble Lord, Lord Faulks, are not adopted, we may find in a few years’ time that we have introduced full merits review by the back door, totally undermining the purpose of this Act.
Amendments 43, 44, 46, 51 and 52 in the name of the noble Baroness, Lady Jones, are also concerned with ensuring that we do not allow full merits appeals to undermine the CMA’s ability to regulate fast-moving digital markets. Even though full merits appeals are confined to penalty decisions, financial penalties are, after all, as we have heard, the ultimate incentive to comply with the CMA’s requirements. We know that the Government want this to be a collaborative regime but, without a real prospect of meaningful financial penalties, an SMS firm will have little reason to engage with the CMA. There seems little logic, therefore, in making it easier for SMS firms to delay and frustrate the imposition of penalties.
There is also a danger that full merits appeals of penalty decisions will bleed back into regulatory decisions. The giant tech platforms will undoubtedly seek to argue that a finding of a breach of a conduct requirement, and the CMA’s consideration that an undertaking has failed to comply with a conduct requirement when issuing a penalty, are both fundamentally concerned with the same decision: “the imposition” of a penalty, with the common factor being a finding that a conduct requirement has been breached. The cleanest way to deal with this is to reinstate judicial review as the standard for appeals of all digital markets decisions. That is why, if the noble Baroness, Lady Jones, presses her amendments, I will support them.
Finally, I strongly support Amendment 56 in the name of my noble friend Lord Lansley, which would ensure that the Secretary of State must approve CMA guidance within a 40-day deadline. This would allow the Government to retain oversight of the pro-competition regime’s operations, while also ensuring that the operationalisation of the regime is not unduly delayed. It will also be important in ensuring that updates to the guidance are made promptly; such updates are bound to be necessary to iron out unforeseen snags or to react to rapidly developing digital markets. Absent a deadline for approval, there is a possibility that the regulation of big tech firms will grind to a halt mid-stream. That would be a disaster for a sector in which new technologies and business models are developed almost daily. I strongly support my noble friend and will back him if he presses his amendment to a vote.
With the deadline to comply with the Digital Markets Act in Europe passing only last week, big tech’s machinations in the EU have provided us with a window into our future if we do not make this legislation watertight. As one noble Lord said in Committee—I think it was the noble Lord, Lord Tyrie—we do not need a crystal ball when we can read the book. We have the book, and we do not like what we see in it. We must ensure that firms with an incredibly valuable monopoly to defend and limitless legal budgets with which to do so are not able to evade compliance in our own pro-competition regime.
My Lords, I will speak to Amendments 43, 44, 46, 51 and 52, to which I have added my name, and Amendment 59. Before I do, I register my support for Amendments 13 and 35, which were brilliantly set out by my noble friend Lord Faulks and added to by others. I too shall support them if they choose to ask the opinion of the House.
I also support Amendment 56 in the name of the noble Lord, Lord Lansley. I have lived experience of waiting too long for the code to come back from the Secretary of State. Even without being a bad actor, it is in the nature of Secretaries of State to have a burgeoning in-tray, and it is in the nature of codes to be on a subject that politicians have moved on from by the time they arrive. I fully support him, and 40 days seems like a modest ask given the importance of the Bill overall.
I turn to the amendments in the name of the noble Baroness, Lady Jones. I look forward to her setting them out after I have supported them. They would reinstate judicial review as the appeal standard for penalty decisions. I thank the Minister for the generosity of his time; I know he spoke not only to me but to a number of noble Lords. However, the thing I have taken away from discussions with the Government and during Committee is the persistent drumbeat asserting that we are giving huge new and untested powers to the CMA, and that the various counterweights (we can fill in as we like: full merits on penalty, countervailing benefits, proportionality, Secretary of State powers) have been introduced simply to give a little balance. I find that unacceptable given the power of the companies and the asymmetry we are trying to address.
The reality is that the powers given to the CMA, while much needed, are dwarfed by the power of the companies they seek to regulate. The resources available to the CMA, while welcome, are dwarfed by the resources available to a single brand of a single SMS. Most of all, the CMA’s experience of regulating digital companies is dwarfed by the experience of digital companies in dodging regulation. I am struggling to understand the imbalance of power that the Government are seeking to address.
I was in Brussels on Wednesday last week, and there is a certain regret about the balancing that the EU allowed in the DMA in the face of the tech lobby, only to see Apple, TikTok and Meta gleefully heading to the courts and snarling up the possibility of being regulated as intended for many years—or perhaps at all. This issue was raised by the noble Lord, Lord Black. Adding a full merits appeal on penalty will embolden the sector to use the threat of appeal to negotiate its position at earlier points in the process. It will undermine the regulator’s strength in coming to a decision. Very possibly, as other noble Lords have said, it could bleed backwards into areas of compliance and conduct requirements. It is, as the noble Baroness, Lady Harding, said, creating a hole for water to get in. The companies lobbied furiously for full merits on penalties. This is not an administrative point; it goes to the heart of the regime. Full merits give the regulated leverage over the regulator.
The most straightforward way of ensuring that the regulator does not abuse its new, enhanced power, as the Government appear to fear, is to make it accountable to Parliament, as the noble Baroness, Lady Stowell, set out in full, repeatedly and with great eloquence. I am sorry that we will not have an opportunity to make our feelings on that issue felt today, but I strongly support her saying that we should not drop this issue just because it is inconvenient to deal with at this point in the electoral cycle.
My Lords, I have four amendments in this group. Amendments 16 and 17 relate to the conduct requirements that the CMA can impose on designated undertakings, and Amendments 20 and 25 relate to countervailing benefits in relation to that conduct. I will come to that in a minute. Let me stick for a moment with Amendments 16 and 17.
Amendment 16 was helpfully introduced, to some extent, by what the noble Lord, Lord Clement-Jones, said about the activities in the run-up to the introduction of the Digital Markets Act in the European Union. There was a deadline of 7 March for that, and considerable attention has been paid to what Apple in particular has done in relation to it. The noble Lord made Apple’s position clear. It is saying, essentially, that developers can either stay with its existing system, under which it charges a 30% fee for apps on the App Store, or move to the alternative terms it has offered in order to comply with the DMA: a 17% fee for apps plus a 3% payment processing fee and, beyond a million downloads a year, a 50 cent core technology fee per download. Those who fear that their app may go viral, with millions of downloads, are potentially facing enormous costs for processing them through the App Store. As far as all the potential users of the Apple App Store are concerned, this potentially restricts their opportunity for competition rather than enabling it.
My first point is to reinforce that, in support of the Government, we have come together to design legislation that is more flexible than the Digital Markets Act. The DMA, in effect, puts the obligations into the originating Act. Changing them will be considerably more difficult than it will be for the Competition and Markets Authority, under our regime, to change the structure and content of conduct requirements. Potentially, we have really good flexibility.
Amendment 16 is linked to whether the powers to impose conduct requirements enable the CMA to act in relation to the leveraging of market power in digital activities into other activities—the wider system of an undertaking’s business. It is absolutely about whether the conduct requirements that can be imposed under Clause 20 are sufficiently wide to enable the Competition and Markets Authority to structure them to limit activity which restricts competition in the way that these efforts are being pursued in relation to the Digital Markets Act. To that extent, I ask the Minister, if he would be kind enough to respond in this light, whether, if a designated undertaking were to behave in that sort of way, the CMA would have the power under the conduct requirements to respond and act, and to do so rapidly, to frustrate that kind of anti-competitive result.
Amendment 17 is slightly different, in that we discussed it in Committee. One of the European Union Digital Markets Act obligations is expressly framed to prevent others seeking to stop someone making a complaint to any public authority about non-compliance with the relevant obligations. I looked to see whether our conduct requirements, specified in Clause 20, cover a similar circumstance. In discussion in Committee, the Minister directed me to the “fair and reasonable terms” provision, which is very wide-ranging but does not cover this, because these are not the terms of a contractual relationship between a designated undertaking and its users or potential users. It may not relate to that at all.
The Minister also directed me to the question of discrimination, but I do not think this is about discrimination between users; it is about preventing someone, who may be a user, a potential user or a potential competitor, from going to a public authority and saying, “This undertaking does not comply with its conduct requirements”. We know—I will not repeat the evidence that I gave in Committee—that there have, unhappily, been circumstances of intimidation of those who would complain to regulators about the conduct of organisations with significant market power. I return to this simply to say to the Minister that I am not yet convinced. Can he convince us that this kind of activity is covered by the conduct requirements? If it is not, will he undertake to ensure that the necessary changes are made to Clause 20, which the legislation would permit?
I will also speak to the amendments about countervailing benefits exemptions. Amendments 23 and 24 revert the Bill to its original wording, which would be better than where we are now. I have looked at Clause 29 from my point of view and I cannot find a good reason for it, so I thought it better to leave it out. If there is a conduct investigation and there are countervailing benefits, they should be presented to the CMA when the undertaking makes representations to that conduct investigation. Why would they be left to any other time or specified separately in the legislation?
I thought it better to amend Clause 27 such that, when making representations, the designated undertaking may give details of the benefits associated with its conduct to form part of that investigation. At that point, it should come forward if it is prepared to make commitments that the CMA could accept, without necessarily making a finding, to close that investigation.
All this should take place in Clause 27 on representations, because that is where the sequence lies. I do not understand why Clause 29 has been added at what appears to be a later stage in the sequence of the legislation. As it is a separate clause, it appears as though the benefits can be presented at an entirely separate point.
As I have also discussed with the Minister, there is an analogy with the exempt anti-competitive agreements under the Competition Act 1998. I served on the Standing Committee for that Bill, and this is a very different kettle of fish. The 1998 Act set out broad descriptions of agreements that would be deemed anti-competitive and therefore void, unless the undertakings concerned came to the Competition and Markets Authority; the burden is then on them to demonstrate that the agreements have, in effect, countervailing benefits, such as benefits to innovation and to the consumer, without an adverse effect on competition.
That is ex post regulation: it concerns agreements and obligations that are broad-ranging and apply across industry. Here, we are talking about conduct requirements that are optimised and designed in relation to the particular undertaking in the first place. This is ex ante regulation. You cannot compare ex post provisions in the Competition Act with ex ante regulation under this legislation. They are not the same kind of thing.
Therefore, again, I come back to the argument: let us not have exemptions. The use of “exemption” seems wholly inappropriate. We have here a very straightforward process. Conduct requirements themselves require, under Clause 24, a consultation. The undertaking should tell the CMA at that stage what the benefits associated with its conduct are.
There is a forward-looking process; the conduct requirement is supposed to look forward five years, but none the less, circumstances change. The CMA can review a conduct requirement, and the designated undertaking should come to the CMA if circumstances change and there are countervailing benefits and ask for the conduct requirement to be reviewed. Even if, under all these circumstances, a conduct investigation notice is issued, the undertaking should come forward and express what the benefits are at that point. Under none of these circumstances is there a requirement for the use of “exemption” or for an additional clause that offers countervailing benefits as such.
I dare say I will not press this, because there is probably more to be said for Amendment 23 and going back to the original wording, but it afforded me the opportunity, I hope, to explain why I think the whole proposition in Clause 29 seems misplaced.
My Lords, I find myself in a slightly awkward position because my name is listed in support of Amendments 23 and 24, but I find the argument of the noble Lord, Lord Lansley, incontrovertible, and maybe he should press his amendment.
On the wording, I want to put on the record the view of Which?:
“This is a legal loophole for big tech to challenge conduct requirements through lengthy, tactical, legal challenges. It would tie up CMA (i.e., taxpayer) resources and frustrate the intent of the legislation. Whilst we agree with the intent of this provision, which is to encourage innovation that will benefit consumers, it is critical that these provisions do not inadvertently give designated firms a get out of jail free card from DMU decisions”
by presenting opaque consumer benefits.
I put that on the record because it is so measured in comparison with many of the emails and representations I have received, and yet it is absolutely categoric that this is a get out of jail free card. Like the noble Lord, Lord Lansley, I do not understand why the regulator’s duty to be
“proportionate, accountable, consistent, transparent and targeted”,
within the context of coming to the conduct requirements and taking up any countervailing benefits at that point, is not adequate. So I will support the noble Baroness, Lady Jones, and, indeed, the noble Lord, Lord Lansley, should he change his mind in the next few minutes.
I also add my support to Amendment 60, tabled by the noble Lord, Lord Fox. I am an enthusiastic supporter of international standards. They provide for soft law and, having worked with the IEEE on a number of standards over the last few years, I see how brilliantly they work to bring disparate people together and provide practical steps for those tasked with implementation. I declare an interest in relation to the IEEE, which gives some funding to 5Rights Foundation, of which I am chair.
The point I leave with the House is that, towards the end of 2022, I had two conversations with companies that will certainly be SMS about why they were recruiting employees to work on standards full-time. I believe the CMA should be in the standards-writing game.
(10 months, 3 weeks ago)
Grand Committee
My Lords, as well as speaking to Amendment 80, I will say a few words about Amendment 83A in my name, which is in some ways related.
The point just made was extremely important and correct: in whose interests are these bodies acting? The answer should always be people—all of us. Commissioner Vestager, responsible for competition in Brussels, made exactly this point in evidence on several occasions and in a couple of major speeches. She is a far-sighted and bold competition Commissioner. In practice, we are all consumers, so the word “consumer” should probably catch it, but it may not convey quite as much to the public as we would like.
My amendment was triggered by an exchange that I had with the noble Lord, Lord Vaizey, earlier in the scrutiny of the Bill. In response to a question of his to the Minister, I suggested that the CMA always operates under a duty to be proportionate. When I said that, I had in mind not so much the implications of the Human Rights Act for its effect on proportionality but a more general duty to respect best regulatory practice, under which specialist regulators operate, as far as I know. Usually, this is understood to mean transparency, accountability, proportionality, consistency and, where relevant, action targeted only at cases that really require it. Some people talk about efficiency and economy in the same breath. Although I have not checked every statute, I expect that such a duty is to be found in several of them.
I have subsequently checked some of this out with the House of Commons Library and others. First, a duty such as I describe is written into the Water Act, the Gas Act, the Electricity Act and the Communications Act, among others, with very similar wording to that which I have just cited. In other words, Ofwat, Ofgem and Ofcom are all subject to such a duty. I have also checked that these duties are justiciable.
Secondly, I made another, unexpected, discovery. As a result of this legislation, the CMA will become an outlier among these specialist regulators. By this legislation, we are giving the CMA specific specialist responsibilities for the digital sector. In other words, it becomes a sector regulator. But, unlike with the other specialist regulators that I have just listed, no such statutory duty to adhere to the principles of best regulatory practice will be required of it. My amendment would correct that omission.
Late last week I discovered that the City of London Law Society had made roughly the same point in its submission on the Bill. The wording in my amendment is pretty much taken from that submission. At the time I tabled it, I had not discussed it with the City of London Law Society and, since then, I have had time only for a couple of minutes with it on the phone. I cannot think of a good reason for not applying this duty to the CMA, but I can think of plenty of reasons why it should be applied.
These duties on public bodies can appear to be little more than motherhood and apple pie but, as I have discovered over the years, they can influence behaviour in powerful public bodies in quite a big way, and usually for the better. I will illustrate that. Take an accounting officer who comes under pressure to do something that he or she considers inappropriate. That happens not infrequently, as those of us who have been on the inside, or on both sides, of the public body fence will know. With a statutory duty in place, the accounting officer is much better protected and placed to be able to say, “I’m not going to go ahead with that”. That is no doubt one of several reasons why these specialist regulators have these duties imposed on them: they serve as a reminder, a backstop, for securing good conduct from those at the top of organisations, particularly those with a high degree of statutory independence.
Now, the Government—on advice, no doubt—will point in response, probably in just a moment, to codes of conduct, guidelines and other documents that already require good regulatory practice. I can see the Minister smiling. I know most of these documents quite well—as a matter of fact, I contemplated reading them out myself, but I will spare the Committee that pain and leave it to him to take the flak. The department’s impact assessments should, in principle, do some of the heavy lifting as well, and they are audited by the NAO. I have seen that scrutiny in action, and it does far less to improve behaviour than a statutory obligation. It is the latter that really concentrates the mind.
More and more as we examine the Bill, the absence of a general duty on the CMA seems to be of a piece with the approach taken right across the draft legislation. We are creating a body with unprecedented powers and unprecedentedly feeble avenues for the securing of accountability. We are creating ideal conditions for executive overreach. All the necessary ingredients are being put in place as we legislate here.
First, there is the long history of patchy to poor scrutiny by Parliament, particularly by the Commons, of the CMA. As I may have pointed out on more than one occasion, I was its very first chairman ever to appear before the BEIS Select Committee, and I secured my audience by request—I said that I really would like to come along—which gives you an idea of the distance between the committee and the activities of the CMA. Of course—and I do not mean this disparagingly to anybody in this House—it is the Commons Select Committee that really counts when it comes to delivering punchy cross-examination and accountability, or at least counts most.
Parliament could do a better job, which I think was the point that the noble Baroness, Lady Stowell, made on Monday, but it would be a profound mistake, even if we got the improvements that she is proposing, to rely exclusively on Parliament to do the heavy lifting.
The first reason why we need this amendment is that we do not have much parliamentary scrutiny. Secondly, we have a body with a historically weak board, with most of the important decisions already delegated to the most senior executives, mixed-quality governance at best and a history of patchy to poor non-executive challenge of executive decisions. I realise that it is concerning that an ex-chairman should feel the need to put that on record, but it is necessary. Thirdly, as things stand, we are protecting the CMA from any substantive review at all of decisions on digital, which is a discussion we had earlier with respect to JR.
A fourth reason why this amendment is needed is that it now seems that the body is to be exempted from the core duties to conform to best regulatory practice which have been considered essential for all other sector regulators that I have checked out. My amendment would rectify that problem at least. I hope that the Minister will look favourably on the suggestion.
My Lords, I support Amendment 80, to which I added my name. I will also say a few words about Amendment 83A in the name of the noble Lord, Lord Tyrie.
I fear that the word “citizens” might meet the same fate as the word “workers”. The argument will be made that it extends the CMA’s remit in ways that might overburden it, create a lack of focus or introduce overlap. However, the digital world has several characteristics that support the amendment in the name of the noble Baroness, Lady Jones, which would add “citizens” to “consumers”.
I am sorry to intervene a second time. When the Minister is looking for counterexamples, I would be grateful if he kept to the major sector regulators, which are the direct comparator. There are more than 500 significant quangos, and I am sure I would be able to find a few quite quickly.
Before the Minister stands up, may I ask him whether, if he cannot find a counterexample, this amendment may find some favour with the Government?
I will actively seek a counterexample and consider the implications of my results.
The CMA has a strong track record of following best regulatory practice across all its functions as an experienced regulator. The Government’s view is therefore that it makes sense to legislate only when it is necessary to do so, and that here there does not appear to be a problem that requires a legislative solution. For these reasons, I hope the noble Baroness feels able to withdraw her amendment.
(10 months, 3 weeks ago)
Grand Committee
The noble Lord, Lord Knight, has said so much of my speech that I will be very rapid. There are two points to make here. One is that regulatory co-operation is a theme in every digital Bill. We spent a long time on it during the passage of the Online Safety Act, we will do it again in the Data Protection and Digital Information Bill, and here it is again. As the noble Lord, Lord Knight, said, even if the wording or the approach is not quite right, that does not matter; any move to bring regulators together is a good thing.
The second point, which may come up again in amendments in a later group that looks at citizens, is that it is increasingly hard to understand what a user, a worker or a citizen is in this complicated digital system. As digital companies have both responsibilities and powers across these different themes, it is important, as I argued last week, to ensure that workers are not forgotten in this picture.
My Lords, it is with great trepidation that I rise to speak to these amendments because, I think for the first time in my brief parliamentary career, I am not completely ad idem with the noble Lord, Lord Knight, and the noble Baroness, Lady Kidron, on digital issues, where normally we work together. I hope they will forgive me for not having shared some of my concerns with them in advance.
I kicked myself for not saying this last week, so I am extremely grateful that they have brought the issue back this week for a second run round. My slight concern is that history is littered with countries trying to stop innovation, whether we go back to the Elizabethans trying to stop looms for hand knitters or to German boatmen sinking the first steamboat as it went down the Rhine. We must be very careful that in the Bill we do not encourage the CMA to act in such a way that it stops the rude competition that will drive the innovation that will lead to growth and technology. I do not for a moment think that the noble Lord or the noble Baroness think that, but we have to be very cautious about it.
We also learn from history that innovation does not affect or benefit everybody equally. As we go through this enormous technology transformation, it is important that as a society we support people who do not immediately benefit or who might be considerably worse off, but I do not think that responsibility should lie with the CMA. Last week, the noble Lord, Lord Knight, challenged with, “If not in this Bill, where?”, and I feel similarly about this amendment. It is right that we want regulators to co-operate more, but it is important that our regulators have very clear accountabilities. Having been a member of the Court of the Bank of England for eight years in my past life, I hate the fact that there are so many things that the Bank of England must take note of in discharging its responsibilities. We have to be very careful that we do not create a regime for the CMA whereby it has to take note of a whole set of issues that are really the broad responsibility of government. Where I come back into alignment with the noble Lord, Lord Knight, is that I think it is important that the Government address those issues, just probably not in this Bill.
My Lords, I want to support Amendment 76, to which I have added my name, with some brief remarks because the noble Viscount, Lord Colville, has put the case with great power and eloquence. I also support Amendment 77 in the name of my noble friend Lady Stowell, which is a clever solution to the issue of accountability.
I support Amendment 76 in particular, both because I do not believe the requirement is necessary and because—this is a consistent theme in our Committee debates—it builds into the legislation a completely avoidable delay and poses a very real threat to the rapid enforcement of it. Quite apart from the issues of principle, which are significant, this is also intensely practical. The CMA’s guidance on the Bill, published earlier this month, set out the expected timetable for the consultation phase on the Bill’s implementation, running through to October 2024, which could be a very busy month. It is almost certainly when we will have a general election or be in the midst of one.
It seems highly unlikely that the Secretary of State will be able to approve guidance during the purdah of an election campaign and if, after the election—whoever wins it—we have a new Secretary of State, there will inevitably be a further delay while he or she considers the guidance before approving it. The Bill therefore ought to be amended to remove the requirement for the Secretary of State’s approval, or, at the very least, set a strict timetable for it, such as the draft guidance being automatically approved after 30 days unless it is specifically rejected. That would ensure that there is not unnecessary delay, which could run into many months, before the new regime takes effect—especially if there is, as a number of noble Lords have made clear, intense lobbying of the Secretary of State behind the scenes.
My Lords, I support both amendments in this group. This seems to be fundamentally a question of what happens in private and what happens in public. I was struck by the number of exchanges on the second day in Committee last week in which noble Lords raised the asymmetry of power between the regulator and the companies that may be designated SMS. The right reverend Prelate the Bishop of Manchester said,
“let us get this right so that Davids have a chance amid the Goliaths”.—[Official Report, 24/1/24; col. GC 230.]
I urge the noble Baroness to stay for the debate on the next group of amendments, in which we will talk about parliamentary accountability. I think she will find that the committee I am proposing is perhaps not quite as modest as she has just described it.
My Lords, I promise I will speak briefly to associate myself with the remarks of my noble friend Lady Stowell and support her Amendment 77 and Amendment 76 in the name of the noble Viscount, Lord Colville.
Despite the fact that there are fewer of us here than there have been in the debates on some of the other quite contentious issues, this is an extremely important amendment and a really important principle that we need to change in the Bill. To be honest, I thought that the power granted to the Secretary of State here was so egregious that it had to have been inserted as part of a cunning concession strategy to distract us from some of the other more subtle increases in powers that were included in the other place. It is extremely dangerous, both politically and technocratically, to put an individual Secretary of State in this position. I challenge any serious parliamentarian or politician to want to put themselves in that place, as my noble friend Lady Stowell said.
On its own, granting the Secretary of State this power will expose them to an enormous amount of lobbying; it is absolutely a lobbyist’s charter. This is about transparency, as the noble Baroness, Lady Kidron, said, and parliamentary scrutiny, which we will come to properly in our debate on the next group of amendments. However, it is also about reducing the risk of lobbying from the world’s most powerful institutions that are not Governments.
For those reasons, I have a slight concern. In supporting Amendment 77, I do not want the Government or my noble friend the Minister to think that establishing parliamentary scrutiny while maintaining the Secretary of State’s powers would be a happy compromise. It would be absolutely the wrong place for us to be. We need to remove the Secretary of State’s powers over guidance and establish better parliamentary scrutiny.
My Lords, I will be brief. It is an honour to follow the noble Lord, Lord Fox, and his passionate exposition of the importance of interoperability, while reminding us that we should be thinking globally, not just nationally. I did not come expecting to support his amendment but, as a result of that passion, I do.
I rise to support my noble friend Lady Stowell. She set out extremely clearly why stronger parliamentary oversight of the digital regulators is so important. I speak having observed this from all possible angles: I have been the chief executive of a regulated company; I have chaired a regulator, in the form of NHS Improvement; I have been on the board of a regulator, in the form of the Bank of England; and I am a member of my noble friend’s committee. I have genuinely seen this from all angles, and it is clear that we need a different approach in Parliament to recognise the enormous amounts of power we are passing to the different regulators. Almost all of us in Committee today talked about this when the Online Safety Bill was passing through our House, and it was clear then that we needed to look at it. We have given enormous power to Ofcom in the Online Safety Act; this Bill looks at the CMA; very soon, in this same Room, we will be looking at changing and increasing the powers of the ICO; and if we think that that is it, we have not even begun on what AI is going to do to touch a whole host of regulators. I add my voice to my noble friend’s and congratulate her on the process, in which she seems to be well advanced, of gathering support not just in this House but in the other place.
I also express some support for Amendment 83. I am concerned that, if we are not careful, the easiest way to ensure that the CMA is not bold enough will be not to resource it properly. During the passage of the Online Safety Act, we got to see how far advanced Ofcom was in bringing in genuine experts from the technology and digital sector; nothing so obvious has emerged as this Bill has progressed. That may be just because of the stage we are at, but I suspect it is also because the resourcing is not yet done in the CMA. Therefore, I ask the Minister for not so much an annual update as a current update on where the CMA is in resourcing and what support the Government are giving it to ensure it is able to meet a timetable that still looks painfully slow for this Bill.
My Lords, I rise mainly to correct the suggestion that I called the amendment in the name of the noble Baroness modest, and also to celebrate the fact that I am once again back on the side of the noble Baroness, Lady Harding; it was very uncomfortable there for a moment.
I was on both committees that the noble Baroness, Lady Stowell, referred to. We took evidence, and it was clear from all sorts of stakeholders that they would like to see more parliamentary engagement with the new powers we are giving to regulators. They are very broad and sometimes novel powers. However, the point I want to make at this moment is about the sheer volume of what is coming out of regulators. I spent a great deal of my Christmas holiday reading the 1,500 pages of consultation material on illegal harms for the Online Safety Act, and that was only one of three open consultations. We need to understand that we cannot have sufficient oversight unless someone is properly given that job. I challenge the department and the Secretary of State to maintain that level of oversight of, and interest in, legislation that has already been passed. So the points that the noble Baroness made about resource and capacity are essential.
My other, very particular, point is on the DRCF. I went to a meeting—it was a private meeting, so I do not want to say too much, but fundamentally people were getting together and those attending were very happy with their arrangements. They were going to publish all sorts of things to let the world know how they, in their combination, saw various matters. I asked, “Is there an inbox?” They looked a little quizzical and said, “What do you mean?” I said, “Well, are you taking information in, as a group, as well as giving it out?” The answer was no, of course, because it is not a described space or something that has rules but is a collection of very right-minded people. But here in Committee, we make the point that we need good processes, not good people. So I passionately support this group of amendments.
I briefly turn to the amendment tabled by the noble Lord, Lord Fox, in which I have an unexpected interest, in that I work with the IEEE, America’s largest standards organisation, and with CEN-CENELEC, which does standards for the European Union. I also have a seat on the Broadband Commission, which is the ITU’s institute that looks after the SDGs. Creating standards is, as a representative of Google once said to me, soft power. It is truly global; as we create and move towards standards, there are often people in their pyjamas on the other side of the world contributing, because people literally work in all time zones to the same effect. It is a truly consensual, global and important way forward. Everyone who has used the wifi today has used an IEEE standard.
Just a short while ago, I decided that there was so much to say that I would say very little indeed. I completely agree with everything that the noble Baroness, Lady Stowell, said. As politicians, we should all be worried about a serious and growing problem: we are handing over huge powers to regulators on a monthly basis, and they will appear to the public to be accountable to nobody. If there is one book that is worth a good read, it is Unelected Power by Paul Tucker, who addresses exactly this set of issues with respect to finance and central banking. Come to think of it, it is a rather fat book, so, although I have read a large part of it myself, I suggest that the introduction and the conclusion will give noble Lords a good feel for it.
I will briefly join up a number of the debates we have just heard. On the one hand, we have been saying to ourselves, “We’ve got to empower David because David’s up against Goliath”; on the other hand, it was said a moment ago that we have these huge, overmighty regulators that must be held to account. There is an answer to that apparent clash of thoughts, which is that, while regulators have the capacity to wield huge power, many of them retreat into a comfort zone in which they do not do all the things they should. Rather, they do what they feel they can do relatively straightforwardly. Specifically, they do not wield the huge soft power they often have available to them.
Since I am not going to give a long speech, I will digress momentarily to illustrate that point. When Covid struck, I was the chairman of the CMA. The hand sanitiser market started to be cornered at great speed by a small number of players, who then jacked up the price, so that Mrs Wiggins, who wanted to go down to the corner shop to buy some at the only moment she dared go out, found that, instead of paying the correct price, which was probably £1.80, she was going to pay £12 or £9, or something like that. I argued vigorously that we should do something about this, using consumer protection powers. I was told, “We don’t have a chance. We’ll be ignored. In any case, we might well lose the case. It’s all very complicated in terms of whether we have the power to intervene in a case like this. We certainly can’t assemble the evidence in time”, and so on. After a fortnight of persistence—I am pleased to say that the current head of the CMA was on the right side of this argument—I persuaded the top of the CMA to send out a warning letter. The practice ended immediately; that is why that big issue on the public agenda, which was leading newspaper coverage for several days, was taken away and a major problem for the Government was removed. Soft power is available to regulators in many ways, but they often fail to use it.
The case for better scrutiny of regulators, digital or otherwise, has something to do with the need to hold regulators to account for the way in which they wield—or fail to wield—their power. That case has been made extensively elsewhere. In fact, I have written it down in places and published it, so I will not rehearse any of those arguments now.
I want to touch on a few further points. If we are to do this job meaningfully, we need to have in place a number of things that, for example, the banking commission—I chaired it some time ago—found essential when assembling a technically competent team at pace to deal with the Libor scandal. First, a new body must have significantly greater resources and expertise than we currently provide to Select Committees. That will cost money. It is worth pointing out that the total cost of the work of the top eight regulators, which are meant to scrutinise the businesses on which they keep an eye, is in excess of £2 billion at the moment; that is the bill just to pay for the regulators. A few million pounds spent by Parliament to improve its oversight of those who are meant to be doing that scrutiny work would be money well spent.
The second thing that we must develop in Parliament is institutional memory, which is largely missing at the moment. There is very little institutional memory in our scrutiny bodies. It requires a group of officials who will stay the course for a significant time and are certainly not dispersed every time there is an election, which is what happens to a large number of Select Committee teams in both the Lords and the Commons, including the clerks and deputy clerks.
The third thing that we must do, which may seem obvious but is not always done—indeed, it is often not done—is keep good records. The body must have high-quality record-keeping. It has been a major bugbear of mine that, on the whole, records are not kept by Select Committees across Parliaments—that is, after an election, they start again as if everything is fresh. Incidentally, one of the reasons why the Treasury Committee has done better than other Select Committees in scrutinising across Parliaments is that it has one specialist adviser—I will not embarrass him by naming him—who works on monetary policy and the Bank of England and has been there for about 15 years. He loves his job and does only that job. He used to work in the Bank of England and knows a huge amount about it. That tiny fragment of institutional memory has dramatically improved the performance of the Treasury Committee over the years and does so today.