Artificial Intelligence (Regulation) Bill [HL]

Baroness Kidron (CB)

My Lords, I too congratulate the noble Lord, Lord Holmes, on his wonderful speech. I declare my interests as an adviser to the Oxford Institute for Ethics in AI and the UN Secretary-General’s AI Advisory Body.

When I read the Bill, I asked myself three questions. Do we need an AI regulation Bill? Is this the Bill we need? What happens if we do not have a Bill? It is arguable that it would be better to deal with AI sector by sector—in education, the delivery of public services, defence, media, justice and so on—but that would require an enormous legislative push. Like others, I note that we are in the middle of a legislative push, with digital markets legislation, media legislation, data protection legislation and online harms legislation, all of which resolutely ignore both existing and future risk.

The taxpayer has been asked to make a £100 million investment in launching the world’s first AI safety institute, but as the Ada Lovelace Institute says:

“We are concerned that the Government’s approach to AI regulation is ‘all eyes, no hands’”,


with plenty of “horizon scanning” but no

“powers and resources to prevent those risks or even to react to them effectively after the fact”.

So yes, we need an AI regulation Bill.

Is this the Bill we need? Perhaps I should say to the House that I am a fan of the Bill. It covers testing and sandboxes, it considers what the public want, and it deals with a very important specific issue that I have raised a number of times in the House, in the form of creating AI-responsible officers. On that point, the CEO of the International Association of Privacy Professionals came to see me recently and made an enormously compelling case that, globally, we need hundreds of thousands of AI professionals, as the systems become smarter and more ubiquitous, and that those professionals will need standards and norms within which to work. He also made the case that the UK would be very well-placed to create those professionals at scale.

I have a couple of additions. Unless the Minister is going to make a surprise announcement, I think we are allowed to consider that he is going to take the Bill on in full. In addition, under Clause 2, which sets out regulatory principles, I would like to see consideration of children’s rights and development needs; employment rights, concerning both management by AI and job displacement; a public interest case; and more clarity that material that is an offence—such as creating viruses, CSAM or inciting violence—is also an offence, whether created by AI or not, with specific responsibilities that accrue to users, developers and distributors.

The Stanford Internet Observatory recently identified hundreds of known images of child sexual abuse material in an open dataset used to train popular AI text-to-image models, saying:

“It is challenging to clean or stop the distribution of publicly distributed datasets as it has been widely disseminated. Future datasets could use freely available detection tools to prevent the collection of known CSAM”.


The report illustrates that it would have been entirely possible to remove such images, but that those who compiled the dataset did not do so, and now those images are proliferating at scale.

We need rules under which AI is developed. It is poised to transform healthcare, both diagnosis and treatment. It will take the weight out of some of the public services we can no longer afford, and it will release money to make life better for many. However, it brings with it a range of dangers, from fake images to lethal autonomous weapons and deliberate pandemics. AI is not a case of good or bad; it is a question of uses and abuses.

I recently hosted Geoffrey Hinton, whom many will know as the “godfather of AI”. His address to parliamentarians was as chilling as it was compelling, and he put timescales on the outcomes that leave no time to wait. I will not stray into his points about the nature of human intelligence, but he was utterly clear that the concentration of power, the asymmetry of benefit and the control over resources—energy, water and hardware—needed to run these powerful systems would be, if left until later, in so few hands that they, and not we, would be doing the rule setting.

My final question is: if we have no AI Bill, can the Government please consider putting the content of the AI regulation Bill into the data Bill currently passing through Parliament and deal with it in that way?

Moved by
2: Clause 1, page 2, line 16, leave out “and (3)” and insert “, (3) and (3A)”
Member's explanatory statement
This amendment, and another to Clause 1 in the name of Baroness Kidron, would ensure that controllers have a duty to identify when a user is or may be a child to give them the data protection codified by the Data Protection Act 2018.
Baroness Kidron (CB)

My Lords, I speak to Amendments 2, 3, 9 and 290 in my name. I thank the noble Baronesses, Lady Jones and Lady Harding, and the noble Lord, Lord Clement-Jones, for their support.

This group seeks to secure the principle that children should enjoy the same protections in UK law after this Bill passes into law as they do now. In 2018, this House played a critical role in codifying the principle that children merit special, specific protection in relation to data privacy by introducing the age-appropriate design code into the DPA. Its introduction created a wave of design changes to tech products: Google introduced safe search as its default; Instagram made it harder for adults to contact children via private messaging; Play Store stopped making adult apps available to under-18s; TikTok stopped sending notifications through the night; and hundreds of thousands of underage children were denied access to age-inappropriate services. These are just a handful of the hundreds of changes that have been made, many of them rolled out globally. The AADC served as a blueprint for children’s data privacy, and its provisions have been mirrored around the globe. Many noble Lords will have noticed that, only two weeks ago, Australia announced that it will follow the many others who have incorporated, or are currently incorporating, it into their domestic legislation, saying in the press release that it would align as closely as possible with the UK’s AADC.

As constructed in the Data Protection Act 2018, the AADC sets out the requirements of the UK GDPR as they relate to children. The code is indirectly enforceable; that is to say that the action the ICO can take against those failing to comply is based on the underlying provisions of UK GDPR, which means that any watering down, softening of provisions, unstable definitions—my new favourite—or legal uncertainty created by the Bill automatically waters down, softens and creates legal uncertainty and unstable definitions for children and therefore for child protection. I use the phrase “child protection” deliberately because the most important contribution that the AADC has made at the global level was the understanding that online privacy and safety are interwoven.

Clause 1(2) creates an obligation on the controller or processor to know, or reasonably to know, that an individual is an identifiable living individual. Amendments 2 and 3 would add a further requirement to consider whether that living individual is a child. This would ensure that providers cannot wilfully ignore the presence of children, something that tech companies have a long track record of doing. I want to quote the UK Information Commissioner, who fined TikTok £12.7 million for failing to prevent under-13s accessing that service; he said:

“There are laws in place to make sure our children are as safe in the digital world as they are in the physical world. TikTok did not abide by those laws … TikTok should have known better. TikTok should have done better … They did not do enough to check who was using their platform”.


I underline very clearly that these amendments would not introduce any requirement for age assurance. The ICO’s guidance on age assurance in the AADC and the provisions in the Online Safety Act already detail those requirements. The amendments simply confirm the need to offer a child a high bar of data privacy or, if you do not know which of your users are children, offer all users that same high bar of data privacy.

As we have just heard, it is His Majesty’s Government’s stated position that nothing in the Bill lessens children’s data privacy because nothing in the Bill lessens UK GDPR, and that the Bill is merely an exercise to reduce unnecessary bureaucracy. The noble Lords who spoke on the first group have perhaps put paid to that and I imagine that this position will be sorely tested during Committee. In the light of the alternative view that the protections afforded to children’s personal data will decline as a result of the Bill, Amendment 9 proposes that the status of children’s personal data be elevated to that of “sensitive personal data”, or special category data. The threshold for processing special category data is higher than for general personal data and the specific conditions include, for example, processing with the express consent of the data subject, processing to pursue a vital interest, processing by not-for-profits or processing for legal claims or matters of substantial public interest. Bringing children’s personal data within that definition would elevate the protections by creating an additional threshold for processing.

Finally, Amendment 290 enshrines the principle that nothing in the Bill should lead to a diminution in existing levels of privacy protections that children currently enjoy. It is essentially a codification of the commitment made by the Minister in the other place:

“The Bill maintains the high standards of data protection that our citizens expect and organisations will still have to abide by our age-appropriate design code”.—[Official Report, Commons, 17/4/23; col. 101.]


Before I sit down, I just want to highlight the Harvard Gazette, which looked at ad revenue from the perspective of children. On Instagram, children account for 16% of ad revenue; on YouTube, 27%; on TikTok, 35%; and on Snap, an extraordinary 41.4%. Collectively, YouTube, Instagram and Facebook made nearly $2 billion from children aged nought to 12, and it will not escape many noble Lords that children aged nought to 12 are not supposed to be on those platforms. Instagram, YouTube and TikTok together made more than $7 billion from 13 to 17 year-olds. The amendments in this group give a modicum of protection to a demographic who have no electoral capital, who are not developmentally adult and whose lack of care is not an unfortunate by-product of the business model, but who have their data routinely extracted, sold, shared and scraped as a significant part of the ad market. It is this that determines the features that deliberately spread, polarise and keep children compulsively online, and it is this that the AADC—born in your Lordships’ House—started a global movement to contain.

This House came together on an extraordinary cross-party basis to ensure that the Online Safety Bill delivered for children, so I say to the Minister: I am not wedded to my drafting, nor to the approach that I have taken to maintain, clause by clause, the bar for children, even when that bar is changed for adults, but I am wedded to holding the tech sector accountable for children’s privacy, safety and well-being. It is my hope and—if I dare—expectation that noble Lords will join me in making sure that the DPDI Bill does not leave this House with a single diminution of data protection for children. To do so is, in effect, to give with one hand and take away with the other.

I hope that during Committee the Minister will come to accept that children’s privacy will be undermined by the Bill, and that he will work with me and others to resolve these issues so that the UK maintains its place as a global leader in children’s privacy and safety. I beg to move.

--- Later in debate ---
Viscount Camrose (Con)

Okay. The Government feel that, in terms of the efficient and effective drafting of the Bill, that paragraph diminishes clarity by being duplicative, rather than adding to it by making a declaration. For the same reason, we have chosen not to make a series of declarations about other intentions of the Bill overall, in the belief that the Bill’s intent and outcome are protected without such a statement.

Baroness Kidron (CB)

My Lords, before our break, the noble Baroness, Lady Harding, said that this is hard-fought ground; I hope the Minister understands from the number of questions he has just received during his response that it will continue to be hard-fought ground.

I really regret having to say this at such an early stage on the Bill, but I think that some of what the Minister said was quite disingenuous. We will get to it in other parts of the Bill, but the thing that we have all agreed to disagree on at this point is the statement that the Bill maintains data privacy for everyone in the UK. That is a point of contention between noble Lords and the Minister. I absolutely accept and understand that we will come to a collective view on it in Committee. However, the Minister appeared to suggest—I ask him to correct me if I have got this wrong—that the changes on legitimate interest and purpose limitation are child safety measures because some people are saying that they are deterred from sharing data for child protection reasons. I have to tell him that they are not couched or formed like that; they are general-purpose shifts. There is absolutely no question but that the Government could have made specific changes for child protection, put them in the Bill and made them absolutely clear. I find that very worrying.

I also find it worrying, I am afraid—this is perhaps where we are heading and the thing that many organisations are worried about—that bundling the AADC in with the Online Safety Act and saying, “I’ve got it over here so you don’t need it over there”, is not the same as maintaining a high level of data protection for children. It is not the same set of things. I specifically said that this was not an age-verification measure and would not require it; whatever response there was on that point was therefore unnecessary, because I made that quite clear in my remarks. The Committee can understand that, in order to set a high bar of data protection, you must either identify a child or give it to everyone. Those are your choices. You do not have to verify.

I will withdraw the amendment, but I must say that the Government may not have it both ways. The Bill cannot be different or necessary and at the same time do nothing. The piece that I want to leave with the Committee is that it is the underlying provisions that allow the ICO to take action on the age-appropriate design code. It does not matter what is in the code; if the underlying provisions change, so does the code. During Committee, I expect that there will be a report on the changes that have happened all around the world as a result of the code, and we will be able to measure whether the new Bill would be able to create those same changes. With that, I beg leave to withdraw my amendment.

Amendment 2 withdrawn.
--- Later in debate ---
I know that this is a bit like the prosecution: the Minister will protest his innocence throughout the passage of the Bill, with “Not me, guv” or something to that effect. I look forward to his reply, but I think we will really have to dig under the surface as we go through. I very much hope that the Minister can clarify whether this is new. I certainly believe that the addition of commercial purposes is potentially extremely dangerous, needs to be qualified and is novel. I beg to move.
Baroness Kidron (CB)

My Lords, I speak to Amendments 8, 21, 23 and 145 in my name and thank the other noble Lords who have added their names to them. In the interests of brevity, and as the noble Lord, Lord Clement-Jones, has done some of the heavy lifting on this, I will talk first to Amendment 8.

The definition of scientific research has been expanded to include commercial and non-commercial activity, so far as it

“can reasonably be described as scientific”,

but “scientific” is not defined. As the noble Lord said, there is no public interest requirement, so a commercial company can, in reality, develop almost any kind of product on the basis that it may have a scientific purpose, even—or maybe especially—if it measures your propensity to impulse buy or other commercial behaviours. The spectrum of scientific inquiry is almost infinite. Amendment 8 would exclude children simply by adding proposed new paragraph (e), which says that

“the data subject is not a child or could or should be known to be a child”,

so that their personal data cannot be used for scientific research purposes to which they have not given their consent.

I want to be clear that I am pro-research and understand the critical role that data plays in enabling us to understand societal challenges and innovate towards solutions. Indeed, I have signed the amendment in the name of the noble Lord, Lord Bethell, which would guarantee access to data for academic researchers working on matters of public interest. Some noble Lords may have been here last night, when the US Surgeon General, Vice Admiral Dr Murthy, who gave the Lord Speaker’s lecture, made a fierce argument in favour of independent public interest research, not knowing that such a proposal had been laid. I hope that, when we come to group 17, the Government heed his wise words.

In the meantime, Clause 3 simply embeds and extends the inequality of arms between academics and corporates, making it much easier for commercial companies to use personal data for research while academics continue to be held to much higher ethical and professional standards: they continue to require express consent, DBS checks and complex ethical approvals. For an academic, simply using personal data for research without those safeguards would be unethical, yet commercial players can rely on Clause 3 to process data without consent, in pursuit of profit. Like the noble Lord, Lord Clement-Jones, I would prefer an overall solution to this but, in its absence, this amendment would protect data from being commoditised in this way.

Amendments 21 and 23 would specifically protect children from changes to Clause 6. I have spoken on this a little already, but I would like it on the record that I am absolutely in favour of a safeguarding exemption. The additional purposes, which are compatible with but go beyond the original purpose, are not a safeguarding measure. Amendment 21 would amend the list of factors that a data controller must take into account to include the fact that children are entitled to a higher standard of protection.

Amendment 23 would not be necessary if Amendment 22 were agreed. It would commit the Secretary of State to ensuring that, when exercising their power under new Article 8A, as inserted by Clause 6(5), to add, vary or omit provisions of Annex 2, they take the 2018 Act and children’s data protection into account.

Finally, Amendment 145 proposes a code of practice on the use of children’s data in scientific research. This code would, in contrast, ensure that all researchers, commercial or in the public interest, are held to the same high standards by developing detailed guidance on the use of children’s data for research purposes. A burning question for researchers is how to properly research children’s experience, particularly regarding the harms defined by the Online Safety Act.

Proposed new subsection (1) sets out the broad headings that the ICO must cover to promote good practice. Proposed new subsection (2) confirms that the ICO must have regard to children’s rights under the UNCRC, and that they are entitled to a higher standard of protection. It would also ensure that the ICO consulted with academics, those who represent the interests of children and data scientists. There is something of a theme here: if the changes to UK GDPR did not diminish data subjects’ privacy and rights, there would be no need for amendments in this group. If there were a code for independent public research, as is so sorely needed, the substance of Amendment 145 could usefully form a part. If commercial companies can extend scientific research that has no definition, and if the Bill expands the right to further processing and the Secretary of State can unilaterally change the basis for onward processing, can the Minister explain, when he responds, how he can claim that the Bill maintains protections for children?

Baroness Harding of Winscombe (Con)

My Lords, I will be brief because I associate myself with everything that the noble Baroness, Lady Kidron, just said. This is where the rubber hits the road from our previous group. If we all believe that it is important to maintain children’s protection, I hope that my noble friend the Minister will be able to accept if not the exact wording of the children-specific amendments in this group then the direction of travel—and I hope that he will commit to coming back and working with us to make sure that we can get wording into the Bill.

I am hugely in favour of research in the private sector as well as in universities and the public sector; we should not close our minds to that at all. We need to be realistic that all the meaningful research in AI is currently happening in the private sector, so I do not want to close that door at all, but I am extremely uncomfortable with a Secretary of State having the ability to amend access to personal data for children in this context. It is entirely sensible to have a defined code of conduct for the use of children’s data in research. We have real evidence that a code of conduct setting out how to protect children’s rights and data in this space works, so I do not understand why it would not be a good idea here: we want the research to happen, but we want children’s rights to be protected at a much higher level.

It seems to me that this group is self-evidently sensible, in particular Amendments 8, 22, 23 and 145. I put my name to all of them except Amendment 22 but, the more I look at the Bill, the more uncomfortable I get with it; I wish I had put my name to Amendment 22. We have discussed Secretary of State powers in each of the digital Bills that we have looked at and we know about the power that big tech has to lobby. It is not fair on Secretaries of State in future to have this ability to amend—it is extremely dangerous. I express my support for Amendment 22.

--- Later in debate ---
Viscount Camrose (Con)

Researchers must also comply with the required safeguards to protect individuals’ privacy. All organisations conducting scientific research, including those with commercial interests, must also meet all the safeguards for research laid out in the UK GDPR and comply with the legislation’s core principles, such as fairness and transparency. Clause 26 sets out several safeguards that research organisations must comply with when processing personal data for research purposes. The ICO will update its non-statutory guidance to reflect many of the changes introduced by this Bill.

Scientific research currently holds a privileged place in the data protection framework because, by its nature, it is already viewed as generally being in the public interest. As has been observed, the Bill already applies a public interest test to processing for the purpose of public health studies in order to provide greater assurance for research that is particularly sensitive. Again, this reflects recital 159.

In response to the noble Baroness, Lady Jones, on why public health research is being singled out, as she stated, this part of the legislation just adds an additional safeguard to studies into public health ensuring that they must be in the public interest. This does not limit the scope for other research unrelated to public health. Studies in the area of public health will usually be in the public interest. For the rare, exceptional times that a study is not, this requirement provides an additional safeguard to help prevent misuse of the various exemptions and privileges for researchers in the UK GDPR. “Public interest” is not defined in the legislation, so the controller needs to make a case-by-case assessment based on its purposes.

On the point made by the noble Lord, Lord Clement-Jones, about recitals and ICO guidance: although we of course respect and welcome ICO guidance, it does not have legislative effect and does not provide the certainty that legislation does. That is why we have put these provisions on a statutory footing in this Bill.

Amendment 7 to Clause 3 would undermine the broader consent concept for scientific research. Clause 3 places the existing concept of “broad consent” currently found in recital 33 to the UK GDPR on a statutory footing with the intention of improving awareness and confidence for researchers. This clause applies only to scientific research processing that is reliant on consent. It already contains various safeguards. For example, broad consent can be used only where it is not possible to identify at the outset the full purposes for which personal data might be processed. Additionally, to give individuals greater agency, where possible individuals will have the option to consent to only part of the processing and can withdraw their consent at any time.

Clause 3 clarifies an existing concept of broad consent which outlines how the conditions for consent will be met in certain circumstances when processing for scientific research purposes. This will enable consent to be obtained for an area of scientific research when researchers cannot at the outset identify fully the purposes for which they are collecting the data. For example, the initial aim may be the study of cancer, but it later becomes the study of a particular cancer type.

Furthermore, as part of the reforms around the reuse of personal data, we have further clarified that when personal data is originally collected on the basis of consent, a controller would need to get fresh consent to reuse that data for a new purpose unless a public interest exemption applied and it is unreasonable to expect the controller to obtain that consent. A controller cannot generally reuse personal data originally collected on the basis of consent for research purposes.

Turning to Amendments 132 and 133 to Clause 26, the general rule described in Article 13(3) of the UK GDPR is that controllers must inform data subjects about a change of purposes, which provides an opportunity to withdraw consent or object to the proposed processing where relevant. There are existing exceptions to the right to object, such as Article 21(6) of the UK GDPR, where processing is necessary for research in the public interest, and in Schedule 2 to the Data Protection Act 2018, when applying the right would prevent or seriously impair the research. Removing these exemptions could undermine life-saving research and compromise long-term studies so that they are not able to continue.

Regarding Amendment 134, new Article 84B of the UK GDPR already sets out the requirement that personal data should be anonymised for research, archiving and statistical—RAS—purposes unless doing so would mean the research could not be carried through. Anonymisation is not always possible as personal data can be at the heart of valuable research, archiving and statistical activities, for example, in genetic research for the monitoring of new treatments of diseases. That is why new Article 84C of the UK GDPR also sets out protective measures for personal data that is used for RAS purposes, such as ensuring respect for the principle of data minimisation through pseudonymisation.

The stand part notice in this group seeks to remove Clause 6 and, consequentially, Schedule 2. In the Government’s consultation on data reform, Data: A New Direction, we heard that the current provisions in the UK GDPR on personal data reuse are difficult for controllers and individuals to navigate. This has led to uncertainty about when controllers can reuse personal data, causing delays for researchers and obstructing innovation. Clause 6 and Schedule 2 address the existing uncertainty around reusing personal data by setting out clearly the conditions in which the reuse of personal data for a new purpose is permitted. Clause 6 and Schedule 2 must therefore remain to give controllers legal certainty and individuals greater transparency.

Amendment 22 seeks to remove the power to add to or vary the conditions set out in Schedule 2. These conditions currently constitute a list of specific public interest purposes, such as safeguarding vulnerable individuals, for which an organisation is permitted to reuse data without needing consent or to identify a specific law elsewhere in legislation. Since this list is strictly limited and exhaustive, a power is needed to ensure that it is kept up to date with future developments in how personal data is used for important public interest purposes.

Baroness Kidron (CB)

I am interested that the safeguarding requirement is already in the Bill, so, in terms of children, which I believe the Minister is going to come to, the onward processing is not a question of safeguarding. Is that correct? As the Minister has just indicated, that is already a provision.

--- Later in debate ---
Viscount Camrose (Con)

Just before we broke, I was on the verge of attempting to answer the question from the noble Baroness, Lady Kidron; I hope my coming words will do that, but she can intervene again if she needs to.

I turn to the amendments that concern the use of children’s data in research and reuse. Amendment 8 would also amend Clause 3; the noble Baroness suggests that the measure should not apply to children’s data, but this would potentially prevent children, or their parents or guardians, from agreeing to participate in broad areas of pioneering research that could have a positive impact on children, such as on the causes of childhood diseases.

On the point about safeguarding, the provisions on recognised legitimate interests and further processing are required for safeguarding children for compliance with, respectively, the lawfulness and purpose limitation principles. The purpose limitation provision in this clause is meant for situations where the original processing purpose was not safeguarding and the controller then realises that there is a need to further process it for safeguarding.

Research organisations are already required to comply with the data protection principles, including on fairness and transparency, so that research participants can make informed decisions about how their data is used; and, where consent is the lawful basis for processing, children, or their parents or guardians, are free to choose not to provide their consent, or, if they do consent, they can withdraw it at any time. In addition, the further safeguards that are set out in Clause 26, which I mentioned earlier, will protect all personal data, whether it relates to children or adults.

Amendment 21 would require data controllers to have specific regard to the fact that children’s data requires a higher standard of protection for children when deciding whether reuse of their data is compatible with the original purpose for which it was collected. This is unnecessary because the situations in which personal data could be reused are limited to public interest purposes designed largely to protect the public and children, in so far as they are relevant to them. Controllers must also consider the possible consequences for data subjects and the relationship between the controller and the data subject. This includes taking into account that the data subject is a child, in addition to the need to generally consider the interests of children.

Amendment 23 seeks to limit use of the purpose limitation exemptions in Schedule 2 in relation to children’s data. This amendment is unnecessary because these provisions permit further processing only in a narrow range of circumstances and can be expanded only to serve important purposes of public interest. Furthermore, it may inadvertently be harmful to children. Current objectives include safeguarding children or vulnerable people, preventing crime or responding to emergencies. In seeking to limit the use of these provisions, there is a risk that the noble Baroness’s amendments might make data controllers more hesitant to reuse or disclose data for public interest purposes and undermine provisions in place to protect children. These amendments could also obstruct important research that could have a demonstrable positive impact on children, such as research into children’s diseases.

Amendment 145 would require the ICO to publish a statutory code on the use of children’s data in scientific research and technology development. Although the Government recognise the value that ICO codes can play in promoting good practice and improving compliance, we do not consider that it would be appropriate to add these provisions to the Bill without further detailed consultation with the ICO and the organisations likely to be affected by the new codes. Clause 33 of the Bill already includes a measure that would allow the Secretary of State to request the ICO to publish a code on any matter that it sees fit, so this is an issue that we could return to in the future if the evidence supports it.

Baroness Kidron (CB)

I will read Hansard very carefully, because I am not sure that I absolutely followed the Minister, but we will undoubtedly come back to this. I will ask two questions. Earlier, before we had a break, in response to some of the early amendments in the name of the noble Lord, Lord Clement-Jones, the Minister suggested that several things were being taken out of the recital to give them solidity in the Bill; so I am using this opportunity to suggest that recital 38, which is the special consideration of children’s data, might usefully be treated in a similar way and that we could then have a schedule that is the age-appropriate design code in the Bill. Perhaps I can leave that with the Minister, and perhaps he can undertake to have some further consultation with the ICO on Amendment 145 specifically.

Viscount Camrose (Con)

With respect to recital 38, that sounds like a really interesting idea. Yes, let us both have a look and see what the consultation involves and what the timing might look like. I confess to the Committee that I do not know what recital 38 says, off the top of my head. For the reasons I have set out, I am not able to accept these amendments. I hope that noble Lords will therefore not press them.

Returning to the questions by the noble Lord, Lord Clement-Jones, on the contents of recital 159, the current UK GDPR and EU GDPR are silent on the specific definition of scientific research. That silence does not preclude commercial organisations from performing scientific research; indeed, the ICO’s own guidance on research and its interpretation of recital 159 already mention commercial activities. Scientific research can be done by commercial organisations—for example, much of the research done into vaccines, and the research into AI referenced by the noble Baroness, Lady Harding. The recital itself does not mention commercial research but, as the ICO’s guidance is clear on this already, the Government feel that it is appropriate to put this on a statutory footing.

--- Later in debate ---
Moved by
10: After Clause 4, insert the following new Clause—
““Data community”
In this Act, a “data community” means an entity established to facilitate the collective activation of data subjects’ data rights in Chapters III and VIII of the UK GDPR and members of a data community assign specific data rights to a nominated entity to exercise those rights on their behalf.”
Member’s explanatory statement
This amendment provides a definition of “data community”. It is one of a series of amendments that would establish the ability to assign data rights to a third party.
Baroness Kidron (CB)

My Lords, I hope this is another lightbulb moment, as the noble Lord, Lord Clement-Jones, suggested. As well as Amendment 10, I will speak to Amendments 35, 147 and 148 in my name and the names of the noble Baroness, Lady Jones, and the noble Lord, Lord Clement-Jones. I thank them both. The purpose of these amendments is to move the Bill away from nibbling around the edges of GDPR in pursuit of post-Brexit opportunities and to actually deliver a post-Brexit opportunity.

These amendments would put the UK on an enhanced path of data sophistication while not challenging equivalence, which we will undoubtedly discuss during the Committee. I echo the voice of the noble Lord, Lord Allan, who at Second Reading expressed deep concern that equivalence was not a question of an arrangement between the Government and the EU but would be a question picked up by data activists taking strategic litigation to the courts.

Data protection as conceived by GDPR and in this Bill is primarily seen as an arrangement between an individual and an entity that processes that data—most often a commercial company. But, as evidenced by the last 20 years, the real power lies in holding either vast swathes of general data, such as those used by LLMs, or large groups of specialist data such as medical scans. In short, the value—in all forms, not simply financial—lies in big data.

As the value of data became clear, ideas such as “data is the new oil” and data as currency emerged, alongside the notion of data fiduciaries or data trusts, where you can place your data collectively. One early proponent of such ideas was Jaron Lanier, inventor of virtual reality; I remember discussing it with him more than a decade ago. However, these ideas have not found widespread practical application, possibly because they are normally based around ideas of micropayments as the primary value—and very probably because they rely on data subjects gathering their data, so they are for the boffins.

During the passage of the DPA 2018, one noble Lord counted the number of times the Minister said the words “complex” and “complicated” while referring to the Bill. Data law is complex, and the complicated waterfall of its concepts and provisions eludes most non-experts. That is why I propose the four amendments in this group, which would give UK citizens access to data experts for matters that concern them deeply.

Amendment 10 would define the term “data community”, and Amendment 35 would give a data subject the power to assign their data rights to a data community for specific purposes and for a specific time period. Amendment 147 would require the ICO to set out a code of conduct for data communities, including guidance on establishing, operating and joining a data community, as well as guidance for data controllers and data processors on responding to requests made by data communities. Amendment 148 would require the ICO to keep a register of data communities, to make it publicly available and to ensure proper oversight. Together, they would provide a mechanism for non-experts—that is, any UK citizen—to assign their data rights to a community run by representatives that would benefit the entire group.

Data communities diverge from previous attempts to create big data for the benefit of users, in that they are not predicated on financial payments and neither does each data subject need to access their own data via the complex rules and often obstructive interactions with individual companies. They put rights holders together with experts who do it on their behalf, by allowing data subjects to assign their rights so that an expert can gather the data and crunch it.

This concept is based on a piece of work done by a colleague of mine at the University of Oxford, Dr Reuben Binns, an associate professor in human-centred computing, in association with the Worker Info Exchange. Since 2016, individual Uber drivers, with help from their trade unions and the WIE, have asked Uber for the data showing their jobs, earnings, movements, waiting times and so on. It took many months of negotiation, conducted via data protection lawyers, as each driver individually asked for successive pieces of information that Uber at first resisted giving them and then, after litigation, provided.

After a period of time, a new cohort of drivers was recruited, and it was only when several hundred drivers were poised to ask the same set of questions that a formal arrangement was made between Uber and WIE, so that they could be treated as a single group and all the data would be provided about all the drivers. This practical decision allowed Dr Binns to look at the data en masse. While an individual driver knew what they earned and where they were, what became visible when looking across several hundred drivers is how the algorithm reacted to those who refused a poorly paid job, who was assigned the lucrative airport runs, whether where you started impacted on your daily earnings, whether those who worked short hours were given less lucrative jobs, and so on.

This research project continues after several years and benefits from a bespoke arrangement that could, by means of these amendments, be strengthened and made an industry-wide standard with the involvement of the ICO. If it were routine, it would provide opportunity equally for challenger businesses, community groups and research projects. Imagine if a group of elderly people who spend a lot of time at home were able to use a data community to negotiate cheap group insurance, or imagine a research project where I might assign my data rights for the sole purpose of looking at gender inequality. A data community would allow any group of people to assign their rights, rights that are more powerful together than apart. This is doable—I have explained how it has been done. With these amendments, it would be routinely available, contractual, time-limited and subject to a code of conduct.

As it stands, the Bill is regressive for personal data rights and does not deliver the promised Brexit dividends. But there are great possibilities, without threatening adequacy, that could open markets, support innovation in the UK and make data more available to groups in society that rarely benefit from data law. I beg to move.

Lord Clement-Jones (LD)

My Lords, I think this is a lightbulb moment—it is inspired, and this suite of amendments fits together really well. I entirely agree with the noble Baroness, Lady Kidron, that this is a positive aspect. If the Bill contained these four amendments, I might have to alter my opinion of it—how about that for an incentive?

This is an important subject. It is a positive aspect of data rights. We have not got this right yet in this country. We still have great suspicion about sharing and access to personal data. There is almost a conspiracy theory around the use of data, the use of external contractors in the health service and so on, which is extremely unhelpful. If individuals were able to share their data with a trusted hub—a trusted community—that would make all the difference.

Like the noble Baroness, Lady Kidron, I have come across a number of influences over the years. I think the first time many of us came across the idea of data trusts or data institutions was in the Hall-Pesenti review carried out by Dame Wendy Hall and Jérôme Pesenti in 2017. They made a strong recommendation to the Government that they should start thinking about how to operationalise data trusts. Subsequently, organisations such as the Open Data Institute did some valuable research into how data trusts and data institutions could be used in a variety of ways, including in local government. Then the Ada Lovelace Institute did some very good work on the possible legal basis for data trusts and data institutions. Professor Irene Ng was heavily engaged in setting up what was called the “hub of all things”. I was not quite convinced by how it was going to work legally in terms of data sharing and so on, but in a sense we have now got to that point. I give all credit to the academic whom the noble Baroness mentioned. If he has helped us to get to this point, that is helpful. It is not that complicated, but we need full government backing for the ICO and the instruments that the noble Baroness put in her amendments, including regulatory oversight, because it will not be enough simply to have codes that apply. We have to have regulatory oversight.

--- Later in debate ---
Baroness Kidron (CB)

My Lords, I thank the co-signatories of my amendments for their enthusiasm. I will make three very quick points. First, the certain rights that the Minister referred to are complaints after the event when something has gone wrong, not positive rights. The second point of contention I have is whether these are so far-reaching. We are talking about people’s existing rights, and these amendments do not introduce any other right apart from access to put them together. It is very worrying that the Government would see these as a threat when data subjects put together their rights but not when commercial companies put together their data.

Finally, what is the Bill for? If it is not for creating a new and vibrant data protection system for the UK, I am concerned that it undermines a lot of existing rights and will not allow for a flourishing of uses of data. This is the new world: the world of data and AI. We have to have something to offer UK citizens. I would like the Minister to say that he will discuss this further, because it is not quite adequate to nay-say it. I beg leave to withdraw.

Amendment 10 withdrawn.
Baroness Kidron (CB)

My Lords, I have had a number of arguments about “proportionate” in the decade that I have been in this House. In fact, I remember that the very first time I walked into the Chamber the noble Lord, Lord Pannick, was having a serious argument with another noble Lord over a particular word. It went on for about 40 minutes and I remember thinking, “There is no place for me in this House”. Ten years later, I stand to talk about “proportionate”, which has played such a big part in my time here in the Lords.

During the passage of the DPA 2018, many of us tried to get “proportionate” into the Bill on the basis that we were trying to give comfort to people who thought data protection was in fact government surveillance of individuals. The Government said—quite rightly, as it turned out—that all regulators have to be

“proportionate, accountable, consistent, transparent, and targeted”

in the way in which they discharge their responsibilities and they pushed us back. The same thing happened on the age-appropriate design code with the ICO, and the same point was made again. As the noble Baroness, Lady Harding, just set out, we tried once more during the passage of the Online Safety Bill. Yet this morning I read this sentence in some draft consultation documents coming out of the Online Safety Act:

“Provisionally, we consider that a measure recommending that users that share CSAM”—


that is, for the uninitiated, child sexual abuse material—

“have their accounts blocked may be proportionate, given the severity of the harm. We need to do more work to develop the detail of any such measure and therefore aim to consult on it”.

This is a way in which “proportionate” has been weaponised in favour of the tech companies in one environment and it is what I am concerned about here.

As the noble Lord said, using “proportionate” introduces a gap in which uncertainty can be created, because some things are beyond question and must simply be considered, rather than considered on a proportionate basis. I finish by saying that attaching the word specifically to conduct requirements or to making pro-competitive interventions must create legal uncertainty if a regulator can pick up that word, set it against something so absolute and illegal, and then have to discuss its proportionality.

Lord Vaizey of Didcot (Con)

I wonder if I can just slip in before Members on the Front Bench speak, particularly those who have signed the amendment. I refer again to my register of interests.

I support the principle that lies behind these amendments and want to reinforce the point that I made at Second Reading and that I sort of made on the first day in Committee. Any stray word in the Bill when enacted will be used by those with the deepest pockets—that is, the platforms—to hold up action against them by the regulator. I read this morning that the CMA has resumed its inquiry into the UK cloud market after an eight-month hiatus based on a legal argument put by Apple about the nature of the investigation.

It seems to me that Clause 19(5) is there to show the parameters on which the CMA can impose an obligation to do with fair dealing and open choices, and so on. It therefore seems that “proportionate”—or indeed perhaps even “appropriate”—is unnecessary because the CMA will be subject to judicial review on common-law principles if it makes an irrational or excessive decision and it may be subject to a legal appeal if people can argue that it has not applied the remedy within the parameters set by paragraphs (a), (b) and (c) of Clause 19(5). I am particularly concerned about whether there is anything in the Bill once enacted that allows either some uncertainty, which can be latched on to, or appeals—people refer to “judicial review plus” or appeals on the full merits, which are far more time-consuming and expensive and which will tie the regulator up in knots.

--- Later in debate ---
Viscount Colville of Culross (CB)

If “indispensable” and purely “benefit” are the same, why was the change made on Report in the Commons?

Baroness Kidron (CB)

I was really interested in the introduction of the word “unknown”. The noble Lord, Lord Lansley, set out all the different stages and interactions. Does it not incentivise the companies to call back information to this very last stage, and the whole need-for-speed issue then comes into play?

Viscount Camrose (Con)

I will revert first to the questions about the word “indispensable”. As I have said, the Government consulted very widely, and one of the findings of the consultation was that, for a variety of stakeholders, the word “indispensable” reduced the clarity of the legislation.

--- Later in debate ---
Viscount Camrose (Con)

I cannot give a full account of the individual stakeholders right now; I am happy to ask the department to clarify further in that area. My contention is that the effect of the two sentences is the same, with the new one being clearer than the old one. I am very happy to continue to look at that and listen to the arguments of noble Lords, but that is the position. Personally, when I look at the two sentences, I find it very difficult to discern any difference in meaning between them. As I say, I am very happy to receive further arguments on that.

With respect to the participative arrangements by which a decision is reached around, for example, a conduct requirement, during the period of conduct requirement design, and during the decision-making period, it is, as my noble friend Lord Lansley has stated, highly to be expected that firms will make representations about the consumer benefits of their product. During a breach investigation, on the other hand, later on in the process, a consumer benefits exemption can be used as a safeguard or defence against a finding of breach.

Sorry, but there were so many questions that I have completely lost track. Perhaps the noble Baroness, Lady Kidron, will restate her question.

Baroness Kidron (CB)

I think the Minister was in the middle of answering it and saying why something might be “unknown” right at the last.

Viscount Camrose (Con)

As many noble Lords in the debate have alluded to, we have to be clear that this is a fast-moving field, and we have to at least allow for the possibility that new technologies can provide new consumer benefits and that it is acceptable to argue that a new and emerging technology that was not part of the original consideration can be considered as part of the defence against a finding of breach. The current drafting is intended to make that meaning clearer, aiming to provide greater certainty to all businesses while ensuring that consumers continue to get the best outcomes.

Amendment 41, from the noble Lord, Lord Clement-Jones, would change the current drafting of the countervailing benefits exemption in several ways that together are intended to ensure that the CMA is provided as soon as possible with information relating to an SMS firm’s intention to rely on the exemption. We agree with noble Lords who have spoken today that it is important that the exemption cannot be used to avoid or delay enforcement action. The conduct investigation will operate in parallel to the assessment of whether the exemption applies, meaning that the investigation deadline of six months is not affected by the exemption process. The regime has been designed to encourage an open dialogue between the CMA and SMS firms, helping to avoid delays, unintended consequences and surprises on all sides. Therefore, in many cases, if a firm intends to rely on the exemption, we anticipate that this will be clear to all parties from early on in the process.

Digital Markets, Competition and Consumers Bill

Baroness Kidron (CB)

My Lords, I too faced a glitch, having wanted to add my name to these amendments. Since we are at a new stage of the Bill, I declare my interests as set out in the register, particularly as an adviser to the Institute for Ethics in AI at Oxford and to the Digital Futures centre at the LSE and as chair of the 5Rights Foundation. I support the noble Lord, Lord Clement-Jones, who has, with this group of amendments, highlighted that job creation or displacement and the quality of work are all relevant considerations for the CMA. I think it is worth saying that, when we talk about the existential threat of AI, we always have three areas of concern. The first is the veracity and provenance of information; the second is losing control of automated weapons; and the third, importantly in this case, is the many millions of jobs that will be lost, leaving human beings without ways to earn money or, perhaps, a reason for being.

There are two prevailing views on this. One is that of Elon Musk, who, without telling us how we might put food on the table, pronounced to the Prime Minister

“There will come a point where no job is needed – you can have a job if you want one for personal satisfaction but AI will do everything”.


The other, more optimistic view is that boring or repetitive work will go, which is, in part, beautifully illustrated by David Runciman’s recent book, The Handover, where he details the fate of sports officials. In 2021, Australian and US line judges were replaced by computers, while Wimbledon chose to keep them—largely for aesthetic reasons, because of the lovely Ralph Lauren white against the green grass. Meanwhile, Carl Frey and Michael Osborne, in their much-publicised 2017 study assessing the susceptibility of 702 different jobs to computerisation, suggested that sports officials had a 98% probability of being computerised.

In fact, since 2017, automation has come to all kinds of sports but, as Runciman says,

“Cricket matches, which traditionally featured just two umpires, currently have three to manage the complex demands of the technology, plus a referee to monitor the players’ behaviour”.


Soccer has five, plus large teams of screen watchers needed to interpret—very often badly—replays provided by VAR. The NBA Replay Center in Secaucus employs 25 people in a NASA-like control room, along with a rota of regular match officials.

It would be a fool who would bet that Elon Musk is entirely wrong, but nor should we rely on the fact that all sectors will employ humans to watch over the machines, or even that human beings will find that being the supervisor of a machine, or simply making an aesthetic contribution rather than being a decision-maker, is a good result. It is more likely that the noble Lord, Lord Knight, is correct that the algorithm will indeed be supervising the human beings.

I believe that the noble Lord, Lord Clement-Jones, and his co-author, the noble Lord, Lord Knight, may well prove to be very prescient in introducing this group of amendments, which thoughtfully suggest at every stage of the Bill that the CMA should take the future of work and the impact on work into account in coming to a decision. As the noble Lord made clear in setting out each amendment, digital work is no longer simply gig work, and the concentration of behemoth companies in digital markets has had, and will continue to have, huge consequences for jobs across supply lines, for wages within markets and, most particularly, for terms of employment and access to work.

AI is, without question, the next disruptor. Those companies that own the technology will be dominant across multiple markets, if not every market, and for the CMA to have a mandate to consider the impact on the workforce is more than sensible, more than foresightful; it is in fact a new reality. I note that the Minister, in responding to the last group, mentioned the importance of foreseeable and existing trends: here we have one.

--- Later in debate ---
I simply make the point to the Minister that, if the Government have the opportunity to think between Committee and Report how they might find a way to level the playing field and amend the legislation accordingly, it would be welcomed across all sides of the House.
Baroness Kidron (CB)

My Lords, I do not actually have much to add to the excellent case that has already been made, but I, too, was at the meeting that the noble Baroness, Lady Jones of Whitchurch, mentioned, and noticed the CMA’s existing relationships.

Quite a lot has been said already, on the first group and just now, about lobbying—not lobbying only in a nasty sense but perhaps about the development of relationships that are simply human. I want to make it very clear that those words do not apply to the CMA specifically—but I have worked with many regulators, both here and abroad, and it starts with a feeling that the regulated, not the regulator, holds the information. It goes on to a feeling that the regulated, not the regulator, has the profound understanding of the limits of what is possible. It then progresses to a working relationship in which the regulator, with its limited resources, starts to weigh up what it can win, rather than what it should demand. That results in communities that have actually won legal protections remaining unprotected. It is a sort of triangulation of purpose, in which the regulator’s primary relationship ends up being geared towards government and industry, rather than towards the community that it is constituted to serve.

In that picture, I feel that the amendments in the name of the noble Baroness, Lady Jones of Whitchurch, make it clear, individually and collectively, that at every stage maximum transparency must be observed, and that the incumbents should be prevented from holding all the cards—including by hiding information from the regulator or from other stakeholders who might benefit from it.

I suggest that the amendments do not solve the problem of lobbying or obfuscation, but they incentivise providing information and they give challengers a little bit more of a chance. I am sure we are going to say again and again in Committee that information is power. It is innovation power, political power and market power. I feel passionately that these are technical, housekeeping amendments rather than ones that require any change of government policy.

Lord Clement-Jones (LD)

My Lords, it is a pleasure to follow the noble Baroness, Lady Kidron, whose speech segues straight into my Amendments 14 and 63. This is all about the asymmetry of information. On the one hand, the amendments from the noble Baroness, Lady Jones, which I strongly support and have signed, are about giving information to challengers, whereas my amendments are about extracting information from SMS undertakings.

Failure to respond to a request for information allows SMS players to benefit from the information asymmetry that exists in all technology markets. Frankly, incumbents know much more about how things work than the regulators. They can delay, obfuscate, claim compliance while not fully complying and so on. By contrast, if they cannot proceed unless they have supplied full information, their incentives are changed. They have an incentive to fully inform, if they get a benefit from doing so. That is why merger control works so well and quickly, as the merger is suspended pending provision of full information and competition authority oversight. We saw that with the Activision Blizzard case, where I was extremely supportive of what the CMA did—in many ways, it played a blinder, as was subsequently shown.

We on these Benches consider that a duty to fully inform is needed in the Bill, which is the reason for our Amendments 14 and 63. They insert a new clause in Chapter 2, which provides for a duty to disclose to the CMA

“a relevant digital activity that may give rise to actual or likely detrimental impact on competition in advance of such digital activity’s implementation or effect”

and a related duty in Chapter 6 ensuring that that undertaking

“has an overriding duty to ensure that all information provided to the CMA is full, accurate and complete”.

Under Amendment 14, any SMS undertaking wishing to rely on it must be required to both fully inform and pre-notify the CMA of any conduct that risks breaching one of the Bill’s objectives in Clause 19. This is similar to the tried-and-tested pre-notification process for mergers and avoids the reality that the SMS player may otherwise simply implement changes and ignore the CMA’s requests. A narrow pre-notification system such as this avoids the risks.

We fully support and have signed the amendments tabled by the noble Baroness, Lady Jones. As techUK says, one of the benefits that wider market participants see from the UK’s pro-competition regime is that the CMA will initiate and design remedies based on the evidence it gathers from SMS firms in the wider market. This is one of the main advantages of the UK’s pro-competition regime over the EU DMA. To achieve this, we need to make consultation rights equal for all parties. Under the Bill currently, firms with SMS status, as the noble Baroness, Lady Harding, said, will have far greater consultation rights than those that are detrimentally affected by their anti-competitive behaviour. As she and the noble Lord, Lord Vaizey, said, there are opportunities for SMS firms to comment at the outset but none for challenger firms, which can comment only at a later public consultation stage.

It is very important that there are clear consultation and evidence-gathering requirements for the CMA, which must ensure that it works fairly with SMS firms, challengers, smaller firms and consumers throughout the process. The design of conduct requirements applying to SMS firms, and of pro-competition interventions, should consider evidence from all sides, allowing interventions to be targeted and capable of delivering effective outcomes. This kind of engagement will be vital to ensuring that the regime can meet its objectives.

We do not believe that addressing this risk requires removing the flexibility given by the Bill. Instead, we believe that it is essential that third parties are given a high degree of transparency and input on deliberation between the CMA and SMS firms. The CMA must also—and I think this touches on something referred to by the noble Baroness, Lady Jones—allow evidence to be submitted in confidence, as well as engage in wider public consultations where appropriate. We very strongly support the amendments.

The amendments from the noble Lord, Lord Tyrie, are a bit of a curate’s egg. I support Amendments 12A and 12B because I can see the sense in them. However, I do not see that we need another way of marking the CMA’s homework. I am a great believer that we need greater oversight, and we have amendments later in the Bill proposing to increase parliamentary oversight of what the CMA is doing. However, marking the CMA’s homework at that stage would only be an impediment. It would be for the benefit of the SMS undertakings and not necessarily for those who wish to challenge their power. I am only 50% with the noble Lord, rather than going the whole hog.

--- Later in debate ---
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, all the SMS has to do is put it through one of its large language models, and hey presto.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - -

I am losing track of the conversation because I thought we were asking for more information for the challenger companies, rather than this debate between the SMS and the regulator. Both of them are, I hope, well resourced, but the challenger companies have somehow been left out of this equation and I feel that we are trying to get them into the equation in an appropriate way.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

That is not incompatible. These are two sides of the same coin, which is why they are in this group. I suppose we could have degrouped it.

--- Later in debate ---
Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - - - Excerpts

My Lords, I shall also discuss the leveraging or whack-a-mole provisions. Perhaps Conservative Peers today are London buses: this is the fourth London bus to make the same point. I too would have added my name to my noble friend Lord Vaizey’s amendment had I been organised enough.

I shall make a couple of points. The noble Lord, Lord Tyrie, said earlier that we are all here on the Bill because harm has already been done. If noble Lords will forgive me, I will tell a little story. In 2012, I went on a customer trip to Mountain View, Google’s headquarters in California, as the chief executive of TalkTalk. We were in the early days of digital advertising and TalkTalk was one of its biggest customers. A whole group of customers went on what people now call a digital safari to visit California and see these tech companies in action.

I will never forget that the sales director left us for a bit while some engineers from Google’s head office in Mountain View demoed a new functionality they were working on to enable you to easily access price comparisons for flights. It was an interesting demo because some of the other big customers of Google search at the time were independent flight search websites, whose chief executives had been flown out by Google to see all the new innovation. The blood drained from their faces as this very well-meaning engineer described and demoed the new functionality and explained how, because Google controlled the page, it would be able to promote its flight search functionality to the top of the page and demote the companies represented in the room. When the sales director returned, it was, shall we say, quite interesting.

I tell that tale because there are many examples of these platforms leveraging the power of their platform to enter adjacent markets. As my noble friend has said, that gets to the core of the Bill and how important it is that the CMA is able to impose conduct requirements without needing to go through the whole SMS designation process all over again.

I know that the tech firms’ counterargument to this is that it is important that they have the freedom to innovate, and that for a number of them this would somehow create “a regulatory requirement to seek permission to innovate”. I want to counter that: we want all companies in this space to have the freedom to innovate, but they should not have the freedom to prioritise their innovation on their monopoly platform over other people’s innovation. That is why we have to get a definition of the leveraging principle, or the whack-a-mole principle, right. As with almost all the amendments we have discussed today, I am not particularly wedded to the specific wording, but I do not think that the Bill as it is currently drafted captures this clearly enough, and Amendments 25, 26, and 27 get us much closer to where we need to be.

I, too, add my voice in support of my noble friend Lord Lansley’s amendments. I must apologise for not having studied them properly in advance of today, but my noble friend introduced them so eloquently that it is very clear that we need to put data clearly in the Bill.

Finally, as a member of my noble friend’s Communications and Digital Committee, I, too, listened very carefully to the comments made by the noble Lord, Lord Clement-Jones, about copyright. I feel this is a very big issue. Whether this is the right place to address it, I do not know, but I am sure he is right that we need to address it somehow.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - -

My Lords, I am sorry to break the Conservative bus pattern but I, too, will speak to Amendments 26 and 27, to which I have added my name, and to Amendment 30. Before I do, I was very taken by the amendments spoken to by the noble Lord, Lord Lansley, and I support them. I feel somewhat sheepish that I had not seen the relationship between data and the Bill, having spent most of the past few months with my head in the data Bill. That connection is hugely important, and I am very grateful to the noble Lord for making such a clear case. In supporting Amendments 26 and 27, I recognise the value of Amendment 25, tabled by the noble Lord, Lord Vaizey, and put on record my support for the noble Lord, Lord Holmes, on Amendment 24. So much has been said that we have managed to change the name of the leveraging principle to the whack-a-mole principle and everything that has been said has been said very well.

The only point I want to make on these two amendments, apart from echoing the profound importance of which other noble Lords have already spoken, is that the ingenuity of the sector has always struck me as being equally divided between its incredible creativity in creating new products, things for us to do and services it can provide, and an equal ingenuity in avoiding regulation of all kinds in all parts of the world. Unless we capture not only the designated activity but also the adjacent activities that the sector controls, we lose the core purpose of the Bill. At one point I thought it might help the Minister to see that the argument he made in relation to Clause 6(2) and (3), which was in defence of some flexibility for the Secretary of State, might equally be made on behalf of the regulator in this case.

Turning briefly to Amendment 30 in the name of the noble Lord, Lord Clement-Jones, I first have to make a slightly unusual declaration in that my husband was one of the Hollywood writers who went on strike and won a historic settlement to be a human being in charge of their AI rather than at the behest of the AI. Not only in the creative industries but in academia, I have seen first-hand the impact of scraping information. Not only is the life’s work of an academic taken without permission; it is then regurgitated as an inaccurate mere guess, undermining the very purpose of academic distinction. There is clearly a copyright issue that requires an ability both to opt out and correct, and to share in the upside, as the noble Lord pointed out.

I suggest that the LLMs and general AI firms have taken the axiom “it’s better to ask forgiveness than permission” to unbelievable new heights. Our role during the passage of this Bill may be to turn that around and say that it is better to ask permission than forgiveness.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, we have had a wonderfully eclectic debate. I am sorry if we gave some of the amendments more attention than others, because we have a number of very important issues here. Even in my response I may not be giving some colleagues due deference for their hard work and the good arguments they have put forward.

As noble Lords have commented, Amendments 26, 27 and 34 are in my name. As we have discussed, Amendments 26 and 27 would ensure that the CMA can tackle anti-competitive conduct in non-designated activity, provided that this conduct is related to designated activity. This would ensure, for example, that a designated company facing conduct requirements could not simply shift the resources of its business into another similar business venture, which would have a similar outcome of anti-competitive behaviour.

I am very grateful to the noble Baroness, Lady Stowell, for her support. The example she gave of Apple resonates with all of us and has obviously been in the news. It was one of the behaviours I described as rather vindictive in the last debate. I am not sure how much extra money Apple is going to make from it, but it is a question of rubbing someone’s nose in it because you do not like the decision that has been made. I feel that we need to address this issue.

The noble Lord, Lord Vaizey, in his Amendment 25, made a very similar point about the leveraging principle. We have all signed up to “the whack-a-mole principle”; I think we will call it that from now on. As the noble Baroness, Lady Harding, made clear, this is about addressing the leveraging of SMS market power to enter adjoining markets. She gave the example of travel price comparison. I feel that is lazy innovation; if you get so big, you stop innovating—you simply copy the competing firms and take their best ideas. It is in all our interests to get a grip on this, so that these companies that have great resources and great capacity for innovation innovate in a creative way rather than just copying other people’s ideas.

Amendment 34, which is also in our names, would enable the CMA to keep conduct requirements under review and take account of whether those requirements are having their intended effects or whether further pro-competition intervention is necessary. It would provide a clearer link between the measures available to the CMA. As the noble Lord, Lord Clement-Jones, and others have said, it underpins the importance of interoperability in CMA decisions. We believe that the amendments help to clarify and reinforce the powers available to the CMA.

I listened carefully to the noble Lord, Lord Holmes, who, as ever, provided enormous insight into the tech world and the consequences of the legislation. We share his objective of getting the powers of the CMA in the right balance. His amendment challenges the Government to explain why the CMA can only impose a conduct requirement to achieve the fair dealing, open choice or trust and transparency objectives—which seems to be overly restrictive and open to legal challenge. We look forward to hearing the Minister’s explanation of why those restrictions were felt necessary. The noble Lord, Lord Holmes, also raised an important point in his Amendment 24, which we have not given sufficient weight to, about the need for those conduct requirements to deliver proper accessibility in line with previous legislation. We absolutely support him in that quest.

The amendments from the noble Lords, Lord Clement-Jones and Lord Lansley, raise important points about transparency and improved data. They stress the importance of portability and interoperability and put data firmly into the conduct requirements. We support those arguments and look forward to the Minister’s response to what we feel are common-sense proposals.

Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - -

My Lords, I declare my interests set out in full on the register, including as an advisor to the Institute for Ethics in AI at Oxford University, chair of the Digital Futures for Children centre at the LSE and chair of the 5Rights Foundation. I add my welcome to my noble friend Lord de Clifford, who I had the pleasure of meeting yesterday, and I look forward to his maiden speech.

I start by quoting Marcus Fysh MP who said in the other place:

“this is such a serious moment in our history as a species. The way that data is handled is now fundamental to basic human rights … I say to those in the other place as well as to those on the Front Benches that … we should think about it incredibly hard. It might seem an esoteric and arcane matter, but it is not. People might not currently be interested in the ins and out of how AI and data work, but in future you can bet your bottom dollar that AI and data will be interested in them. I urge the Government to work with us to get this right”.—[Official Report, Commons, 29/11/23; col. 878.]

He was not the only one on Report in the other place who was concerned about some of the provisions in the Bill, who bemoaned the lack of scrutiny and urged the Government to think again. Nor was he the only one who reluctantly asked noble Lords to send the Bill back to the other place in better shape.

I associate myself with the broader points made by both noble Lords who have already spoken—I do not think I disagreed with a word that they said—but my own comments will primarily focus on the privacy of children, the case for data communities, access for researchers and, indeed, the promises made to bereaved parents and then broken.

During the passage of the Data Protection Act 2018, your Lordships’ House, with cross-party support, introduced the age appropriate design code, a stand-alone data protection regime for the under-18s. The AADC’s privacy by design approach ushered in a wave of design change to benefit children: TikTok and Instagram disabled direct messaging from unknown adults to children; YouTube turned off auto-play; Google turned on safe search by default for children; 18-plus apps were taken out of the Play Store; TikTok stopped notifications through the night; and Roblox stopped tracking and targeting children for advertising. These were just a handful of hundreds of changes to products and services likely to be accessed by children. Many of these changes have been rolled out globally, meaning that while other jurisdictions cannot police the code, children in those places benefit from it. As the previous Minister, the noble Lord, Lord Parkinson, acknowledged, it contributes to the UK’s reputation for digital regulation and is now being copied around the globe.

I set this out at length because the AADC not only drove design change but established the crucial link between privacy and safety. That is why it is hugely concerning that children have not been explicitly protected from changes in the Bill that lessen user data protections. I have given Ministers notice that I will seek to enshrine the principle that children have the right to a higher bar of data protection by design and default; to define children’s data as sensitive personal data in the Bill; and to exclude children from proposals that risk eroding the impact of the AADC, notably in risk assessments, automated processing, onward processing, direct marketing and the extended research powers of commercial companies.

Minister Paul Scully said at Second Reading in the other place:

“We are committed to protecting children and young people online … organisations will still have to abide by our Age-appropriate design code”.—[Official Report, Commons, 17/4/23; col. 101.]

I take it from those words that any perceived or actual diminution of children’s data rights is inadvertent, and that it remains the Government’s policy not to weaken the AADC as currently configured. Will the Minister confirm that it is indeed the Government’s intention to protect the AADC, and that he is willing to work with me to ensure that that is the outcome? I will also seek a requirement for the ICO to create a statutory children’s code in relation to AI. The ubiquitous deployment of AI technology to recommend and curate is nothing new, but the rapid advances in generative AI capabilities mark a new stage in its evolution. In the hundreds of pages of the ICO’s non-binding Guidance on AI and Data Protection, its AI and Data Protection Risk Toolkit and its advice to developers on generative AI, there is but one mention of the word “child”—in a case study about child benefit.

The argument made was that children are covered by the AADC, which underlines again just how consequential it is. However, since adults are covered by data law and yet it is considered necessary to have specific AI guidance for them, the one in three users who are under 18 deserve the same consideration. I am not at liberty to say more today, but later this week—perhaps as early as tomorrow—information will emerge that underlines the urgent need for specific consideration of children’s safety in relation to generative models. I hope that the Minister will agree that an AI code for kids is an imperative rather than a nice-to-have.

Similarly, we must deliver data privacy to children in education settings. Given the extraordinary rate at which highly personal data seeps out of schools into the commercial world, including to gambling companies and advertisers, coupled with the scale of tech adoption in schools, it is untenable to continue to treat tech inside school as a problem for schools and tech outside school as a problem for regulators. The spectre of a nursery teacher having enough time and knowledge to interrogate the data protection terms of a singing app, or the school ICT lead having to tackle global companies such as Google and Microsoft to set the terms for their students’ privacy, is frankly ridiculous, but that is the current reality. Many school leaders feel abandoned by the Government’s insistence that they should be responsible for data protection when both the AADC and the Online Safety Act have been introduced but they benefit from neither. It should be the role of the ICO to set data standards for edtech and to ensure that providers are held to account if they fall short. As it stands, a child enjoys more protection on the bus to school than in the classroom.

Finally on issues relating to children, I want to raise a technical issue around the production of AI-generated child sexual abuse material. I recognise the Government’s exemplary record on tackling CSAM but, unfortunately, innovation does not stop. While AI-generated child sexual abuse content is firmly in scope of UK law, it appears that the models or plug-ins trained on CSAM, or trained to generate it, are not. At least four laws, the earliest from 1978, are routinely used to bring criminal action against CSAM and the perpetrators of it, so I would be grateful if the Minister would agree to explore the issue with the police unit that has raised it with me and make an explicit commitment to close any gaps identified.

We are at an inflection point, and however esoteric and arcane the issues around data appear to be, to downgrade a child’s privacy even by a small degree has huge implications for their safety, identity and selfhood. If the Government fail to protect and future-proof children’s privacy, they will be simply giving with one hand in the OSA and taking away with the other in this Bill.

Conscious that I have had much to say about children, I will briefly put on the record issues that we can debate at greater length in Committee. While data law largely rests on the assumption of a relationship between an individual and a service, we have seen over a couple of decades that power lies in having access to large datasets. The Bill offers a wonderful opportunity to put that data power in the hands of new entrants to the market, be they businesses or communities, by allowing the sharing of individual data rights and being able to assign data rights to third parties for agreed purposes. I have been inspired by approaches coming out of academia and the third sector which have supported the drafting of amendments to find a route that would enable the sharing of data rights.

Similarly, as the noble Lord, Lord Knight, said, we must find a route to access commercial data sets for public interest research. I was concerned that in the other place, when the former Secretary of State Jeremy Wright queried why a much-touted research access had not materialised in the Bill, the Minister appeared to suggest that it was covered. The current drafting embeds the asymmetries of power by allowing companies to access user data, including for marketing and creating new products, but does not extend access for public interest research into the vast databases held by those same companies. There is a feeling of urgency emerging as our academic institutions see their European counterparts gain access to commercial data because of the DSA. There is an increased need for independent research to support our new regulatory regimes such as the Online Safety Act. This is an easy win for the Government and I hope that they grasp it.

Finally, I noted very carefully the words of the Minister when he said, in relation to a coroner’s access to data, that the Secretary of State had made an offer to fill the gap. This is a gap that the Government themselves created. During the passage of the Online Safety Act we agreed to create a humane route to access data when a coroner had reason to suspect that a regulated company might have information relevant to the death of a child. The Government have reneged by narrowing the scope to those children taking their own life. Expert legal advice says that there are multiple scenarios under which the Government’s narrowing scope creates a gaping hole in provision for families of murdered children and has introduced uncertainty and delay in cases where it may not be clear how a child died at the outset.

I must ask the Minister what the Government are trying to achieve here and who they are trying to please. Given the numbers, narrowing scope is unnecessary, disproportionate and egregiously inhumane. This is about parents of murdered children. The Government lack compassion. They have created legal uncertainty and betrayed and re-traumatised a vulnerable group to whom they made promises. As we go through this Bill and the competition Bill, the Minister will at some points wish the House to accept assurances from the Dispatch Box. The Government cannot assure the House until the assurances that they gave to bereaved parents have been fulfilled.

I will stop there, but I urge the Minister to respond to the issues that I have raised rather than leave them for another day. The Bill must uphold our commitment to the privacy and safety of children. It could create an ecosystem of innovative data-led businesses and keep our universities at the forefront of tech development and innovation. It simply must fulfil our promise to families who this Christmas and every other Christmas will be missing a child without ever knowing the full circumstances surrounding that child’s death. That is the inhumanity that we in this House promised to stop—and stop it we must.

Artificial Intelligence: Regulation

Baroness Kidron Excerpts
Tuesday 14th November 2023


Lords Chamber
Viscount Camrose Portrait Viscount Camrose (Con)
- View Speech - Hansard - - - Excerpts

I think there are two things. First, we are extremely keen, and have set this out in the White Paper, that the regulation of AI in this country should be highly interoperable with international regulation—I think all countries regulating would agree on that. Secondly, I take some issue with the characterisation of AI in this country as unregulated. We have very large areas of law and regulation to which all AI is subject. That includes data protection, human rights legislation, competition law, equalities law and many other laws. On top of that, we have the recently created central AI risk function, whose role is to identify risks appearing on the horizon, or indeed cross-cutting AI risks, to take that forward. On top of that, we have the most concentrated and advanced thinking on AI safety anywhere in the world to take us forward on the pathway towards safe, trustworthy AI that drives innovation.

Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - -

My Lords, given the noble Viscount’s emphasis on the gathering of evidence and evidence-based regulation, can we anticipate having a researchers’ access to data measure in the upcoming Data Protection and Digital Information Bill?

Viscount Camrose Portrait Viscount Camrose (Con)
- View Speech - Hansard - - - Excerpts

I thank the noble Baroness for her question and recognise her concern. In order to be sure that I answer the question properly, I undertake to write to her with a full description of where we are and to meet her to discuss further.

King’s Speech

Baroness Kidron Excerpts
Tuesday 14th November 2023


Lords Chamber
Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - -

My Lords, I too welcome the right reverend Prelate the Bishop of Newcastle. I admire her bravery in wearing the colours of Sunderland and Newcastle simultaneously.

I declare my interests as chair of 5Rights Foundation, chair of the Digital Futures Commission at the LSE and adviser to the Institute for Ethics in AI at Oxford. Like others, I will start with Bletchley Park. The summit was kicked off by the Prime Minister, who set out his hopes for an AI-enabled world while promising to tackle head-on its potential dangers. He said:

“Criminals could exploit AI for cyber-attacks, disinformation, fraud, or even child sexual abuse”—


but these are not potential dangers; they exist here and now.

In the race for AI prominence and the vast riches the technology promises, the tech leaders came to town warning us that the future they are creating is untrammelled, unprincipled and insecure, and that AI will overwhelm human agency. That language of existential threat makes for fabulous headlines, but it rather disempowers the rest of us. For if we ask whether we want to supercharge the creation of child sexual abuse material, I would hazard a guess that the answer will be no; whether it is okay for facial recognition trained on white faces to prevent a black parent or child getting a security pass to enter a school, again no; or whether we believe that just because something is technically possible—the creation of a disease or a weapon—it should be done, again no. Indeed, we have a record of containing the distribution of inventions that have the capability of annihilating us.

AI is not separate and different, and the language that we use to describe it—either its benefits or threats—must make that clear. AI is built, used and purveyed by business, government, civil society and even criminals. It is part of the human arrangements over which, for the moment, we still have agency. Language that disempowers us is part of the deliberate strategy of tech exceptionalism, advocated by industry lobbyists over decades, which has successfully secured the privatisation of technology, creating untold wealth for a few while outsourcing the cost to society. Who owns AI, who benefits, who is responsible and who gets hurt is still in the balance and I would assert that these are questions that we must deal with here and now.

I was disappointed to hear the noble Viscount say earlier at Questions that the Government were taking a sit-back-and-wait approach, so I have three rather more modest questions for the Minister, each of which could be tackled here and now. The first is: what plans do the Government have to ensure the robust application of our existing laws? As we saw earlier, the large language models and image creation services have used copyright material at scale. Getty Images has been testing it in court on behalf of its artists and photographers, but other rights holders, including some of the world’s finest authors, are unable to challenge this on an individual basis while their art and livelihood are scraped into vast datasets from which they do not benefit. I ask the Minister whether it would be a good idea to have an analysis of how new models are failing to uphold existing law and rights obligations as a first and urgent task for the new AI Safety Institute.

Secondly, how do the Government plan to use their legislative programme to tackle gaps that have been identified? For example, the creation, distribution and consumption of CSAM content is illegal, covered by at least three separate laws in the UK. But not one of these laws covers the models or plug-ins that create CSAM at scale—in one case, more than 20,000 images in a matter of hours—so the upcoming data protection Bill provides us with an opportunity to make training, sharing and possessing software that is trained on or trained to produce CSAM content an offence.

Also on the Prime Minister’s list is disinformation. Synthetic information that passes for real is also a here and now problem: the London Mayor, whose voice was fabricated, celebrities falsely endorsing products or a child’s picture scraped from a school website to train those aforesaid CSAM models. The loss of control of one’s personhood carries with it a democratic deficit and potentially overwhelming individual suffering. I ask the Minister whether the Government are willing to put beyond doubt that AI-generated biometric and image data constitutes a form of personal data over which an individual, whether adult or child, has rights, including the right to object to its use.

Both the data Bill and the digital markets Bill could create new data models—a subject that the noble Baroness, Lady Stowell, articulated very well in a recent article in the Times. New approaches to data rights, with new owners of data, are one way of having a voice in our AI-enabled future.

Thirdly and finally, I would like to ask the Minister why the Government have left children on the margins. I attended two official fringe events of the summit, one hosted by the then Home Secretary about child sexual abuse, the other convened by St Mary’s and the Turing Institute about embedding children’s rights in AI systems. Children are early adopters of technology—canaries in the coal mine—and many of us know the cost of poorly regulated digital environments for them. I am bewildered that, so soon after Royal Assent to the Online Safety Act, and in clear sight of the challenges that AI brings, the Government risk downgrading children’s data rights rather than explicitly protecting the age-appropriate design code and the definitions on which it is founded. Children should have been front and centre of the concerns at Bletchley, not pushed to the fringe, and perhaps the Minister could repair that damage by putting them front and centre of the new AI Safety Institute. After all, it is children who will inhabit the world we are building.

Finally, AI will create enormous benefits and upheaval across all sectors, but it also promises to put untold wealth and power in the hands of even fewer people. However, there are things in the here and now that we can do to ensure that technology innovates in ways that support human agency. It is tech exceptionalism that poses an existential threat to humanity, not the technology itself.