Grand Committee

My Lords, I rise to speak to my Amendment 11, to Amendments 14, 16, 17 and 18, and to whether Clauses 5 and 7 should stand part of the Bill. I will attempt to be as brief as I can, but Clause 5 involves rather a large number of issues.
Processing personal data is currently lawful only if it is performed for at least one lawful purpose, one of which is that the processing is for legitimate interests pursued by the controller or a third party, except where those interests are overridden by the interests or fundamental rights of the data subject. As such, if a data controller relies on their legitimate interest as a legal basis for processing data, they must conduct a balancing test between their interests and those of the data subject.
Clause 5 amends the UK GDPR’s legitimate interest provisions by introducing the concept of recognised legitimate interest, which allows data to be processed without a legitimate interest balancing test. This provides businesses and other organisations with a broader scope of justification for data processing. Clause 5 would amend Article 6 of the UK GDPR to equip the Secretary of State with a power to determine these new recognised legitimate interests. Under the proposed amendment, the Secretary of State must have regard to,
“among other things … the interests and fundamental rights and freedoms of data subjects”.
The usual legitimate interest test is much stronger: rather than merely a topic to have regard to, a legitimate interest basis cannot lawfully apply if the data subject’s interests override those of the data controller.
Annex 1, as inserted by the Bill, now provides a list of exemptions but is overly broad and vague. It includes national security, public security and defence, and emergencies and crime as legitimate interests for data processing without an assessment. The Conservative MP Marcus Fysh said on Third Reading:
“Before companies share data or use data, they should have to think about what the balance is between a legitimate interest and the data rights, privacy rights and all the other rights that people may have in relation to their data. We do not want to give them a loophole or a way out of having to think about that.” —[Official Report, Commons, 29/11/23; col. 896.]
I entirely agree with that.
The amendment in Clause 5 also provides examples of processing that may be considered legitimate interests under the existing legitimate interest purpose, under Article 6(1)(f), rather than under the new recognised legitimate interest purpose. These include direct marketing, intra-group transmission of personal data for internal administrative purposes, and processing necessary to ensure the security of a network.
The Bill also creates a much more litigious data environment. Currently, an organisation’s assessment of its lawful purposes for processing data can be challenged through correspondence or an ICO complaint, whereas, under the proposed system, an individual may be forced to mount a legal challenge to a statutory instrument in order to contest the basis on which their data is processed.
As I will explain later, our preference is that the clause not stand part, but I accept that there are some areas that need clarification and Amendment 11 is designed to do this. The UK GDPR sets out conditions in which processing of data is lawful. The Bill inserts in Article 6(1) a provision specifying that processing shall be lawful for the purposes of a recognised legitimate interest, as I referred to earlier, an example of which may be for the purposes of direct marketing.
Many companies obtain data from the open electoral register. The register is maintained by local authorities, which have the right to sell this data to businesses. Amendment 11 would insert new Article 6(1)(aa) and (ab), which provide that data processing shall be lawful where individuals have consented for their data
“to enter the public domain via a public body”,
or where processing is carried out by public bodies pursuant to their duties and rights, which may include making such data available to the public. Individuals are free to opt out of the open electoral register if they so wish, and it would be disproportionate—indeed, irritating to consumers—to notify those who have already consented to their data entering the public domain every time that data is processed.
On Amendment 14, as mentioned, the Bill would give the Secretary of State the power to determine recognised legitimate interests through secondary legislation, which is subject to minimal levels of parliamentary scrutiny. Although the affirmative procedure is required, this does not entail much scrutiny or much of a debate. The last time MPs did not approve a statutory instrument under the affirmative procedure was in 1978. In practice, interests could be added to this list at any time and for any reason, facilitating the flow and use of personal data for limitless potential purposes. Businesses could be obligated to share the public’s personal data with government or law enforcement agencies beyond what they are currently required to do, all based on the Secretary of State’s inclination at the time.
We are concerned that this Henry VIII power is unjustified and undermines the very purpose of data protection legislation, which is to protect the privacy of individuals in a democratic data environment, as it vests undue power over personal data rights in the Executive. This amendment is designed to prevent the Secretary of State from having the ability to pre-authorise data processing outside the usual legally defined route. It is important to avoid a two-tier data protection framework in which the Secretary of State can decide that certain processing is effectively above the law.
On Amendment 17, some of the most common settings where data protection law is broken relate to the sharing of HIV status of an individual living with HIV in their personal life in relation to employment, healthcare services and the police. The sharing of an individual’s HIV status can lead to further discrimination being experienced by people living with HIV and can increase their risk of harassment or even violence. The National AIDS Trust is concerned that the Bill as drafted does not go far enough to prevent individuals’ HIV status from being shared with others without their consent. They and we believe that the Bill must clarify what an “administrative purpose” is for organisations processing employees’ personal data. Amendment 17 would add wording to clarify that, in paragraph 9(b) of Article 6,
“intra-group transmission of personal data”
in the workplace, within an organisation or in a group of organisations should be permitted only for individuals who need to access an employee’s personal data as part of their work.
As far as Amendment 18 is concerned, as it stands Clause 5 gives an advantage to large undertakings with numerous companies that can transmit data intra-group purely because they are affiliated to one central body. However, this contradicts the repeated position of both the ICO and the CMA that first party versus third party is not a meaningful distinction when it comes to privacy risk: what matters is what data is processed, not the corporate ownership of the systems doing the processing. The amendment reflects the organisational measures that undertakings should have in place as safeguards: groups of undertakings transmitting data should be required to have organisational measures, established by contract, in order to take advantage of this transmission of data.
Then we come to the question of Clause 5 standing part of the Bill. This clause is unnecessary and creates risks. It is unnecessary because the legitimate interest balancing test is, in fact, flexible and practical; it already allows processing for emergencies, safeguarding and so on. It is risky because creating lists of specified legitimate interests inevitably narrows this concept and may make controllers less certain about whether a legitimate interest that is not a recognised legitimate interest can be characterised as such. In the age of AI, where change is exponential, we need principles-based and outcomes-based legislation that is flexible and can be supplemented with guidance from an independent regulator, rather than a system that requires the Government to legislate more, and faster, in order to catch up.
There is also a risk that the drafting of this provision does not dispense with the need to conduct a legitimate interest balancing test, because all the recognised legitimate interests contain a test of necessity. Established case law interprets the concept of necessity under data protection law as requiring a human rights balancing test to be carried out. This rather points to the smoke-and-mirrors effect of this drafting, which does nothing to improve legal certainty for organisations or protections for individuals.
I now come to Clause 7 standing part. This clause creates a presumption that processing will always be in the public interest or substantial public interest if done in reliance on a condition listed in proposed new Schedule A1 to the Data Protection Act 2018. The schedule will list international treaties that have been ratified by the UK. At present, the Bill lists only the UK-US data-sharing agreement as constituting relevant international law. Clause 7 seeks to remove the requirement for a controller to consider whether the legal basis on which they rely is in the public interest or substantial public interest, has appropriate safeguards and respects data subjects’ fundamental rights and freedoms. But the conditions in proposed new Schedule A1 in respect of the UK-US agreement also state that the processing must be necessary, as assessed by the controller, to respond to a request made under the agreement.
It is likely that a court would interpret “necessity” in the light of the ECHR. The court may therefore consider that the inclusion of a necessity test means that a controller would have to consider whether the UK-US agreement, or any other treaty added to the schedule, is proportionate to a legitimate aim pursued. Not only is it unreasonable to expect a controller to do such an assessment; it is also highly unusual. International treaties are drafted on a state-to-state basis and not in a way that necessarily corresponds clearly with domestic law. Further, domestic courts would normally consider the rights under the domestic law implementing a treaty, rather than having to interpret an international instrument without reference to a domestic implementing scheme. Being required to do so may make it more difficult for courts to enforce data subjects’ rights.
The Government have not really explained why it is necessary to amend the law in this way rather than simply implementing the UK-US agreement domestically. That would be the normal approach; it would remove the need to add this new legal basis and enable controllers to use the existing framework to identify a legal basis to process data in domestic law. Instead, this amendment makes it more difficult to understand how the law operates, which could in turn deter data sharing in important situations. Perhaps the Minister could explain why Clause 7 is there.
I beg to move.
My Lords, I rise to speak to Amendments 13 and 15. Before I do, let me say that I strongly support the comments of the noble Lord, Lord Clement-Jones, about HIV and the related vulnerability, and his assertion—almost—that Clause 5 is a solution in search of a problem. “Legitimate interest” is a flexible concept and I am somewhat bewildered as to why the Government are seeking to create change where none is needed. In this context, it follows that, were the noble Lord successful in his argument that Clause 5 should not stand part, Amendments 13 and 15 would be unnecessary.
On the first day in Committee, we debated a smaller group of amendments that sought to establish the principle that nothing in the Bill should lessen the privacy protections of children. In his response, the Minister said:
“if over the course of our deliberations the Committee identifies areas of the Bill where that is not the case, we will absolutely be open to listening on that, but let me state this clearly: the intent is to at least maintain, if not enhance, the safety and privacy of children and their data”.—[Official Report, 20/3/24; col. GC 75.]
I am glad the Minister is open to listening and that the Government’s intention is to protect children, but, as discussed previously, widening the definition of “research” in Clause 3 and watering down purpose limitation protections in Clause 6 negatively impacts children’s data rights. Again, in Clause 5, lowering the protections for all data subjects has consequences for children.
Indeed. Needless to say, we take the recommendations of the DPRRC very seriously, as they deserve. However, because this is an exhaustive list, and because the technologies and practices around data are likely to evolve very rapidly in ways we are unable currently to predict, it is important to retain as a safety measure the ability to update that list. That is the position the Government are coming from. We will obviously continue to consider the DPRRC’s recommendations, but that has to come with a certain amount of adaptiveness as we go. Any addition to the list would of course be subject to parliamentary debate, via the affirmative resolution procedure, as well as the safeguards listed in the provision itself.
Clause 50 requires the ICO and any other interested persons to be consulted before such regulations are made.
Amendments 15, 16, 17 and 18 would amend the part of Clause 5 that is concerned with the types of activities that might be carried out under the current legitimate interest lawful ground, under Article 6(1)(f). Amendment 15 would prevent direct marketing organisations relying on the legitimate interest lawful ground under Article 6(1)(f) if the personal data being processed related to children. However, the age and vulnerability in general of data subjects is already an important factor for direct marketing organisations when considering whether the processing is justified. The ICO already provides specific guidance for controllers carrying out this balancing test in relation to children’s data. The fact that a data subject is a child, and the age of the child in question, will still be relevant factors to take into account in this process. For these reasons, the Government consider this amendment unnecessary.
My Lords, am I to take it from that that none of the changes currently in the Bill will expose children on a routine basis to direct marketing?
As is the case today and will be going forward, direct marketing organisations will be required to perform the balancing test; and as in the ICO guidance today and, no doubt, going forward—
I am sorry if I am a little confused—I may well be—but the balancing test that is no longer going to be there allows a certain level of processing, which was the subject of the first amendment. The suggestion now is that children will be protected by a balancing test. I would love to know where that balancing test exists.
The balancing test remains there for legitimate interests, under Article 6(1)(f).
Amendment 16 seeks to prevent organisations that undertake third-party marketing relying on the legitimate interest lawful ground under Article 6(1)(f) of the UK GDPR. As I have set out, organisations can rely on that ground for processing personal data without consent when they are satisfied that they have a legitimate interest to do so and that their commercial interests are not outweighed by the rights and interests of data subjects.
Clause 5(4) inserts in Article 6 new paragraph (9), which provides some illustrative examples of activities that may constitute legitimate interests, including direct marketing activities, but it does not mean that they will necessarily be able to process personal data for that purpose. Organisations will need to assess on a case-by-case basis where the balance of interest lies. If the impact on the individual’s privacy is too great, they will not be able to rely on the legitimate interest lawful ground. I should emphasise that this is not a new concept created by this Bill. Indeed, the provisions inserted by Clause 5(4) are drawn directly from the recitals to the UK GDPR, as incorporated from the EU GDPR.
I recognise that direct marketing can be a sensitive—indeed, disagreeable—issue for some, but direct marketing information can be very important for businesses as well as individuals and can be dealt with in a way that respects people’s privacy. The provisions in this Bill do not change the fact that direct marketing activities must be compliant with the data protection and privacy legislation and continue to respect the data subject’s absolute right to opt out of receiving direct marketing communications.
Amendment 17 would make sure that the processing of employee data for “internal administrative purposes” is subject to heightened safeguards, particularly when it relates to health. I understand that this amendment relates to representations made by the National AIDS Trust concerning the level of protection afforded to employees’ health data. We agree that the protection of people’s HIV status is vital and that it is right that it is subject to extra protection, as is the case for all health data and special category data. We have committed to further engagement and to working with the National AIDS Trust to explore solutions in order to prevent data breaches of people’s HIV status, which we feel is best achieved through non-legislative means given the continued high data protection standards afforded by our existing legislation. As such, I hope that the noble Lord, Lord Clement-Jones, will agree not to press this amendment.
Amendment 18 seeks to allow businesses more confidently to rely on the existing legitimate interest lawful ground for the transmission of personal data within a group of businesses affiliated by contract for internal administrative purposes. In Clause 5, the list of activities in proposed new paragraphs (9) and (10) are intended to be illustrative of the types of activities that may be legitimate interests for the purposes of Article 6(1)(f). They are focused on processing activities that are currently listed in the recitals to the EU GDPR but are simply examples. Many other processing activities may be legitimate interests for the purposes of Article 6(1)(f) of the UK GDPR. It is possible that the transmission of personal data for internal administrative purposes within a group affiliated by contract may constitute a legitimate interest, as may many other commercial activities. It would be for the controller to determine this on a case-by-case basis after carrying out a balancing test to assess the impact on the individual.
Finally, I turn to the clause stand part debate that seeks to remove Clause 7 from the Bill. I am grateful to the noble Lord, Lord Clement-Jones, for this amendment because it allows me to explain why this clause is important to the success of the UK-US data access agreement. As noble Lords will know, that agreement helps the law enforcement agencies in both countries tackle crime. Under the UK GDPR, data controllers can process personal data without consent on public interest grounds if the basis for the processing is set out in domestic law. Clause 7 makes it clear that the processing of personal data can also be carried out on public interest grounds if the basis for the processing is set out in a relevant international treaty such as the UK-US data access agreement.
The agreement permits telecommunications operators in the UK to disclose data about serious crimes with law enforcement agencies in the US, and vice versa. The DAA has been operational since October 2022 and disclosures made by UK organisations under it are already lawful under the UK GDPR. Recent ICO guidance confirms this, but the Government want to remove any doubt in the minds of UK data controllers that disclosures under the DAA are permitted by the UK GDPR. Clause 7 makes it absolutely clear to telecoms operators in the UK that disclosures under the DAA can be made in reliance on the UK GDPR’s public tasks processing grounds; the clause therefore contributes to the continued, effective functioning of the agreement and to keeping the public in both the UK and the US safe.
For these reasons, I hope that the noble Lord, Lord Clement-Jones, will agree to withdraw his amendment.
My Lords, this whole area of democratic engagement is one that the Minister will need to explain in some detail. This is an Alice in Wonderland schedule: “These words mean what I want them to mean”. If, for instance, you are engaging with the children of a voter—at 14, they are children—is that democratic engagement? You could drive a coach and horses through Schedule 1. The Minister used the word “necessary”, but he must give us rather more than that. It was not very reassuring.
The Minister mentioned a presumption that the ICO will update its guidance. Is there a timeframe for that? Will the guidance be updated before this comes into effect? How does the age of 14 relate to the AADC, which sets the age of adulthood at 18?
Before the Minister replies, we may as well do the full round. I agree with him, in that I very much believe in votes at 16 and possibly younger. I have been on many a climate demonstration with young people of 14 and under, so they can be involved, but the issue here is bigger than age. The main issue is not age but whether anybody should be subjected to a potential barrage of material in which they have not in any way expressed an interest. I am keen to make sure that this debate is not diverted to the age question and that we do not lose the bigger issue. I wanted to say that I sort of agree with the Minister on one element.
A fair number of points were made there. I will look at ages under 16 and see what further steps, in addition to being necessary and proportionate, we can think about to provide some reassurance. Guidance would need to be in effect before any of this is acted on by any of the political parties. I and my fellow Ministers will continue to work with the ICO—
I am sorry to press the Minister, but does the Bill state that guidance will be in place before this comes into effect?
I am not sure whether it is written in the Bill. I will check, but the Bill would not function without the existence of the guidance.
Indeed. I will make absolutely sure that we provide a full answer. By the way, I sincerely thank the noble Lord for taking the time to go through what is perhaps not the most rewarding of reads but is useful none the less.
On the question of the ICO being responsible to Parliament, in the then Online Safety Bill and the digital markets Bill we consistently asked for regulators to be directly responsible to Parliament. If that is something the Government believe they are, we would like to see an expression of it.
I would be happy to provide such an expression. I will be astonished if that is not the subject of a later group of amendments. I have not yet prepared for that group, I am afraid, but yes, that is the intention.
My Lords, it is a pleasure to follow the noble Lord, Lord Bassam, who has already set out very clearly what the group is about. I will chiefly confine myself to speaking to my Amendment 38A, which seeks to put in the Bill a clear idea of what having a human in the loop actually means. We need to have a human in the loop to ensure that a human interpreted, assessed and, perhaps most crucially, was able to intervene in the decision and any information on which it is based.
Noble Lords will be aware of many situations that have already arisen in which artificial intelligence is used—I would say that what we are currently describing is called artificial intelligence but, in real terms, it is not truly that at all. What we have is a very large use of big data and, as the noble Lord, Lord Bassam, said, big data can be a very useful and powerful tool for many positive purposes. However, we know that the quality of decision-making often depends on the quality of the data going in. A human is able to see whether something looks amiss or wrong; there is a kind of intelligence that humans apply to this, which machines simply do not have the capacity for.
I pay tribute to Justice, the law reform and human rights organisation, which has produced an excellent briefing on the issues around Clause 14. It asserts that the clause, as currently written, inadequately protects individuals from automated harm.
The noble Lord, Lord Bassam, referred to the Horizon case in the UK; that is the obvious example but, while we may think first of some of the most vulnerable people in the UK, the Robodebt case in Australia is another in which crunching big data, and then cracking down on individuals, had truly awful outcomes. We know that there is a real risk of unfairness and discrimination in the use of these kinds of tools. I note that the UK has signed the Bletchley declaration, which says that
“AI should be designed, developed, deployed, and used, in a manner that is … human-centric, trustworthy and responsible”.
I focus particularly on “human-centric”: human beings can sympathise with and understand other human beings in a way that big data simply does not.
I draw a parallel with something covered by a special Select Committee of your Lordships’ House, last year: lethal autonomous weapon systems, or so-called killer robots. This is an obvious example of where there is a very strong argument for having a human in the loop, as the terminology goes. From the last I understood and heard about this, I am afraid that the UK Government are not fully committed to a human in the loop in the case of killer robots, but I hope that we get to that point.
When we talk about how humans’ data is used and managed, we are also talking about situations that are—almost equally—life and death: whether people get a benefit, whether they are fairly treated and whether they do not suddenly disappear off the system. Only this morning, I was reading a case study of a woman aged over 80, highlighting how she had been through multiple government departments, but could not get her national insurance number. Without a national insurance number, she could not get the pension to which she was entitled. If there is no human in the loop to cut through those kinds of situations, there is a real risk that people will find themselves just going around and around machines—a circumstance with which we are personally all too familiar, I am sure. My amendment is an attempt to put a real explanation in the Bill for having that human in the loop.
My Lords, the number of amendments proposed to Clause 14 reflects the Committee’s very real concern about the impact of automated decision-making on the privacy, safety and prospects of UK data subjects. I have specific amendments in groups 7 and 8, so I will speak to the impact of Clause 14 on children later. I will again be making arguments about the vulnerability of these systems in relation to the Government’s proposals on the DWP.
Without repeating the arguments made, I associate myself with most of the proposals and the intention behind them—the need to safeguard the prospects of a fair outcome when algorithms hold sway over a person’s future. It seems entirely logical that, if the definition of solely automated decision-making requires “no meaningful human involvement”, we should be clear, as Amendment 40 proposes, about what is considered “meaningful”, so that the system cannot be gamed by providing human involvement that is an ineffective safeguard and is therefore not meaningful.
I have sympathy with many of these amendments—Amendments 38A, 39, 47, 62, 64 and 109—and ultimately believe, as was suggested by the noble Lord, Lord Bassam, that it is a matter of trust. I refer briefly to the parliamentary briefing from the BMA, which boldly says that:
“Clause 14 risks eroding trust in AI”.
That would be a very sad outcome.
My Lords, we have heard some powerful concerns on this group already. This clause is in one of the most significant parts of the Bill for the future. The Government’s AI policy is of long standing. They started it many years ago, then had a National AI Strategy in 2021, followed by a road map, a White Paper and a consultation response to the White Paper. Yet this part of the Bill, which is overtly about artificial intelligence and automated decision-making, does not seem to be woven into their thinking at all.
My Lords, the amendments in this group highlight that Clause 14 lacks the necessary checks and balances to uphold equality legislation, individual rights and freedoms, data protection rights, access to services, fairness in the exercise of public functions and workers’ rights. I add my voice to that of the noble Lord, Lord Clement-Jones, in his attempt to make Clause 14 not stand part, which he will speak to in the next group.
I note, as the noble Lord, Lord Bassam, has, that all the current frameworks have fundamental rights at their heart, whether it is the White House blueprint, the UN Secretary-General’s advisory body on AI, with which I am currently involved, or the EU’s AI Act. I am concerned that the UK does not want to work within this consensus.
With that in mind, I particularly note the importance of Amendment 41. As the noble Lord said, we are all supposed to adhere to the Equality Act 2010. I support Amendments 48 and 49, which are virtually interchangeable in wanting to ensure that the standard of decisions being “solely” based on automated decision-making cannot be gamed by adding a trivial human element to avoid that designation.
Again, I suggest that the Government cannot have it both ways—with nothing diminished but everything liberated and changed—so I find myself in agreement with Amendment 52A and Amendment 59A, which is in the next group, from the noble Lord, Lord Holmes, who is not in his place. These seek clarity from the Information Commissioner.
I turn to my Amendment 46. My sole concern is to minimise the impact of Clause 14 on children’s safety, privacy and life chances. The amendment provides that a significant decision about a data subject must not be based solely on automated processing if
“the data subject is a child or may be a child unless the provider is satisfied that the decision is in, and compatible with, the best interests of a child”,
taking into account the full gamut of their rights and development stage. Children have enhanced rights under the UNCRC, to which the UK is a signatory. Due to their evolving capacities as they make the journey from infancy to adulthood, they need special protections. If their rights are diminished in the digital world, their rights are diminished full stop. Algorithms determine almost every aspect of a child’s digital experience, from the videos they watch to their social network and from the sums they are asked to do in their maths homework to the team they are assigned when gaming. We have seen young boys wrongly profiled as criminal and girls wrongly associated with gangs.
In a later group, I will speak to a proposal for a code of practice on children and AI, which would codify standards and expectations for the use of AI in all aspects of children’s lives, but for now, I hope the Minister will see that, without these amendments to automated decision-making, children’s data protection will be clearly weakened. I hope he will agree to act to make true his earlier assertion that nothing in the Bill will undermine child protection. The Minister is the Minister for AI. He knows the impact this will have. I understand that, right now, he will probably stick to the brief, but I ask him to go away, consider this from the perspective of children and parents, and ask, “Is it okay for children’s life chances to be automated in this fashion?”
My Lords, I will speak to my Amendment 48. By some quirk of fate, I failed to sign up to the amendments that the noble Lord, Lord Bassam, so cogently introduced. I would have signed up if I had realised that I had not, so to speak.
It is a pleasure to follow the noble Baroness, Lady Kidron. She has a track record of being extremely persuasive, so I hope the Minister pays heed in what happens between Committee and Report. I very much hope that there will be some room for manoeuvre and that there is not just permanent push-back, with the Minister saying that everything is about clarifying and us saying that everything is about dilution. There comes a point when we have to find some accommodation on some of these areas.
Amendments 48 and 49 are very similar—I was going to say, “Great minds think alike”, but I am not sure that my brain feels like much of a great mind at the moment. “Partly” or “predominantly” rather than “solely”, if you look at it the other way round, is really the crux of what I think many of us are concerned about. It is easy to avoid the terms of Article 22 just by slipping in some sort of token human involvement. Defining “meaningful” is so difficult in these circumstances. I am concerned that we are opening the door to something that could be avoided. Even then, the terms of the new clause—we will have a clause stand part debate on Wednesday, obviously—put all the onus on the data subject, whereas that was not the case previously under Article 22. The Minister has not really explained why that change has been made.
I conclude by saying that I very much support Amendment 41. This whole suite of amendments is well drafted. The point about the Equality Act is extremely well made. The noble Lord, Lord Holmes, also has a very good amendment here. It seems to me that involving the ICO right in the middle of this will be absolutely crucial—and we are back to public trust again. If nothing else, I would like explicitly to include that under Clause 14 in relation to Article 22 by the time this Bill goes through.
Can the Minister give me an indication of the level at which that kicks in? For example, say there is a child in a classroom and a decision has been made about their ability in a particular subject. Is it automatic that the parent and the child get some sort of read-out on that? I would be curious to know where the Government feel that possibility starts.
In that example, where a child was subject to a solely ADM decision, the school would be required to inform the child of the decision and the reasons behind it. The child and their parent would have the right to seek a human review of the decision.
We may come on to this when we get to edtech but a lot of those decisions are happening automatically right now, without any kind of review. I am curious as to why it is on the school whereas the person actually doing the processing may well be a technology company.
(1 year, 6 months ago)
Lords Chamber
My Lords, I too congratulate the noble Lord, Lord Holmes, on his wonderful speech. I declare my interests as an adviser to the Oxford Institute for Ethics in AI and the UN Secretary-General’s AI Advisory Body.
When I read the Bill, I asked myself three questions. Do we need an AI regulation Bill? Is this the Bill we need? What happens if we do not have a Bill? It is arguable that it would be better to deal with AI sector by sector—in education, the delivery of public services, defence, media, justice and so on—but that would require an enormous legislative push. Like others, I note that we are in the middle of a legislative push, with digital markets legislation, media legislation, data protection legislation and online harms legislation, all of which resolutely ignore both existing and future risk.
The taxpayer has been asked to make a £100 million investment in launching the world’s first AI safety institute, but as the Ada Lovelace Institute says:
“We are concerned that the Government’s approach to AI regulation is ‘all eyes, no hands’”,
with plenty of “horizon scanning” but no
“powers and resources to prevent those risks or even to react to them effectively after the fact”.
So yes, we need an AI regulation Bill.
Is this the Bill we need? Perhaps I should say to the House that I am a fan of the Bill. It covers testing and sandboxes, it considers what the public want, and it deals with a very important specific issue that I have raised a number of times in the House, in the form of creating AI-responsible officers. On that point, the CEO of the International Association of Privacy Professionals came to see me recently and made an enormously compelling case that, globally, we need hundreds of thousands of AI professionals, as the systems become smarter and more ubiquitous, and that those professionals will need standards and norms within which to work. He also made the case that the UK would be very well-placed to create those professionals at scale.
I have a couple of additions. Unless the Minister is going to make a surprise announcement, I think we are allowed to consider that he is going to take the Bill on in full. In addition, under Clause 2, which sets out regulatory principles, I would like to see consideration of children’s rights and development needs; employment rights, concerning both management by AI and job displacement; a public interest case; and more clarity that material that is an offence—such as creating viruses, CSAM or inciting violence—is also an offence, whether created by AI or not, with specific responsibilities that accrue to users, developers and distributors.
The Stanford Internet Observatory recently identified hundreds of known images of child sexual abuse material in an open dataset used to train popular AI text-to-image models, saying:
“It is challenging to clean or stop the distribution of publicly distributed datasets as it has been widely disseminated. Future datasets could use freely available detection tools to prevent the collection of known CSAM”.
The report illustrates that it is entirely possible to remove such images, but that the dataset’s creators did not bother, and now those images are proliferating at scale.
We need to have rules upon which AI is developed. It is poised to transform healthcare, both diagnosis and treatment. It will take the weight out of some of the public services we can no longer afford, and it will release money to make life better for many. However, it brings forward a range of dangers, from fake images to lethal autonomous weapons and deliberate pandemics. AI is not a case of good or bad; it is a question of uses and abuses.
I recently hosted Geoffrey Hinton, whom many will know as the “godfather of AI”. His address to parliamentarians was as chilling as it was compelling, and he put timescales on the outcomes that leave no time to wait. I will not stray into his points about the nature of human intelligence, but he was utterly clear that the concentration of power, the asymmetry of benefit and the control over resources—energy, water and hardware—needed to run these powerful systems would be, if left until later, in so few hands that they, and not we, would be doing the rule setting.
My final question is: if we have no AI Bill, can the Government please consider putting the content of the AI regulation Bill into the data Bill currently passing through Parliament and deal with it in that way?
(1 year, 6 months ago)
Grand Committee
My Lords, I speak to Amendments 2, 3, 9 and 290 in my name. I thank the noble Baronesses, Lady Jones and Lady Harding, and the noble Lord, Lord Clement-Jones, for their support.
This group seeks to secure the principle that children should enjoy the same protections in UK law after this Bill passes into law as they do now. In 2018, this House played a critical role in codifying the principle that children merit special, specific protection in relation to data privacy by introducing the age-appropriate design code into the DPA. Its introduction created a wave of design changes to tech products: Google introduced safe search as its default; Instagram made it harder for adults to contact children via private messaging; Play Store stopped making adult apps available to under-18s; and TikTok stopped sending notifications through the night and hundreds of thousands of underage children were denied access to age-inappropriate services. These are just a handful of the hundreds of changes that have been made, many of them rolled out globally. The AADC served as a blueprint for children’s data privacy, and its provisions have been mirrored around the globe. Many noble Lords will have noticed that, only two weeks ago, Australia announced that it is going to follow the many others who have incorporated or are currently incorporating it into their domestic legislation, saying in the press release that it would align as closely as possible with the UK’s AADC.
As constructed in the Data Protection Act 2018, the AADC sets out the requirements of the UK GDPR as they relate to children. The code is indirectly enforceable; that is to say that the action the ICO can take against those failing to comply is based on the underlying provisions of UK GDPR, which means that any watering down, softening of provisions, unstable definitions—my new favourite—or legal uncertainty created by the Bill automatically waters down, softens and creates legal uncertainty and unstable definitions for children and therefore for child protection. I use the phrase “child protection” deliberately because the most important contribution that the AADC has made at the global level was the understanding that online privacy and safety are interwoven.
Clause 1(2) creates an obligation on the controller or processor to know, or reasonably to know, that an individual is an identifiable living individual. Amendments 2 and 3 would add a further requirement to consider whether that living individual is a child. This would ensure that providers cannot wilfully ignore the presence of children, something that tech companies have a long track record of doing. I want to quote the UK Information Commissioner, who fined TikTok £12.7 million for failing to prevent under-13s accessing that service; he said:
“There are laws in place to make sure our children are as safe in the digital world as they are in the physical world. TikTok did not abide by those laws … TikTok should have known better. TikTok should have done better … They did not do enough to check who was using their platform”.
I underline very clearly that these amendments would not introduce any requirement for age assurance. The ICO’s guidance on age assurance in the AADC and the provisions in the Online Safety Act already detail those requirements. The amendments simply confirm the need to offer a child a high bar of data privacy or, if you do not know which of your users are children, offer all users that same high bar of data privacy.
As we have just heard, it is His Majesty’s Government’s stated position that nothing in the Bill lessens children’s data privacy because nothing in the Bill lessens UK GDPR, and that the Bill is merely an exercise to reduce unnecessary bureaucracy. The noble Lords who spoke on the first group have perhaps put paid to that and I imagine that this position will be sorely tested during Committee. In the light of the alternative view that the protections afforded to children’s personal data will decline as a result of the Bill, Amendment 9 proposes that the status of children’s personal data be elevated to that of “sensitive personal data”, or special category data. The threshold for processing special category data is higher than for general personal data and the specific conditions include, for example, processing with the express consent of the data subject, processing to pursue a vital interest, processing by not-for-profits or processing for legal claims or matters of substantial public interest. Bringing children’s personal data within that definition would elevate the protections by creating an additional threshold for processing.
Finally, Amendment 290 enshrines the principle that nothing in the Bill should lead to a diminution in existing levels of privacy protections that children currently enjoy. It is essentially a codification of the commitment made by the Minister in the other place:
“The Bill maintains the high standards of data protection that our citizens expect and organisations will still have to abide by our age-appropriate design code”.—[Official Report, Commons, 17/4/23; col. 101.]
Before I sit down, I just want to highlight the Harvard Gazette, which looked at ad revenue from the perspective of children. On Instagram, children account for 16% of ad revenue; on YouTube, 27%; on TikTok, 35%; and on Snap, an extraordinary 41.4%. Collectively, YouTube, Instagram and Facebook made nearly $2 billion from children aged nought to 12, and it will not escape many noble Lords that children aged nought to 12 are not supposed to be on those platforms. Instagram, YouTube and TikTok together made more than $7 billion from 13 to 17 year-olds. The amendments in this group give a modicum of protection to a demographic who have no electoral capital, who are not developmentally adult and whose lack of care is not an unfortunate by-product of the business model, but who have their data routinely extracted, sold, shared and scraped as a significant part of the ad market. It is this that determines the features that deliberately spread, polarise and keep children compulsively online, and it is this that the AADC—born in your Lordships’ House—started a global movement to contain.
This House came together on an extraordinary cross-party basis to ensure that the Online Safety Bill delivered for children, so I say to the Minister: I am not wedded to my drafting, nor to the approach that I have taken to maintain, clause by clause, the bar for children, even when that bar is changed for adults, but I am wedded to holding the tech sector accountable for children’s privacy, safety and well-being. It is my hope and—if I dare—expectation that noble Lords will join me in making sure that the DPDI Bill does not leave this House with a single diminution of data protection for children. To do so is, in effect, to give with one hand and take away with the other.
I hope that during Committee the Minister will come to accept that children’s privacy will be undermined by the Bill, and that he will work with me and others to resolve these issues so that the UK maintains its place as a global leader in children’s privacy and safety. I beg to move.
Okay. The Government feel that, in terms of the efficient and effective drafting of the Bill, that paragraph diminishes the clarity by being duplicative rather than adding to it by making a declaration. For the same reason, we have chosen not to make a series of declarations about other intentions of the Bill overall in the belief that the Bill’s intent and outcome are protected without such a statement.
My Lords, before our break, the noble Baroness, Lady Harding, said that this is hard-fought ground; I hope the Minister understands from the number of questions he has just received during his response that it will continue to be hard-fought ground.
I really regret having to say this at such an early stage on the Bill, but I think that some of what the Minister said was quite disingenuous. We will get to it in other parts of the Bill, but the thing that we have all agreed to disagree on at this point is the statement that the Bill maintains data privacy for everyone in the UK. That is a point of contention between noble Lords and the Minister. I absolutely accept and understand that we will come to a collective view on it in Committee. However, the Minister appeared to suggest—I ask him to correct me if I have got this wrong—that the changes on legitimate interest and purpose limitation are child safety measures because some people are saying that they are deterred from sharing data for child protection reasons. I have to tell him that they are not couched or formed like that; they are general-purpose shifts. There is absolutely no question but that the Government could have made specific changes for child protection, put them in the Bill and made them absolutely clear. I find that very worrying.
I also find it worrying, I am afraid—this is perhaps where we are heading and the thing that many organisations are worried about—that bundling the AADC in with the Online Safety Act and saying, “I’ve got it over here so you don’t need it over there” is not the same as maintaining a high level of data protection for children. It is not the same set of things. I specifically said that this was not an age-verification measure and would not require it; whatever response there was on that was therefore unnecessary because I made that quite clear in my remarks. The Committee can understand that, in order to set a high bar of data protection, you must either identify a child or give it to everyone. Those are your choices. You do not have to verify.
I will withdraw the amendment, but I must say that the Government may not have it both ways. The Bill cannot be different or necessary and at the same time do nothing. The piece that I want to leave with the Committee is that it is the underlying provisions that allow the ICO to take action on the age-appropriate design code. It does not matter what is in the code; if the underlying provisions change, so does the code. During Committee, I expect that there will be a report on the changes that have happened all around the world as a result of the code, and we will be able to measure whether the new Bill would be able to create those same changes. With that, I beg leave to withdraw my amendment.
My Lords, I speak to Amendments 8, 21, 23 and 145 in my name and thank the other noble Lords who have added their names to them. In the interests of brevity, and as the noble Lord, Lord Clement-Jones, has done some of the heavy lifting on this, I will talk first to Amendment 8.
The definition of scientific research has been expanded to include commercial and non-commercial activity, so far as it
“can reasonably be described as scientific”,
but “scientific” is not defined. As the noble Lord said, there is no public interest requirement, so a commercial company can, in reality, develop almost any kind of product on the basis that it may have a scientific purpose, even—or maybe especially—if it measures your propensity to impulse buy or other commercial things. The spectrum of scientific inquiry is almost infinite. Amendment 8 would exclude children simply by adding proposed new paragraph (e), which says that
“the data subject is not a child or could or should be known to be a child”,
so that their personal data cannot be used for scientific research purposes to which they have not given their consent.
I want to be clear that I am pro-research and understand the critical role that data plays in enabling us to understand societal challenges and innovate towards solutions. Indeed, I have signed the amendment in the name of the noble Lord, Lord Bethell, which would guarantee access to data for academic researchers working on matters of public interest. Some noble Lords may have been here last night, when the US Surgeon- General Vice Admiral Dr Murthy, who gave the Lord Speaker’s lecture, made a fierce argument in favour of independent public interest research, not knowing that such a proposal has been laid. I hope that, when we come to group 17, the Government heed his wise words.
In the meantime, Clause 3 simply embeds the inequality of arms between academics and corporates and extends it, making it much easier for commercial companies to use personal data for research while academics continue to be held to much higher ethical and professional standards. They continue to require express consent, DBS checks and complex ethical requirements. For academics, simply using personal data for research without such safeguards is unethical, yet commercial players can rely on Clause 3 to process data without consent, in pursuit of profit. Like the noble Lord, Lord Clement-Jones, I would prefer an overall solution to this but, in its absence, this amendment would protect data from being commoditised in this way.
Amendments 21 and 23 would specifically protect children from changes to Clause 6. I have spoken on this a little already, but I would like it on the record that I am absolutely in favour of a safeguarding exemption. The additional purposes, which are compatible with but go beyond the original purpose, are not a safeguarding measure. Amendment 21 would amend the list of factors that a data controller must take into account to include the fact that children are entitled to a higher standard of protection.
Amendment 23 would not be necessary if Amendment 22 were agreed. It would commit the Secretary of State to ensuring that, when exercising their power under new Article 8A, as inserted by Clause 6(5), to add, vary or omit provisions of Annex 2, they take the 2018 Act and children’s data protection into account.
Finally, Amendment 145 proposes a code of practice on the use of children’s data in scientific research. This code would, in contrast, ensure that all researchers, commercial or in the public interest, are held to the same high standards by developing detailed guidance on the use of children’s data for research purposes. A burning question for researchers is how to properly research children’s experience, particularly regarding the harms defined by the Online Safety Act.
Proposed new subsection (1) sets out the broad headings that the ICO must cover to promote good practice. Proposed new subsection (2) confirms that the ICO must have regard to children’s rights under the UNCRC, and that they are entitled to a higher standard of protection. It would also ensure that the ICO consulted with academics, those who represent the interests of children and data scientists. There is something of a theme here: if the changes to UK GDPR did not diminish data subjects’ privacy and rights, there would be no need for amendments in this group. If there were a code for independent public research, as is so sorely needed, the substance of Amendment 145 could usefully form part of it. If commercial companies can extend scientific research that has no definition, and if the Bill expands the right to further processing and the Secretary of State can unilaterally change the basis for onward processing, can the Minister explain, when he responds, how he can claim that the Bill maintains protections for children?
My Lords, I will be brief because I associate myself with everything that the noble Baroness, Lady Kidron, just said. This is where the rubber hits the road from our previous group. If we all believe that it is important to maintain children’s protection, I hope that my noble friend the Minister will be able to accept if not the exact wording of the children-specific amendments in this group then the direction of travel—and I hope that he will commit to coming back and working with us to make sure that we can get wording into the Bill.
I am hugely in favour of research in the private sector as well as in universities and the public sector; we should not close our minds to that at all. We need to be realistic that all the meaningful research in AI is currently happening in the private sector, so I do not want to close that door at all, but I am extremely uncomfortable with a Secretary of State having the ability to amend access to personal data for children in this context. It is entirely sensible to have a defined code of conduct for the use of children’s data in research. We have real evidence that a code of conduct setting out how to protect children’s rights and data in this space works, so I do not understand why it would not be a good idea to do research if we want the research to happen but we want children’s rights to be protected at a much higher level.
It seems to me that this group is self-evidently sensible, in particular Amendments 8, 22, 23 and 145. I put my name to all of them except Amendment 22 but, the more I look at the Bill, the more uncomfortable I get with it; I wish I had put my name to Amendment 22. We have discussed Secretary of State powers in each of the digital Bills that we have looked at and we know about the power that big tech has to lobby. It is not fair on Secretaries of State in future to have this ability to amend—it is extremely dangerous. I express my support for Amendment 22.
Researchers must also comply with the required safeguards to protect individuals’ privacy. All organisations conducting scientific research, including those with commercial interests, must also meet all the safeguards for research laid out in the UK GDPR and comply with the legislation’s core principles, such as fairness and transparency. Clause 26 sets out several safeguards that research organisations must comply with when processing personal data for research purposes. The ICO will update its non-statutory guidance to reflect many of the changes introduced by this Bill.
Scientific research currently holds a privileged place in the data protection framework because, by its nature, it is already viewed as generally being in the public interest. As has been observed, the Bill already applies a public interest test to processing for the purpose of public health studies in order to provide greater assurance for research that is particularly sensitive. Again, this reflects recital 159.
In response to the noble Baroness, Lady Jones, on why public health research is being singled out, as she stated, this part of the legislation simply adds an additional safeguard to studies into public health, ensuring that they must be in the public interest. This does not limit the scope for other research unrelated to public health. Studies in the area of public health will usually be in the public interest. For the rare, exceptional times that a study is not, this requirement provides an additional safeguard to help prevent misuse of the various exemptions and privileges for researchers in the UK GDPR. “Public interest” is not defined in the legislation, so the controller needs to make a case-by-case assessment based on its purposes.
On the point made by the noble Lord, Lord Clement-Jones, about recitals and ICO guidance, although we of course respect and welcome ICO guidance, it does not have legislative effect and does not provide the certainty that legislation does. That is why we have set these provisions out in the Bill itself.
Amendment 7 to Clause 3 would undermine the broader consent concept for scientific research. Clause 3 places the existing concept of “broad consent” currently found in recital 33 to the UK GDPR on a statutory footing with the intention of improving awareness and confidence for researchers. This clause applies only to scientific research processing that is reliant on consent. It already contains various safeguards. For example, broad consent can be used only where it is not possible to identify at the outset the full purposes for which personal data might be processed. Additionally, to give individuals greater agency, where possible individuals will have the option to consent to only part of the processing and can withdraw their consent at any time.
Clause 3 clarifies an existing concept of broad consent which outlines how the conditions for consent will be met in certain circumstances when processing for scientific research purposes. This will enable consent to be obtained for an area of scientific research when researchers cannot at the outset identify fully the purposes for which they are collecting the data. For example, the initial aim may be the study of cancer, but it later becomes the study of a particular cancer type.
Furthermore, as part of the reforms around the reuse of personal data, we have further clarified that when personal data is originally collected on the basis of consent, a controller would need to get fresh consent to reuse that data for a new purpose unless a public interest exemption applies and it is unreasonable to expect the controller to obtain that consent. A controller cannot generally reuse personal data originally collected on the basis of consent for research purposes.
Turning to Amendments 132 and 133 to Clause 26, the general rule described in Article 13(3) of the UK GDPR is that controllers must inform data subjects about a change of purposes, which provides an opportunity to withdraw consent or object to the proposed processing where relevant. There are existing exceptions to the right to object, such as Article 21(6) of the UK GDPR, where processing is necessary for research in the public interest, and in Schedule 2 to the Data Protection Act 2018, when applying the right would prevent or seriously impair the research. Removing these exemptions could undermine life-saving research and compromise long-term studies so that they are not able to continue.
Regarding Amendment 134, new Article 84B of the UK GDPR already sets out the requirement that personal data should be anonymised for research, archiving and statistical—RAS—purposes unless doing so would mean the research could not be carried through. Anonymisation is not always possible as personal data can be at the heart of valuable research, archiving and statistical activities, for example, in genetic research for the monitoring of new treatments of diseases. That is why new Article 84C of the UK GDPR also sets out protective measures for personal data that is used for RAS purposes, such as ensuring respect for the principle of data minimisation through pseudonymisation.
The stand part notice in this group seeks to remove Clause 6 and, consequentially, Schedule 2. In the Government’s consultation on data reform, Data: A New Direction, we heard that the current provisions in the UK GDPR on personal data reuse are difficult for controllers and individuals to navigate. This has led to uncertainty about when controllers can reuse personal data, causing delays for researchers and obstructing innovation. Clause 6 and Schedule 2 address the existing uncertainty around reusing personal data by setting out clearly the conditions in which the reuse of personal data for a new purpose is permitted. Clause 6 and Schedule 2 must therefore remain to give controllers legal certainty and individuals greater transparency.
Amendment 22 seeks to remove the power to add to or vary the conditions set out in Schedule 2. These conditions currently constitute a list of specific public interest purposes, such as safeguarding vulnerable individuals, for which an organisation is permitted to reuse data without needing consent or to identify a specific law elsewhere in legislation. Since this list is strictly limited and exhaustive, a power is needed to ensure that it is kept up to date with future developments in how personal data is used for important public interest purposes.
I am interested that the safeguarding requirement is already in the Bill, so, in terms of children, which I believe the Minister is going to come to, the onward processing is not a question of safeguarding. Is that correct? As the Minister has just indicated, that is already a provision.
Just before we broke, I was on the verge of attempting to answer the question from the noble Baroness, Lady Kidron; I hope my coming words will do that, but she can intervene again if she needs to.
I turn to the amendments that concern the use of children’s data in research and reuse. Amendment 8 would also amend Clause 3; the noble Baroness suggests that the measure should not apply to children’s data, but this would potentially prevent children, or their parents or guardians, from agreeing to participate in broad areas of pioneering research that could have a positive impact on children, such as on the causes of childhood diseases.
On the point about safeguarding, the provisions on recognised legitimate interests and further processing are required for safeguarding children for compliance with, respectively, the lawfulness and purpose limitation principles. The purpose limitation provision in this clause is meant for situations where the original processing purpose was not safeguarding and the controller then realises that there is a need to further process it for safeguarding.
Research organisations are already required to comply with the data protection principles, including on fairness and transparency, so that research participants can make informed decisions about how their data is used; and, where consent is the lawful basis for processing, children, or their parents or guardians, are free to choose not to provide their consent, or, if they do consent, they can withdraw it at any time. In addition, the further safeguards that are set out in Clause 26, which I mentioned earlier, will protect all personal data, whether it relates to children or adults.
Amendment 21 would require data controllers to have specific regard to the fact that children’s data requires a higher standard of protection when deciding whether reuse of their data is compatible with the original purpose for which it was collected. This is unnecessary because the situations in which personal data could be reused are limited to public interest purposes designed largely to protect the public and children, in so far as they are relevant to them. Controllers must also consider the possible consequences for data subjects and the relationship between the controller and the data subject. This includes taking into account that the data subject is a child, in addition to the need to generally consider the interests of children.
Amendment 23 seeks to limit use of the purpose limitation exemptions in Schedule 2 in relation to children’s data. This amendment is unnecessary because these provisions permit further processing only in a narrow range of circumstances and can be expanded only to serve important purposes of public interest. Furthermore, it may inadvertently be harmful to children. Current objectives include safeguarding children or vulnerable people, preventing crime or responding to emergencies. In seeking to limit the use of these provisions, there is a risk that the noble Baroness’s amendments might make data controllers more hesitant to reuse or disclose data for public interest purposes and undermine provisions in place to protect children. These amendments could also obstruct important research that could have a demonstrable positive impact on children, such as research into children’s diseases.
Amendment 145 would require the ICO to publish a statutory code on the use of children’s data in scientific research and technology development. Although the Government recognise the value that ICO codes can play in promoting good practice and improving compliance, we do not consider that it would be appropriate to add these provisions to the Bill without further detailed consultation with the ICO and the organisations likely to be affected by the new codes. Clause 33 of the Bill already includes a measure that would allow the Secretary of State to request the ICO to publish a code on any matter that it sees fit, so this is an issue that we could return to in the future if the evidence supports it.
I will read Hansard very carefully, because I am not sure that I absolutely followed the Minister, but we will undoubtedly come back to this. I will ask two questions. Earlier, before we had a break, in response to some of the early amendments in the name of the noble Lord, Lord Clement-Jones, the Minister suggested that several things were being taken out of the recital to give them solidity in the Bill; so I am using this opportunity to suggest that recital 38, which is the special consideration of children’s data, might usefully be treated in a similar way and that we could then have a schedule that is the age-appropriate design code in the Bill. Perhaps I can leave that with the Minister, and perhaps he can undertake to have some further consultation with the ICO on Amendment 145 specifically.
With respect to recital 38, that sounds like a really interesting idea. Yes, let us both have a look and see what the consultation involves and what the timing might look like. I confess to the Committee that I do not know what recital 38 says, off the top of my head. For the reasons I have set out, I am not able to accept these amendments. I hope that noble Lords will therefore not press them.
Returning to the questions by the noble Lord, Lord Clement-Jones, on the contents of recital 159, the current UK GDPR and EU GDPR are silent on the specific definition of scientific research. This does not preclude commercial organisations from performing scientific research; indeed, the ICO’s own guidance on research and its interpretation of recital 159 already mention commercial activities. Scientific research can be done by commercial organisations—for example, much of the research done into vaccines, and the research into AI referenced by the noble Baroness, Lady Harding. The recital itself does not mention it but, as the ICO’s guidance is clear on this already, the Government feel that it is appropriate to put this on a statutory footing.
My Lords, I hope this is another lightbulb moment, as the noble Lord, Lord Clement-Jones, suggested. As well as Amendment 10, I will speak to Amendments 35, 147 and 148 in my name and the names of the noble Baroness, Lady Jones, and the noble Lord, Lord Clement-Jones. I thank them both. The purpose of these amendments is to move the Bill away from nibbling around the edges of GDPR in pursuit of post-Brexit opportunities and to actually deliver a post-Brexit opportunity.
These amendments would put the UK on an enhanced path of data sophistication while not challenging equivalence, which we will undoubtedly discuss during the Committee. I echo the voice of the noble Lord, Lord Allan, who at Second Reading expressed deep concern that equivalence was not a question of an arrangement between the Government and the EU but would be a question picked up by data activists taking strategic litigation to the courts.
Data protection as conceived by GDPR and in this Bill is primarily seen as an arrangement between an individual and an entity that processes that data—most often a commercial company. But, as evidenced by the last 20 years, the real power lies in holding either vast swathes of general data, such as those used by LLMs, or large groups of specialist data such as medical scans. In short, the value—in all forms, not simply financial—lies in big data.
As the value of data became clear, ideas such as “data is the new oil” and data as currency emerged, alongside the notion of data fiduciaries or data trusts, where you can place your data collectively. One early proponent of such ideas was Jaron Lanier, inventor of virtual reality; I remember discussing it with him more than a decade ago. However, these ideas have not found widespread practical application, possibly because they are normally based around ideas of micropayments as the primary value—and very probably because they rely on data subjects gathering their data, so they are for the boffins.
During the passage of the DPA 2018, one noble Lord counted the number of times the Minister said the words “complex” and “complicated” while referring to the Bill. Data law is complex, and the complicated waterfall of its concepts and provisions eludes most non-experts. That is why I propose the four amendments in this group, which would give UK citizens access to data experts for matters that concern them deeply.
Amendment 10 would define the term “data community”, and Amendment 35 would give a data subject the power to assign their data rights to a data community for specific purposes and for a specific time period. Amendment 147 would require the ICO to set out a code of conduct for data communities, including guidance on establishing, operating and joining a data community, as well as guidance for data controllers and data processors on responding to requests made by data communities. Amendment 148 would require the ICO to keep a register of data communities, to make it publicly available and to ensure proper oversight. Together, they would provide a mechanism for non-experts—that is, any UK citizen—to assign their data rights to a community run by representatives that would benefit the entire group.
Data communities diverge from previous attempts to create big data for the benefit of users, in that they are not predicated on financial payments and neither does each data subject need to access their own data via the complex rules and often obstructive interactions with individual companies. They put rights holders together with experts who do it on their behalf, by allowing data subjects to assign their rights so that an expert can gather the data and crunch it.
This concept is based on a piece of work done by a colleague of mine at the University of Oxford, Dr Reuben Binns, an associate professor in human-centred computing, in association with the Worker Info Exchange. Since 2016, individual Uber drivers, with help from their trade unions and the WIE, have asked Uber for the data showing their jobs, earnings, movements, waiting times and so on. It took many months of negotiation, conducted via data protection lawyers, as each driver individually asked for successive pieces of information that Uber, at first, resisted giving them and then, after litigation, provided.
After a period of time, a new cohort of drivers was recruited, and it was only when several hundred drivers were poised to ask the same set of questions that a formal arrangement was made between Uber and WIE, so that they could be treated as a single group and all the data would be provided about all the drivers. This practical decision allowed Dr Binns to look at the data en masse. While an individual driver knew what they earned and where they were, what became visible when looking across several hundred drivers is how the algorithm reacted to those who refused a poorly paid job, who was assigned the lucrative airport runs, whether where you started impacted on your daily earnings, whether those who worked short hours were given less lucrative jobs, and so on.
This research project continues after several years and benefits from a bespoke arrangement that could, by means of these amendments, be strengthened and made an industry-wide standard with the involvement of the ICO. If it were routine, it would provide opportunity equally for challenger businesses, community groups and research projects. Imagine if a group of elderly people who spend a lot of time at home were able to use a data community to negotiate cheap group insurance, or imagine a research project where I might assign my data rights for the sole purpose of looking at gender inequality. A data community would allow any group of people to assign their rights, rights that are more powerful together than apart. This is doable—I have explained how it has been done. With these amendments, it would be routinely available, contractual, time-limited and subject to a code of conduct.
As it stands, the Bill is regressive for personal data rights and does not deliver the promised Brexit dividends. But there are great possibilities, without threatening adequacy, that could open markets, support innovation in the UK and make data more available to groups in society that rarely benefit from data law. I beg to move.
My Lords, I think this is a lightbulb moment—it is inspired, and this suite of amendments fits together really well. I entirely agree with the noble Baroness, Lady Kidron, that this is a positive aspect. If the Bill contained these four amendments, I might have to alter my opinion of it—how about that for an incentive?
This is an important subject. It is a positive aspect of data rights. We have not got this right yet in this country. We still have great suspicion about sharing and access to personal data. There is almost a conspiracy theory around the use of data, the use of external contractors in the health service and so on, which is extremely unhelpful. If individuals were able to share their data with a trusted hub—a trusted community—that would make all the difference.
Like the noble Baroness, Lady Kidron, I have come across a number of influences over the years. I think the first time many of us came across the idea of data trusts or data institutions was in the Hall-Pesenti review carried out by Dame Wendy Hall and Jérôme Pesenti in 2017. They made a strong recommendation to the Government that they should start thinking about how to operationalise data trusts. Subsequently, organisations such as the Open Data Institute did some valuable research into how data trusts and data institutions could be used in a variety of ways, including in local government. Then the Ada Lovelace Institute did some very good work on the possible legal basis for data trusts and data institutions. Professor Irene Ng was heavily engaged in setting up what was called the “hub of all things”. I was not quite convinced by how it was going to work legally in terms of data sharing and so on, but in a sense we have now got to that point. I give all credit to the academic whom the noble Baroness mentioned. If he has helped us to get to this point, that is helpful. It is not that complicated, but we need full government backing for the ICO and the instruments that the noble Baroness put in her amendments, including regulatory oversight, because it will not be enough simply to have codes that apply. We have to have regulatory oversight.
My Lords, I thank the co-signatories of my amendments for their enthusiasm. I will make three very quick points. First, the certain rights that the Minister referred to are complaints after the event when something has gone wrong, not positive rights. The second point of contention I have is whether these are so far-reaching. We are talking about people’s existing rights, and these amendments do not introduce any other right apart from access to put them together. It is very worrying that the Government would see these as a threat when data subjects put together their rights but not when commercial companies put together their data.
Finally, what is the Bill for? If it is not for creating a new and vibrant data protection system for the UK, I am concerned that it undermines a lot of existing rights and will not allow for a flourishing of uses of data. This is the new world: the world of data and AI. We have to have something to offer UK citizens. I would like the Minister to say that he will discuss this further, because it is not quite adequate to nay-say it. I beg leave to withdraw.
(1 year, 8 months ago)
Grand Committee

My Lords, I have had a number of arguments about “proportionate” in the decade that I have been in this House. In fact, I remember that the very first time I walked into the Chamber the noble Lord, Lord Pannick, was having a serious argument with another noble Lord over a particular word. It went on for about 40 minutes and I remember thinking, “There is no place for me in this House”. Ten years later, I stand to talk about “proportionate”, which has played such a big part in my time here in the Lords.
During the passage of the DPA 2018, many of us tried to get “proportionate” into the Bill on the basis that we were trying to give comfort to people who thought data protection was in fact government surveillance of individuals. The Government said—quite rightly, as it turned out—that all regulators have to be
“proportionate, accountable, consistent, transparent, and targeted”
in the way in which they discharge their responsibilities and they pushed us back. The same thing happened on the age-appropriate design code with the ICO, and the same point was made again. As the noble Baroness, Lady Harding, just set out, we tried once more during the passage of the Online Safety Bill. Yet this morning I read this sentence in some draft consultation documents coming out of the Online Safety Act:
“Provisionally, we consider that a measure recommending that users that share CSAM”—
that is, for the uninitiated, child sexual abuse material—
“have their accounts blocked may be proportionate, given the severity of the harm. We need to do more work to develop the detail of any such measure and therefore aim to consult on it”.
This is a way in which “proportionate” has been weaponised in favour of the tech companies in one environment and it is what I am concerned about here.
As the noble Lord said, using “proportionate” introduces a gap in which uncertainty can be created, because some things are beyond question and must be considered, rather than considered on a proportionate basis. I finish by saying that associating the word specifically with conduct requirements or pro-competitive interventions must create legal uncertainty if a regulator can pick up that word, set it against something so absolute and illegal, and then have to discuss its proportionality.
I wonder if I can just slip in before Members on the Front Bench speak, particularly those who have signed the amendment. I refer again to my register of interests.
I support the principle that lies behind these amendments and want to reinforce the point that I made at Second Reading and that I sort of made on the first day in Committee. Any stray word in the Bill when enacted will be used by those with the deepest pockets—that is, the platforms—to hold up action against them by the regulator. I read this morning that the CMA has resumed its inquiry into the UK cloud market after an eight-month hiatus based on a legal argument put by Apple about the nature of the investigation.
It seems to me that Clause 19(5) is there to show the parameters on which the CMA can impose an obligation to do with fair dealing and open choices, and so on. It therefore seems that “proportionate”—or indeed perhaps even “appropriate”—is unnecessary because the CMA will be subject to judicial review on common-law principles if it makes an irrational or excessive decision and it may be subject to a legal appeal if people can argue that it has not applied the remedy within the parameters set by paragraphs (a), (b) and (c) of Clause 19(5). I am particularly concerned about whether there is anything in the Bill once enacted that allows either some uncertainty, which can be latched on to, or appeals—people refer to “judicial review plus” or appeals on the full merits, which are far more time-consuming and expensive and which will tie the regulator up in knots.
If “indispensable” and purely “benefit” are the same, why was the change made on Report in the Commons?
I was really interested in the introduction of the word “unknown”. The noble Lord, Lord Lansley, set out all the different stages and interactions. Does it not incentivise the companies to call back information to this very last stage, and the whole need-for-speed issue then comes into play?
I will revert first to the questions about the word “indispensable”. As I have said, the Government consulted very widely, and one of the findings of the consultation was that, for a variety of stakeholders, the word “indispensable” reduced the clarity of the legislation.
I cannot give a full account of the individual stakeholders right now; I am happy to ask the department to clarify further in that area. My contention is that the effect of the two sentences is the same, with the new one being clearer than the old one. I am very happy to continue to look at that and listen to the arguments of noble Lords, but that is the position. Personally, when I look at the two sentences, I find it very difficult to discern any difference in meaning between them. As I say, I am very happy to receive further arguments on that.
With respect to the participative arrangements by which a decision is reached around, for example, a conduct requirement, during the period of conduct requirement design, and during the decision-making period, it is, as my noble friend Lord Lansley has stated, highly to be expected that firms will make representations about the consumer benefits of their product. During a breach investigation, on the other hand, later on in the process, a consumer benefits exemption can be used as a safeguard or defence against a finding of breach.
Sorry, but there were so many questions that I have completely lost track. Perhaps the noble Baroness, Lady Kidron, will restate her question.
I think the Minister was in the middle of answering it and saying why something might be “unknown” right at the last.
As many noble Lords in the debate have alluded to, we have to be clear that this is a fast-moving field, and we have to at least allow for the possibility that new technologies can provide new consumer benefits and that it is okay to argue that a new and emerging technology that was not part of the original consideration can be considered as part of the defence against a finding of breach. The current drafting is intended to be clearer, aiming to provide greater certainty to all businesses while ensuring that consumers continue to get the best outcomes.
Amendment 41, from the noble Lord, Lord Clement-Jones, would change the current drafting of the countervailing benefits exemption in several ways that together are intended to ensure that the CMA is provided as soon as possible with information relating to an SMS firm’s intention to rely on the exemption. We agree with noble Lords who have spoken today that it is important that the exemption cannot be used to avoid or delay enforcement action. The conduct investigation will operate in parallel to the assessment of whether the exemption applies, meaning that the investigation deadline of six months is not affected by the exemption process. The regime has been designed to encourage an open dialogue between the CMA and SMS firms, helping to avoid delays, unintended consequences and surprises on all sides. Therefore, in many cases, if a firm intends to rely on the exemption, we anticipate that this will be clear to all parties from early on in the process.
(1 year, 8 months ago)
Grand Committee

My Lords, I too faced a glitch, having wanted to add my name to these amendments. Since we are at a new stage of the Bill, I declare my interests as set out in the register, particularly as an adviser to the Institute for Ethics in AI at Oxford and to the Digital Futures centre at the LSE and as chair of the 5Rights Foundation. I support the noble Lord, Lord Clement-Jones, who has, with this group of amendments, highlighted that job creation or displacement and the quality of work are all relevant considerations for the CMA. I think it is worth saying that, when we talk about the existential threat of AI, we always have three areas of concern. The first is the veracity and provenance of information; the second is losing control of automated weapons; and the third, importantly in this case, is the many millions of jobs that will be lost, leaving human beings without ways to earn money or, perhaps, a reason for being.
There are two prevailing views on this. One is that of Elon Musk, who, without telling us how we might put food on the table, pronounced to the Prime Minister:
“There will come a point where no job is needed – you can have a job if you want one for personal satisfaction but AI will do everything”.
The other, more optimistic view is that boring or repetitive work will go, which is, in part, beautifully illustrated by David Runciman’s recent book, The Handover, where he details the fate of sports officials. In 2021, Australian and US line judges were replaced by computers, while Wimbledon chose to keep them—largely for aesthetic reasons, because of the lovely Ralph Lauren white against the green grass. Meanwhile, Carl Frey and Michael Osborne, in their much-publicised 2017 study assessing the susceptibility of 702 different jobs to computerisation, suggested that sports officials had a 98% probability of being computerised.
In fact, since 2017, automation has come to all kinds of sports but, as Runciman says,
“Cricket matches, which traditionally featured just two umpires, currently have three to manage the complex demands of the technology, plus a referee to monitor the players’ behaviour”.
Soccer has five, plus large teams of screen watchers needed to interpret—very often badly—replays provided by VAR. The NBA Replay Center in Secaucus employs 25 people in a NASA-like control room, along with a rota of regular match officials.
It would be a fool who would bet that Elon Musk is entirely wrong, but nor should we rely on the fact that all sectors will employ humans to watch over the machines, or even that human beings will find that being the supervisor of a machine, or simply making an aesthetic contribution rather than being a decision-maker, is a good result. It is more likely that the noble Lord, Lord Knight, is correct that the algorithm will indeed be supervising the human beings.
I believe that the noble Lord, Lord Clement-Jones, and his co-author, the noble Lord, Lord Knight, may well prove to be very prescient in introducing this group of amendments that thoughtfully suggest at every stage of the Bill that the CMA should take the future of work and the impact of work into account in coming to a decision. As the noble Lord made clear in setting out each amendment, digital work is no longer simply gig work and the concentration in digital markets of behemoth companies has had and will continue to have huge consequences for jobs across supply lines, as well as wages within markets and, most particularly, on terms of employment and access to work.
AI is, without question, the next disruptor. Those companies that own the technology will be dominant across multiple markets, if not every market, and for the CMA to have a mandate to consider the impact on the workforce is more than sensible, more than foresightful; it is in fact a new reality. I note that the Minister, in responding to the last group, mentioned the importance of foreseeable and existing trends: here we have one.
My Lords, I do not actually have much to add to the excellent case that has already been made, but I, too, was at the meeting that the noble Baroness, Lady Jones of Whitchurch, mentioned, and noticed the CMA’s existing relationships.
Quite a lot has been said already, on the first group and just now, about lobbying—not only lobbying in a nasty sense but also the development of relationships that are simply human. I want to make it very clear that those words do not apply to the CMA specifically—but I have worked with many regulators, both here and abroad, and it starts with a feeling that the regulated, not the regulator, holds the information. It goes on to a feeling that the regulated, not the regulator, has the profound understanding of the limits of what is possible. It then progresses to a working relationship in which the regulator, with its limited resources, starts to weigh up what it can win, rather than what it should demand. That results in communities that have actually won legal protections remaining unprotected. It is a sort of triangulation of purpose, in which the regulator’s primary relationship ends up being geared towards government and industry, rather than towards the community that it is constituted to serve.
In that picture, I feel that the amendments in the name of the noble Baroness, Lady Jones of Whitchurch, make it clear, individually and collectively, that at every stage maximum transparency must be observed, and that the incumbents should be prevented from holding all the cards—including by hiding information from the regulator or from other stakeholders who might benefit from it.
I suggest that the amendments do not solve the problem of lobbying or obfuscation, but they incentivise providing information and they give challengers a little bit more of a chance. I am sure we are going to say again and again in Committee that information is power. It is innovation power, political power and market power. I feel passionately that these are technical, housekeeping amendments rather than ones that require any change of government policy.
My Lords, it is a pleasure to follow the noble Baroness, Lady Kidron, whose speech segues straight into my Amendments 14 and 63. This is all about the asymmetry of information. On the one hand, the amendments from the noble Baroness, Lady Jones, which I strongly support and have signed, are about giving information to challengers, whereas my amendments are about extracting information from SMS undertakings.
Failure to respond to a request for information allows SMS players to benefit from the information asymmetry that exists in all technology markets. Frankly, incumbents know much more about how things work than the regulators. They can delay, obfuscate, claim compliance while not fully complying and so on. By contrast, if they cannot proceed unless they have supplied full information, their incentives are changed. They have an incentive to fully inform, if they get a benefit from doing so. That is why merger control works so well and quickly, as the merger is suspended pending provision of full information and competition authority oversight. We saw that with the Activision Blizzard case, where I was extremely supportive of what the CMA did—in many ways, it played a blinder, as was subsequently shown.
We on these Benches consider that a duty to fully inform is needed in the Bill, which is the reason for our Amendments 14 and 63. They insert a new clause in Chapter 2, which provides for a duty to disclose to the CMA
“a relevant digital activity that may give rise to actual or likely detrimental impact on competition in advance of such digital activity’s implementation or effect”
and a related duty in Chapter 6 ensuring that that undertaking
“has an overriding duty to ensure that all information provided to the CMA is full, accurate and complete”.
Under Amendment 14, any SMS undertaking wishing to rely on it must be required to both fully inform and pre-notify the CMA of any conduct that risks breaching one of the Bill’s objectives in Clause 19. This is similar to the tried-and-tested pre-notification process for mergers and avoids the reality that the SMS player may otherwise simply implement changes and ignore the CMA’s requests. A narrow pre-notification system such as this avoids the risks.
We fully support and have signed the amendments tabled by the noble Baroness, Lady Jones. As techUK says, one of the benefits that wider market participants see from the UK’s pro-competition regime is that the CMA will initiate and design remedies based on the evidence it gathers from SMS firms in the wider market. This is one of the main advantages of the UK’s pro-competition regime over the EU DMA. To achieve this, we need to make consultation rights equal for all parties. Under the Bill currently, firms with SMS status, as the noble Baroness, Lady Harding, said, will have far greater consultation rights than those that are detrimentally affected by their anti-competitive behaviour. As she and the noble Lord, Lord Vaizey, said, there are opportunities for SMS firms to comment at the outset but none for challenger firms, which can comment only at a later public consultation stage.
It is very important that there are clear consultation and evidence-gathering requirements for the CMA, which must ensure that it works fairly with SMS firms, challengers, smaller firms and consumers throughout the process, ensuring that the design of conduct requirements applies to SMS firms and pro-competition interventions consider evidence from all sides, allowing interventions to be targeted and capable of delivering effective outcomes. This kind of engagement will be vital to ensuring that the regime can meet its objectives.
We do not believe that addressing this risk requires removing the flexibility given by the Bill. Instead, we believe that it is essential that third parties are given a high degree of transparency and input on deliberation between the CMA and SMS firms. The CMA must also—and I think this touches on something referred to by the noble Baroness, Lady Jones—allow evidence to be submitted in confidence, as well as engage in wider public consultations where appropriate. We very strongly support the amendments.
On the amendments from the noble Lord, Lord Tyrie, it is a bit of a curate’s egg. I support Amendments 12A and 12B because I can see the sense in them. I do not see that we need to have another way of marking the CMA’s homework, however. I am a great believer that we need greater oversight, and we have amendments later in the Bill for proposals to increase parliamentary oversight of what the CMA is doing. However, marking the CMA’s homework at that stage is only going to be an impediment. It will be for the benefit of the SMS undertakings and not necessarily for those who wish to challenge the power of those undertakings. I am only 50% with the noble Lord, rather than the whole hog.
My Lords, all the SMS has to do is put it through one of its large language models, and hey presto.
I am losing track of the conversation because I thought we were asking for more information for the challenger companies, rather than this debate between the SMS and the regulator. Both of them are, I hope, well resourced, but the challenger companies have somehow been left out of this equation and I feel that we are trying to get them into the equation in an appropriate way.
That is not incompatible. These are two sides of the same coin, which is why they are in this group. I suppose we could have degrouped it.
My Lords, I shall also discuss the leveraging or whack-a-mole provisions. Perhaps Conservative Peers today are London buses: this is the fourth London bus to make the same point. I too would have added my name to my noble friend Lord Vaizey’s amendment had I been organised enough.
I shall make a couple of points. The noble Lord, Lord Tyrie, said earlier that we are all here on the Bill because harm has already been done. If noble Lords will forgive me, I will tell a little story. In 2012, I went on a customer trip to Mountain View, Google’s headquarters in California, as the chief executive of TalkTalk. We were in the early days of digital advertising and TalkTalk was one of its biggest customers. A whole group of customers went on what people now call a digital safari to visit California and see these tech companies in action.
I will never forget that the sales director left us for a bit for a demo from some engineers from head office in Mountain View, from Google, who demoed a new functionality they were working on to enable you to easily access price comparisons for flights. It was an interesting demo because some of the other big customers of Google search at the time were independent flight search websites, whose chief executives had been flown out by Google to see all the new innovation. The blood drained from their faces as this very well-meaning engineer described and demoed the new functionality and explained how, because Google controlled the page, it would be able to promote its flight search functionality to the top of the page and demote the companies represented in the room. When the sales director returned, it was, shall we say, quite interesting.
I tell that tale because there are many examples of these platforms leveraging the power of their platform to enter adjacent markets. As my noble friend has said, that gets to the core of the Bill and how important it is that the CMA is able to impose conduct requirements without needing to go through the whole SMS designation process all over again.
I know that the tech firms’ counterargument to this is that it is important that they have the freedom to innovate, and that for a number of them this would somehow create “a regulatory requirement to seek permission to innovate”. I want to counter that: we want all companies in this space to have the freedom to innovate, but they should not have the freedom to prioritise their innovation on their monopoly platform over other people’s innovation. That is why we have to get a definition of the leveraging principle, or the whack-a-mole principle, right. As with almost all the amendments we have discussed today, I am not particularly wedded to the specific wording, but I do not think that the Bill as it is currently drafted captures this clearly enough, and Amendments 25, 26 and 27 get us much closer to where we need to be.
I, too, add my voice in support of my noble friend Lord Lansley’s amendments. I must apologise for not having studied them properly in advance of today, but my noble friend introduced them so eloquently that it is very clear that we need to put data clearly in the Bill.
Finally, as a member of my noble friend’s Communications and Digital Committee, I, too, listened very carefully to the comments made by the noble Lord, Lord Clement-Jones, about copyright. I feel this is a very big issue. Whether this is the right place to address it, I do not know, but I am sure he is right that we need to address it somehow.
My Lords, I am sorry to break the Conservative bus pattern but I, too, will speak to Amendments 26 and 27, to which I have added my name, and to Amendment 30. Before I do, I was very taken by the amendments spoken to by the noble Lord, Lord Lansley, and I support them. I feel somewhat sheepish that I had not seen the relationship between data and the Bill, having spent most of the past few months with my head in the data Bill. That connection is hugely important, and I am very grateful to the noble Lord for making such a clear case. In supporting Amendments 26 and 27, I recognise the value of Amendment 25, tabled by the noble Lord, Lord Vaizey, and put on record my support for the noble Lord, Lord Holmes, on Amendment 24. So much has been said that we have managed to change the name of the leveraging principle to the whack-a-mole principle and everything that has been said has been said very well.
The only point I want to make on these two amendments, apart from to echo the profound importance that other noble Lords have already spoken of, is that the ingenuity of the sector has always struck me as being equally divided between its incredible creativity in creating new products and things for us to do and services that it can provide, and an equal ingenuity in avoiding regulation of all kinds in all parts of the world. Unless the Bill covers not only the designated activity but the adjacent activities that the sector controls, we do not deliver the core purpose of the Bill. At one point I thought it might help the Minister to see that the argument he made in relation to Clause 6(2) and (3), which was in defence of some flexibility for the Secretary of State, might equally be made on behalf of the regulator in this case.
Turning briefly to Amendment 30 in the name of the noble Lord, Lord Clement-Jones, I first have to make a slightly unusual declaration in that my husband was one of the Hollywood writers who went on strike and won a historic settlement to be a human being in charge of their AI rather than at the behest of the AI. Not only in the creative industries but in academia, I have seen first-hand the impact of scraping information. Not only is the life’s work of an academic taken without permission; it is then regurgitated as an inaccurate mere guess, undermining the very purpose of academic distinctions. There is clearly a copyright issue that requires an ability both to opt out and correct, and to share in the upside, as the noble Lord pointed out.
I suggest that the LLMs and general AI firms have taken the axiom “it’s better to ask forgiveness than permission” to unbelievable new heights. Our role during the passage of this Bill may be to turn that around and say that it is better to ask permission than forgiveness.
My Lords, we have had a wonderfully eclectic debate. I am sorry if we gave some of the amendments more attention than others, because we have a number of very important issues here. Even in my response I may not be giving some colleagues due deference for their hard work and the good arguments they have put forward.
As noble Lords have commented, Amendments 26, 27 and 34 are in my name. As we have discussed, Amendments 26 and 27 would ensure that the CMA can tackle anti-competitive conduct in non-designated activity, provided that this conduct is related to designated activity. This would ensure, for example, that a designated company facing conduct requirements could not simply shift the resources of its business into another similar business venture, which would have a similar outcome of anti-competitive behaviour.
I am very grateful to the noble Baroness, Lady Stowell, for her support. The example she gave of Apple resonates with all of us and has obviously been in the news. It was one of the behaviours I described as rather vindictive in the last debate. I am not sure how much extra money Apple is going to make from it, but it is a question of rubbing someone’s nose in it because you do not like the decision that has been made. I feel that we need to address this issue.
The noble Lord, Lord Vaizey, in his Amendment 25, made a very similar point about the leveraging principle. We have all signed up to “the whack-a-mole principle”; I think we will call it that from now on. As the noble Baroness, Lady Harding, made clear, this is about addressing the leveraging of SMS markets to enter adjoining markets. She gave the example of travel price comparison. I feel that is a lazy innovation; if you get so big, you stop innovating—you copy the competing firms, taking their best ideas, without innovating any more. It is in all our interests to get a grip on this, so that these companies that have great resources and great capacity for innovation innovate in a creative way rather than just copying other people’s ideas.
Amendment 34, which is also in our names, would enable the CMA to keep conduct requirements under review and take account of whether those requirements are having their intended effects or whether further pro-competition intervention is necessary. It would provide a clearer link between the measures available to the CMA. As the noble Lord, Lord Clement-Jones, and others have said, it underpins the importance of interoperability in CMA decisions. We believe that the amendments help to clarify and reinforce the powers available to the CMA.
I listened carefully to the noble Lord, Lord Holmes, who, as ever, provided enormous insight into the tech world and the consequences of the legislation. We share his objective of getting the powers of the CMA in the right balance. His amendment challenges the Government to explain why the CMA can only impose a conduct requirement to achieve the fair dealing, open choice or trust and transparency objectives—which seems to be overly restrictive and open to legal challenge. We look forward to hearing the Minister’s explanation of why those restrictions were felt necessary. The noble Lord, Lord Holmes, also raised an important point in his Amendment 24, which we have not given sufficient weight to, about the need for those conduct requirements to deliver proper accessibility in line with previous legislation. We absolutely support him in that quest.
The amendments from the noble Lords, Lord Clement-Jones and Lord Lansley, raise important points about transparency and improved data. They stress the importance of portability and interoperability and put data firmly into the conduct requirements. We support those arguments and look forward to the Minister’s response to what we feel are common-sense proposals.
(1 year, 9 months ago)
Lords ChamberMy Lords, I declare my interests set out in full on the register, including as an advisor to the Institute for Ethics in AI at Oxford University, chair of the Digital Futures for Children centre at the LSE and chair of the 5Rights Foundation. I add my welcome to my noble friend Lord de Clifford, who I had the pleasure of meeting yesterday, and I look forward to his maiden speech.
I start by quoting Marcus Fysh MP who said in the other place:
“this is such a serious moment in our history as a species. The way that data is handled is now fundamental to basic human rights … I say to those in the other place as well as to those on the Front Benches that … we should think about it incredibly hard. It might seem an esoteric and arcane matter, but it is not. People might not currently be interested in the ins and outs of how AI and data work, but in future you can bet your bottom dollar that AI and data will be interested in them. I urge the Government to work with us to get this right”.—[Official Report, Commons, 29/11/23; col. 878.]
He was not the only one on Report in the other place who was concerned about some of the provisions in the Bill, who bemoaned the lack of scrutiny and urged the Government to think again. Nor was he the only one who reluctantly asked noble Lords to send the Bill back to the other place in better shape.
I associate myself with the broader points made by both noble Lords who have already spoken—I do not think I disagreed with a word that they said—but my own comments will primarily focus on the privacy of children, the case for data communities, access for researchers and, indeed, the promises made to bereaved parents and then broken.
During the passage of the Data Protection Act 2018, your Lordships’ House, with cross-party support, introduced the age-appropriate design code, a stand-alone data protection regime for the under-18s. The AADC’s privacy by design approach ushered in a wave of design change to benefit children: TikTok and Instagram disabled direct messaging from unknown adults to children; YouTube turned off auto-play; Google turned safe search on by default for children; 18-plus apps were taken out of the Play Store; TikTok stopped notifications through the night; and Roblox stopped tracking and targeting children for advertising. These were just a handful of hundreds of changes to products and services likely to be accessed by children. Many of these changes have been rolled out globally, meaning that while other jurisdictions cannot police the code, children in those places benefit from it. As the previous Minister, the noble Lord, Lord Parkinson, acknowledged, it contributes to the UK’s reputation for digital regulation and is now being copied around the globe.
I set this out at length because the AADC not only drove design change, it also established the crucial link between privacy and safety. This is why it is hugely concerning that children have not been explicitly protected from changes that lessen user data protections in the Bill. I have given Ministers notice that I will seek to enshrine the principle that children have the right to a higher bar of data protection by design and default; to define children’s data as sensitive personal data in the Bill; and exclude children from proposals that risk eroding the impact of the AADC, notably in risk assessments, automated processing, onward processing, direct marketing and the extended research powers of commercial companies.
Minister Paul Scully said at Second Reading in the other place:
“We are committed to protecting children and young people online … organisations will still have to abide by our Age-appropriate design code”.—[Official Report, Commons, 17/4/23; col. 101.]
I take it from those words that any perceived or actual diminution of children’s data rights is inadvertent, and that it remains the Government’s policy not to weaken the AADC as currently configured in the Bill. Will the Minister confirm that it is indeed the Government’s intention to protect the AADC and that he is willing to work with me to ensure that that is the outcome? I will also seek a requirement for the ICO to create a statutory children’s code in relation to AI. The ubiquitous deployment of AI technology to recommend and curate is nothing new, but the rapid advances in generative AI capabilities mark a new stage in its evolution. In the hundreds of pages of the ICO’s non-binding Guidance on AI and Data Protection, its AI and Data Protection Risk Toolkit and its advice to developers on generative AI, there is but one mention of the word “child”—in a case study about child benefit.
The argument made was that children are covered by the AADC, which underlines again just how consequential it is. However, since adults are covered by data law but it is considered necessary to have specific AI guidance, the one in three users who are under 18 deserve the same consideration. I am not at liberty to say more today, but later this week—perhaps as early as tomorrow—information will emerge that underlines the urgent need for specific consideration of children’s safety in relation to generative models. I hope that the Minister will agree that an AI code for kids is an imperative rather than nice to have.
Similarly, we must deliver data privacy to children in education settings. Given the extraordinary rate at which highly personal data seeps out of schools into the commercial world, including to gambling companies and advertisers, coupled with the scale of tech adoption in schools, it is untenable to continue to see tech inside school as a problem for schools and tech outside school as a problem for regulators. The spectre of a nursery teacher having enough time and knowledge to interrogate the data protection terms of a singing app, or the school ICT lead having to tackle global companies such as Google and Microsoft to set the terms for their students’ privacy, is frankly ridiculous, but that is the current reality. Many school leaders feel abandoned by the Government’s insistence that they should be responsible for data protection when both the AADC and the Online Safety Act have been introduced but they benefit from neither. It should be the role of the ICO to set data standards for edtech and to ensure that providers are held to account if they fall short. As it stands, a child enjoys more protection on the bus to school than in the classroom.
Finally on issues relating to children, I want to raise a technical issue around the production of AI-generated child sexual abuse material. I recognise the Government’s exemplary record on tackling CSAM but, unfortunately, innovation does not stop. While AI-generated child sexual abuse content is firmly in scope of UK law, it appears that the models or plug-ins trained on CSAM, or trained to generate it, are not. At least four laws, the earliest from 1978, are routinely used to bring criminal action against CSAM and the perpetrators of it, so I would be grateful if the Minister would agree to explore the issue with the police unit that has raised it with me and make an explicit commitment to close any gaps identified.
We are at an inflection point, and however esoteric and arcane the issues around data appear to be, to downgrade a child’s privacy even by a small degree has huge implications for their safety, identity and selfhood. If the Government fail to protect and future-proof children’s privacy, they will be simply giving with one hand in the OSA and taking away with the other in this Bill.
Conscious that I have had much to say about children, I will briefly put on the record issues that we can debate at greater length in Committee. While data law largely rests on the assumption of a relationship between an individual and a service, we have seen over a couple of decades that power lies in having access to large datasets. The Bill offers a wonderful opportunity to put that data power in the hands of new entrants to the market, be they businesses or communities, by allowing the sharing of individual data rights and being able to assign data rights to third parties for agreed purposes. I have been inspired by approaches coming out of academia and the third sector which have supported the drafting of amendments to find a route that would enable the sharing of data rights.
Similarly, as the noble Lord, Lord Knight, said, we must find a route to access commercial data sets for public interest research. I was concerned that in the other place, when former Secretary of State Jeremy Wright queried why a much-touted research access had not materialised in the Bill, the Minister appeared to suggest that it was covered. The current drafting embeds the asymmetries of power by allowing companies to access user data, including for marketing and creating new products, but does not extend access for public interest research into the vast databases held by those same companies. There is a feeling of urgency emerging as our academic institutions see their European counterparts gain access to commercial data because of the DSA. There is an increased need for independent research to support our new regulatory regimes such as the Online Safety Act. This is an easy win for the Government and I hope that they grasp it.
Finally, I noted very carefully the words of the Minister when he said, in relation to a coroner’s access to data, that the Secretary of State had made an offer to fill the gap. This is a gap that the Government themselves created. During the passage of the Online Safety Act we agreed to create a humane route to access data when a coroner had reason to suspect that a regulated company might have information relevant to the death of a child. The Government have reneged by narrowing the scope to those children taking their own life. Expert legal advice says that there are multiple scenarios under which the Government’s narrowing of scope creates a gaping hole in provision for families of murdered children and introduces uncertainty and delay in cases where it may not be clear at the outset how a child died.
I must ask the Minister what the Government are trying to achieve here and who they are trying to please. Given the numbers, narrowing scope is unnecessary, disproportionate and egregiously inhumane. This is about parents of murdered children. The Government lack compassion. They have created legal uncertainty and betrayed and re-traumatised a vulnerable group to whom they made promises. As we go through this Bill and the competition Bill, the Minister will at some points wish the House to accept assurances from the Dispatch Box. The Government cannot assure the House until the assurances that they gave to bereaved parents have been fulfilled.
I will stop there, but I urge the Minister to respond to the issues that I have raised rather than leave them for another day. The Bill must uphold our commitment to the privacy and safety of children. It could create an ecosystem of innovative data-led businesses and keep our universities at the forefront of tech development and innovation. It simply must fulfil our promise to families who this Christmas and every other Christmas will be missing a child without ever knowing the full circumstances surrounding that child’s death. That is the inhumanity that we in this House promised to stop—and stop it we must.
(1 year, 10 months ago)
Lords ChamberI think there are two things. First, we are extremely keen, and have set this out in the White Paper, that the regulation of AI in this country should be highly interoperable with international regulation—I think all countries regulating would agree on that. Secondly, I take some issue with the characterisation of AI in this country as unregulated. We have very large areas of law and regulation to which all AI is subject. That includes data protection, human rights legislation, competition law, equalities law and many other laws. On top of that, we have the recently created central AI risk function, whose role is to identify risks appearing on the horizon, or indeed cross-cutting AI risks, to take that forward. On top of that, we have the most concentrated and advanced thinking on AI safety anywhere in the world to take us forward on the pathway towards safe, trustworthy AI that drives innovation.
My Lords, given the noble Viscount’s emphasis on the gathering of evidence and evidence-based regulation, can we anticipate having a researchers’ access to data measure in the upcoming Data Protection and Digital Information Bill?
I thank the noble Baroness for her question and recognise her concern. In order to be sure that I answer the question properly, I undertake to write to her with a full description of where we are and to meet her to discuss further.
(1 year, 10 months ago)
Lords ChamberMy Lords, I too welcome the right reverend Prelate the Bishop of Newcastle. I admire her bravery in wearing the colours of Sunderland and Newcastle simultaneously.
I declare my interests as chair of 5Rights Foundation, chair of the Digital Futures Commission at the LSE and adviser to the Institute for Ethics in AI at Oxford. Like others, I will start with Bletchley Park. That was kicked off by the Prime Minister, who set out his hopes for an AI-enabled world, while promising to tackle head-on its potential dangers. He said:
“Criminals could exploit AI for cyber-attacks, disinformation, fraud, or even child sexual abuse”—
but these are not potential dangers; they exist here and now.
In the race for AI prominence and the vast riches the technology promises, the tech leaders came to town warning us that the future they are creating is untrammelled, unprincipled and insecure and that AI will overwhelm human agency. I think that that language of existential threat makes for fabulous headlines, but it rather disempowers the rest of us. Because, if we ask whether we want to supercharge the creation of child sexual abuse material, I would hazard a guess that the answer will be no; or whether it is okay for facial recognition trained on white faces to prevent a black parent or child getting a security pass to enter a school, again no; or whether we believe that just because something is technically possible—the creation of a disease or a weapon—it should be done, again no. Indeed, we have a record of containing the distribution of inventions that have the capability of annihilating us.
AI is not separate and different, and the language that we use to describe it—either its benefits or threats—must make that clear. AI is built, used and purveyed by business, government, civil society and even criminals. It is part of the human arrangements over which, for the moment, we still have agency. Language that disempowers us is part of the deliberate strategy of tech exceptionalism, advocated by industry lobbyists over decades, which has successfully secured the privatisation of technology, creating untold wealth for a few while outsourcing the cost to society. Who owns AI, who benefits, who is responsible and who gets hurt is still in the balance and I would assert that these are questions that we must deal with here and now.
I was disappointed to hear the noble Viscount say earlier at Questions that the Government were taking a sit-back-and-wait approach, so I have three rather more modest questions for the Minister, each of which could be tackled here and now. The first is: what plans do the Government have to ensure the robust application of our existing laws? As we saw earlier, the large language models and image creation services have used copyright material at scale. Getty Images has been testing it in court on behalf of its artists and photographers, but other rights holders, including some of the world’s finest authors, are unable to challenge this on an individual basis while their art and livelihood are scraped into vast datasets from which they do not benefit. I ask the Minister whether it would be a good idea to have an analysis of how new models are failing to uphold existing law and rights obligations as a first and urgent task for the new AI Safety Institute.
Secondly, how do the Government plan to use their legislative programme to tackle gaps that have been identified? For example, the creation, distribution and consumption of CSAM content is illegal, covered by at least three separate laws in the UK. But not one of these laws covers the models or plug-ins that create CSAM at scale—in one case, more than 20,000 images in a matter of hours—so the upcoming data protection Bill provides us with an opportunity to make training, sharing and possessing software that is trained on or trained to produce CSAM content an offence.
Also on the Prime Minister’s list is disinformation. Synthetic information that passes for real is also a here and now problem: the London Mayor, whose voice was fabricated, celebrities falsely endorsing products or a child’s picture scraped from a school website to train those aforesaid CSAM models. The loss of control of one’s personhood carries with it a democratic deficit and potentially overwhelming individual suffering. I ask the Minister whether the Government are willing to put beyond doubt that AI-generated biometric and image data constitutes a form of personal data over which an individual, whether adult or child, has rights, including the right to object to its use.
Both the data Bill and the digital markets Bill could create new data models—a subject that the noble Baroness, Lady Stowell, articulated very well in a recent article in the Times. New approaches to data rights, with new owners of data, are one way of having a voice in our AI-enabled future.
Thirdly and finally, I would like to ask the Minister why the Government have left children on the margins. I attended two official fringe events of the summit, one hosted by the then Home Secretary about child sexual abuse, the other convened by St Mary’s and the Turing Institute about embedding children’s rights in AI systems. Children are early adopters of technology—canaries in the coal mine—and many of us know the cost of poorly regulated digital environments for them. I am bewildered that, so soon after Royal Assent to the Online Safety Act, and in clear sight of the challenges that AI brings, the Government risk downgrading children’s data rights rather than explicitly protecting the age-appropriate design code and the definitions on which it is founded. Children should have been front and centre of the concerns at Bletchley, not pushed to the fringe, and perhaps the Minister could repair that damage by putting them front and centre of the new AI Safety Institute. After all, it is children who will inhabit the world we are building.
Finally, AI will create enormous benefits and upheaval across all sectors, but it also promises to put untold wealth and power in the hands of even fewer people. However, there are things in the here and now that we can do to ensure that technology innovates in ways that support human agency. It is tech exceptionalism that poses an existential threat to humanity, not the technology itself.