Data (Use and Access) Bill [HL] Debate

Department: Department for Business and Trade
Baroness Jones of Whitchurch (Lab)

I reassure the noble Lord that, as he knows, we are very hopeful that we will have data adequacy so that issue will not arise. I will write to him to set out in more detail when those powers would be used.

Baroness Kidron (CB)

I thank the Minister for her offer of a meeting. I could tell from the nods of my co-signatories that that would indeed be very welcome and we would all like to come. I was interested in the quote from the ICO about scraping. I doubt the Minister has it to hand, but perhaps she could write to say what volume of enforcement action has been taken by the ICO on behalf of data rights holders against scraping on that basis.

Baroness Jones of Whitchurch (Lab)

Yes, it would be helpful if we could write and set that out in more detail. Obviously the ICO’s report is fairly recent, but I am sure it has considered how enforcement would follow on from that, and we can write and give more details.

--- Later in debate ---
Baroness Kidron (CB)

My Lords, in speaking to Amendment 137 in my name I thank the noble Baroness, Lady Harding, the noble Lord, Lord Stevenson, and my noble friend Lord Russell for their support. I also add my enthusiastic support to the amendments in the name of my noble friend Lord Colville.

This is the same amendment that I laid to the DPDI Bill, which at the time had the support of the Labour Party. I will not labour that point, but it is consistently disappointing that these things have gone into the “too difficult” box.

Amendment 137 would introduce a code of practice on children and AI. AI drives the recommender systems that determine all aspects of a child’s digital experience, including the videos they watch, their learning opportunities, the people they follow and the products they buy—and, as reported last weekend, AI is even helping farmers pick the ripest tomatoes for baked beans. But it no longer concerns simply the elective parts of life where, arguably, a child or a parent on their behalf can choose to avoid certain products and services. AI is invisibly and ubiquitously present in all areas of their lives, and its advances and impact are particularly evident in the education and health sectors, the first of which is compulsory for children and the second of which is necessary for all of us.

The amendment has three parts. The first requires the ICO to create a code and sets out the expectations of its scope; the second considers who and what should be consulted and considered, including experts, children, and the frameworks that codify children’s existing rights; and the third part defines elements of the process, including risk assessment definitions, and sets out the principles to which the code must adhere.

When we debated this before, I anticipated that the Minister would say that the ICO had already published guidance, that we do not want to exclude children from the benefits of AI, and that we must not get in the way of innovation. Given that the new Government have taken so many cues from the previous one, I am afraid I anticipate a similar response.

I first point out, therefore, that the ICO’s non-binding guidance on AI and data protection is insufficient. It has only a single mention of a child in its 140 pages, which is a case study about child benefits. In the hundreds of pages of guidance, toolkits and sector information, nowhere are the specific needs and rights, or development vulnerabilities, of children comprehensively addressed in relation to AI. This absence of children is also mirrored in government publications on AI. Of course, we all want children to enjoy the benefits of AI, but consideration of their needs would increase the likelihood of those benefits. Moreover, it seems reckless and unprincipled not to protect them from known harms. Surely the last three decades of tech development have shown us that the experiment of a “build first, worry about the kids later—or never” approach has cost our children dearly.

Innovation is welcome, but not all innovation is equal. We have bots offering 13-year-olds advice on how to seduce grown men or encouraging them to take their own lives, edtech products that profile children into unfair and biased outcomes that limit their education and life chances, and gen AI that perpetuates negative, racist, misogynist and homophobic stereotypes. Earlier this month, the Guardian reported a deep bias in the AI used by the Department for Work and Pensions. This “hurt first, fix later” approach creates a lack of trust, increases unfairness and has real-world consequences. Is it too much to insist that we ask better questions of systems that may result in children going hungry?

Why children? I am saddened that I must explain this, but from our deeply upsetting debate last week on the child protection amendments, in which the Government asserted that children are already catered for while deliberately downgrading their protections, it seems that the Government or their advisers have forgotten.

Children are different for three reasons. First, as has been established over decades, children are on a development journey. There are ages and stages at which children are developmentally able to do certain things, such as walk, talk, understand risk and irony and learn different social skills. There are equally ages and stages at which they cannot do those things. The long-established consensus is that families, social groups and society more broadly, including government, step in to support them on this journey. Secondly, children have less voice and less choice about how and where they spend their time, so the places and spaces they inhabit have to be designed to be fit for childhood. Thirdly, we have a responsibility towards children that extends even beyond our responsibility to each other. This means that we cannot legitimise profit at their expense. Allowing systems to play in the wild in the name of growth and innovation, leaving kids to pay the price, is a low bar.

It is worth noting that since we debated it, a proposal for this AI code for children that follows the full life cycle of development, deployment, use and retirement of AI systems has been drafted and has the support of multiple expert organisations and individuals around the globe. I am sure that all nations and intergovernmental organisations will have additional inputs and requirements, but it is worth saying that the proposed code, which was written with input from academics, computer scientists, lawyers, engineers and children’s rights activists, is mindful of and compatible with the EU AI Act, the White House Blueprint for an AI Bill of Rights, the Executive Order on the Safe, Secure and Trustworthy Development and Use of Artificial Intelligence, the Council of Europe’s Framework Convention on Artificial Intelligence and, of course, the UNCRC general comment no. 25.

This proposal will be launched early next year as an indication of what could and should be done. Unless the Government find their compass vis-à-vis children and tech, I suspect that another jurisdiction will adopt it ahead of the UK and become the go-to destination for the trusted development of child-safe tech products. It is perhaps worth reminding the Committee that one in three connected people is under 18, which is roughly 1 billion children. As the demographics change, the proportion and number of children will rise. It is a huge financial market.

Before I sit down, I shall briefly talk about the AADC because sometimes Ministers say that we already have a children’s code. The age-appropriate design code covers only ISS, which automatically limits it, and even the ICO by now agrees that its enforcement record is neither extensive nor impressive. It does not clearly cover the urgent area of edtech, which is the subject of another amendment, and, most pertinently to this amendment, it addresses AI profiling only, which means that it is limited in how it can look at the new and emerging challenges of generative AI. A revamp of the AADC to tackle the barriers to enforcement, account for technological advances, cover all products and services likely to be accessed by children and make our data regime AI-sensitive would be welcome. But rather than calling for a strengthening of the AADC, the ICO agreed to the downgrading of children’s data protection in the DPDI Bill and, again, has agreed to the downgrading of protections in the current Bill on ADM, scientific research, onward processing and so on. A stand-alone code for AI development is required because in this way we could be sure that children are in the minds of developers at the outset.

It is disappointing that the UK is failing to claim its place as the centre of regulated and trusted innovation. Although we are promised an AI Bill, the Government repeatedly talk of large frontier companies. AI is in every part of a child’s life from the news they read to the prices they pay for travel and goods. It is clear from previous groups that many colleagues feel that a data Bill with no AI provisions is dangerous commercially and for the communities of the UK. An AI Bill with no consideration of the daily impact on children may be a very poor next choice. Will the Minister say why a Labour Government are willing to abandon children to technology rather than building technology that anticipates children’s rights and needs?

Lord Holmes of Richmond (Con)

My Lords, it is a pleasure to follow my friend the noble Baroness, Lady Kidron, and to give full-throated support to my friend the noble Viscount, Lord Colville, on all his amendments. Given that the noble Baroness mentioned it and that another week has passed since we asked the Minister the question, will we see an AI Bill or a consultation before Santa comes or at some stage in the new year? I support all the amendments in this group and in doing so, as it is the first time I have spoken today in Committee, I declare my technology interests as set out in the register, not least as an adviser to Socially Recruited, an AI business.

I will speak particularly to my Amendment 211A. I have put down “image, likeness and personality” not because I believe they are the most important rights being transgressed or the most important rights we should consider; I have put them down to give them specific focus because, right now, they are being largely cut across and ignored, so that all our creatives find their works, and indeed their image, likeness and personality, disappearing into these large foundation AI models with no potential for redress.

Once parts of you such as your name, face or voice have been ingested, as the noble Lord, Lord Clement-Jones, said in the previous group, it is difficult then to have them extracted from the model. There is no sense, for example, of seeking an equitable remedy to put one back in the position one would have been in had the breach not occurred. It is almost “once in, forever in”; works then start to be created based on those factors, features and likenesses, which compete directly with the creatives. This is already particularly prevalent in the music industry.

What plans do the Government have in terms of personality rights, image and likeness? Are they content with the current situation where there is no protection for our great creatives, not least in the music industry? What does the Bill do for our creatives? I go back to the point made by the noble Baroness, Lady Kidron. How can we have all these debates on a data Bill which is silent when it comes to AI, and a product regulation Bill where AI is specifically excluded, and yet have no AI Bill on the near horizon—unless the Minister can give us some up-to-date information this afternoon? I look forward to hearing from her.

--- Later in debate ---
Baroness Jones of Whitchurch (Lab)

My Lords, I thank the noble Viscount, Lord Colville, and the noble Baroness, Lady Kidron, for their amendments and consideration of this policy area. I hope noble Lords will bear with me if I save some of the points I shall make on web crawling and intellectual property for the later group, which is specifically on that topic.

Amendments 92 and 93 from the noble Viscount are about the new disproportionate effort exemption in Article 13. I can reassure noble Lords that this exemption applies only when data is collected directly from the data subject, so it cannot be used for web crawling, which is, if you like, a secondary activity. I think that answers that concern.

Amendments 101 and 105, also from the noble Viscount, are about the changes to the existing exemption in Article 14, where data is collected from other sources. Noble Lords debated this issue in the previous group, where Amendments 97 and 99 sought to remove this exemption. The reassurances I provided to noble Lords in that debate about the proportionality test being a case-by-case exercise also apply here. Disproportionate effort cannot be used as an excuse; developers must consider the rights of the data subject on each occasion.

I also draw noble Lords’ attention to another quote from the ICO itself, made when publishing its recent outcome reports. I know I have already said that I will share more information on this. It says:

“Generative AI developers, it’s time to tell people how you’re using their information”.

The ICO is on the case on this issue, and is pursuing it.

On Amendment 137 from the noble Baronesses, Lady Kidron and Lady Harding, and other noble Lords, I fully recognise the importance of organisations receiving clear guidance from regulators, especially on complex and technical issues. AI is one such issue. I know that noble Lords are particularly conscious of how it might affect children, and I am hearing the messages about that today.

As the noble Baroness will know, the Secretary of State already has the power to request statutory codes such as this from the regulator. The existing power will allow us to ensure the correct scope of any future codes, working closely with the ICO and stakeholders, including noble Lords here today, and I am happy to meet them to discuss this further. The Government are, naturally, open to evidence about whether new statutory codes should be provided for by regulations in future. Although I appreciate the signal this can send, at the moment I do not believe that a requirement for codes on this issue is needed in this legislation. I hope noble Lords are reassured that the Government are taking this issue seriously.

Amendment 211A from the noble Lord, Lord Holmes, is about prohibiting the processing of people’s names, facial images, voices or any physical characteristics for AI training without their consent. Facial images and other physical characteristics that can be used to identify a person are already protected by the data protection legislation. An AI developer processing such data would have to identify a lawful ground for this. Consent is not the only option available, but I can reassure the noble Lord that there are firm safeguards in place for all the lawful grounds. These include, among many other things, making sure that the processing is fair and transparent. Noble Lords will know that even more stringent conditions apply to special category data, which includes data relating to race or sexual orientation and any biometric data that can be used to identify someone.

Noble Lords tried to tempt me once again on the timetable for the AI legislation. I said as much as I could on that when we debated this in the last session, so I cannot add any more at this stage.

I hope that reassures noble Lords that the Bill has strong protections in place to ensure responsible data use and reuse, and, as such, that they feel content not to press their amendments.

Baroness Kidron (CB)

I understand the point that the Secretary of State has the power, but does he have the intention? We are seeking an instruction to the ICO to do exactly this thing. An indication of the Secretary of State’s intention to activate such a code would be an excellent compromise all round, and to see that in the Bill is the point here.

Baroness Jones of Whitchurch (Lab)

Discussions with the ICO are taking place at the moment about the scope and intention of a number of issues around AI, and this issue would be included in that. However, I cannot say at the moment that that intention is specifically spelled out in the way that the noble Baroness is asking.

--- Later in debate ---
I would want to see algorithmic impact assessments that cover significant impacts on work and workers, such as any impact on equal opportunities or outcomes at work, access to employment, pay, contractual status, terms and conditions of employment, health, lawful association, rights and training. Assessments should also be on an ongoing rather than a snapshot basis, involve those affected, including official representatives, in a proportionate way, and should disclose metrics and methods and be developed by regulators at both a domain and a sector level. I could go on, but I look forward to the Minister’s response.

Baroness Kidron (CB)

My Lords, I speak to Amendment 114, to which I have added my name. It is a very simple amendment that prevents controllers circumventing the duties for automated decision-making by adding trivial human elements to avoid the designation. As such, it is a very straightforward—and, I would have thought, uncontroversial—amendment. I really hope that the Government will find something in all our amendments to accept, and perhaps this is one such thing.

I am struck that previous speeches have referred to questions that I raised last week: what is the Bill for, who is it for and why is it not dealing with a host of overlapping issues that cannot really be extrapolated one from another? In general, a bit like the noble Lord, Lord Holmes, I am very much with the spirit of all these amendments. They reflect the view of the Committee and the huge feeling of civil society—and many lawyers—that this sort of attack on Article 22 by Clause 80 downgrades UK data rights at a time when we do not understand the Government’s future plans and hear very little about protections. We hear about the excitements of AI, which I feel bound to say we all share, but not at the expense of individuals.

I raise one last point in this group. I had hoped that the Minister would have indicated the Government’s openness to Amendment 88 last week, which proposed an overarching duty on controllers and processors to provide children with heightened protections. That seemed to me the most straightforward mechanism for ensuring that current standards were maintained and then threaded through new situations and technologies as they emerged. I put those two overarching amendments down on the understanding that Labour, when in opposition, was very much for this approach to children. We may need to bring back specific amendments, as we did throughout the Data Protection and Digital Information Bill, including Amendment 46 to that Bill, which sought to ensure

“that significant decisions that impact children cannot be made using automated processes unless they are in a child’s best interest”.

If the Minister does not support an overarching provision, can she indicate whether the Government would be more open to clause-specific carve-outs to protect children and uphold their rights?

Lord Thomas of Cwmgiedd (CB)

My Lords, I rise briefly, first, to thank everyone who has spoken so eloquently about the importance of automated decision-making, in particular its importance to public trust and the importance of human intervention. The retrograde step of watering down Article 22 is to be deplored. I am therefore grateful to the noble Lord, Lord Clement-Jones, for putting forward that this part of the Bill should not stand part. Secondly, the specific amendment that I have laid seeks to retain the broader application of human intervention for automated decision-making where it is important. I can see no justification for that watering down, particularly when there is such uncertainty about the scope that AI may bring to what can be done by automated decision-making.

--- Later in debate ---
Lord Lucas (Con)

My Lords, I have Amendment 201 in this group. At the moment, Action Fraud does not record attempted fraud; it has to have been successful for the website to agree to record it. I think that results in the Government taking decisions based on distorted and incomplete data. Collecting full data must be the right thing to do.

Baroness Kidron (CB)

My Lords, I had expected the noble Baroness, Lady Owen of Alderley Edge, to be in the Room at this point. She is not, so I wish to draw the Committee’s attention to her Amendment 210. On Friday, many of us were in the Chamber when she made a fantastic case for her Private Member’s Bill. It obviously dealt with a much broader set of issues but, as we have just heard, the overwhelming feeling of the House was to support her. I think we would all like to see the Government wrap it up, put a bow on it and give it to us all for Christmas. But, given that that was not the indication we got, I believe that the noble Baroness’s intention here is to deal with the fact that the police are giving phones and devices back to perpetrators with the images remaining on them. That is an extraordinary revictimisation of people who have been through enough. So, whether or not this is the exact wording or way to do it, I urge the Government to look on this carefully and positively to find a way of allowing the police the legal right to delete data in those circumstances.

Viscount Camrose (Con)

My Lords, none of us can be under any illusion about the growing threat of cyberattacks, whether from state actors, state-affiliated actors or criminal gangs. It is pretty unusual nowadays to find someone who has not received a phishing email, had hackers target an account or been promised untold riches by a prince from a faraway country. But, while technology has empowered these criminals, it is also the most powerful tool we have against them. To that end, we must do all we can to assist the police, the NCA, the CPS, the SIS and their overseas counterparts in countries much like our own. That said, we must also balance this assistance with the right of individuals to privacy.

Regarding the Clause 81 stand part notice from the noble Lord, Lord Clement-Jones, I respectfully disagree with this suggestion. If someone within the police were to access police records in an unauthorised capacity or for malign reasons, I simply doubt that they would be foolish enough to enter their true intentions into an access log. They would lie, of course, rendering the log pointless, so I struggle to see—we had this debate on the DPDI Bill—how this logging system would help the police to identify unauthorised access to sensitive data. It would simply eat up hours of valuable police time. I remember from our time working on the DPDI Bill that the police supported this view.

As for Amendment 124, which allows for greater collaboration between the police and the CPS when making charging decisions, there is certainly something to be said for this principle. If being able to share more detailed information would help the police and the CPS come to the best decision for victims, society and justice, then I absolutely support it.

Amendments 126, 128 and 129 seek to keep the UK in close alignment with the EU regarding data sharing. EU alignment or non-alignment is surely a decision for the Government of the day alone. We should not look to bind a future Administration to the EU.

I understand that Amendment 127 looks to allow data transfers to competent authorities—that is, law enforcement bodies in other countries—that may have a legitimate operating need. Is this not already the case? Are there existing provisions in the Bill to facilitate such transfers and, if so, does this not therefore duplicate them? I would very much welcome the thoughts of both the Minister and the noble Lord, Lord Clement-Jones, when he sums up at the end.

Amendment 156A would add to the definition of “unauthorised access” so that it includes instances where a person accesses data in the reasonable knowledge that the controller would not consent if they knew about the access or the reason for the access, and the person is not empowered to access it by an enactment. Given the amount of valuable personal data held by controllers as our lives continue to move online, there is real merit to this idea from my noble friend Lord Holmes, and I look forward to hearing the views of the Minister.

Finally, I feel Amendment 210 from my noble friend Lady Owen—ably supported in her unfortunate absence by the noble Baroness, Lady Kidron—is an excellent amendment as it prevents a person convicted of a sexual offence from retaining the images that breached the law. This will prevent them from continuing to use the images for their own ends and from sharing them further. It would help the victims of these crimes regain control of these images which, I hope, would be of great value to those affected. I hope that the Minister will give this serious consideration, particularly in light of noble Lords’ very positive response to my noble friend’s Private Member’s Bill at the end of last week.

--- Later in debate ---
Lord Lucas (Con)

I have Amendment 135A in this group. The Bill provides a new set of duties for the Information Commissioner but no strategic framework, which the DPDI Bill did provide. The Information Commissioner is a whole-economy regulator. To my mind, the Government’s strategic priorities should bear on it. This amendment would provide an enabling power such as the Competition and Markets Authority, which is in an equivalent economic position, already has.

Baroness Kidron (CB)

My Lords, I have huge sympathy for, and experience of, many of the issues raised by the noble Lord, Lord Clement-Jones, but, given the hour, I will speak only to Amendment 145 in my name and those of the noble Baroness, Lady Harding, my noble friend Lord Russell and the noble Lord, Lord Stevenson. Given that I am so critical, I want to say how pleased I am to see the ICO reporting requirements included in the Bill.

Amendment 145 is very narrow. It would require the ICO to report specifically and separately on children. It is fair to say that one of the many frustrations for those of us who spend our time advocating for children’s privacy and safety is trying to extrapolate child-specific data from generalised reporting. Often such data is not reported because its omission helps to hide some of the inadequacies in the level of protection afforded to children. For example, none of the community guidelines enforcement reports published for Instagram, YouTube, TikTok or Snapchat provides a breakdown of the violation rate by age group, even though that would provide valuable information for academics, Governments, legislators, NGOs and, of course, regulators. It was a point of contention between many civil society organisations and Ofcom that there was no evidence that children of different ages react in different ways, which, for anyone who has had children, is clearly not the case.

Similarly, for many years we struggled to understand Ofcom’s reporting because older children were included in a group that went up to 24, and it took over 10 years for that to change. It seems to me—I hope the Government agree—that since children are entitled to specific data privacy benefits, it follows that the application and enforcement of those benefits should be reported separately. I hope that the Government can give a quick yes on this small but important amendment.