All 4 Baroness Harding of Winscombe contributions to the Data (Use and Access) Bill [HL] 2024-26


Mon 16th Dec 2024
Tue 21st Jan 2025
Tue 21st Jan 2025
Tue 28th Jan 2025

Data (Use and Access) Bill [HL]

Baroness Harding of Winscombe Excerpts
Lord Holmes of Richmond (Con)

My Lords, it is a pleasure to follow my friend the noble Baroness, Lady Kidron, and to give full-throated support to my friend the noble Viscount, Lord Colville, on all his amendments. Given that the noble Baroness mentioned it and that another week has passed since we asked the Minister the question, will we see an AI Bill or a consultation before Santa comes or at some stage in the new year? I support all the amendments in this group and in doing so, as it is the first time I have spoken today in Committee, I declare my technology interests as set out in the register, not least as an adviser to Socially Recruited, an AI business.

I will speak particularly to my Amendment 211A. I have put down “image, likeness and personality” not because I believe they are the most important rights being transgressed, or the most important rights we should consider; I have put them down to give them specific focus because, right now, they are being largely cut across and ignored, so that all of our creatives find their works—and their image, likeness and personality—disappearing into these largely foundation AI models with no potential for redress.

Once parts of you such as your name, face or voice have been ingested, as the noble Lord, Lord Clement-Jones, said in the previous group, it is difficult then to have them extracted from the model. There is no sense, for example, of seeking an equitable remedy to put one back in the position one would have been in had the breach not occurred. It is almost “once in, forever in”; then works start to be created based on those factors, features and likenesses, which compete directly with the creatives. This is already particularly prevalent in the music industry.

What plans do the Government have in terms of personality rights, image and likeness? Are they content with the current situation where there is no protection for our great creatives, not least in the music industry? What does the Bill do for our creatives? I go back to the point made by the noble Baroness, Lady Kidron. How can we have all these debates on a data Bill which is silent when it comes to AI, and a product regulation Bill where AI is specifically excluded, and yet have no AI Bill on the near horizon—unless the Minister can give us some up-to-date information this afternoon? I look forward to hearing from her.

Baroness Harding of Winscombe (Con)

My Lords, I should first apologise for not being able to attend Second Reading or, arguably more importantly, to be in Committee last week to support the many amendments of the noble Baroness, Lady Kidron, on child protection. I read Hansard carefully and was deeply depressed to see that we were once again needing to rehearse, as she has done again today, the importance of protecting children in the digital era. It seems to be our lot that there is a group of us who keep coming back. We play the merry-go-round and sit in different places; it is a privilege to sit next to the noble Baroness, Lady Kidron, for the first time in the decade that I have been in the House. I support her Amendment 137. She has given a good exposé as to why we should think really carefully about how we protect children in this AI world. I would just like to add one point about AI itself.

We keep being told—in a good way—that AI is an underlying and general-purpose technology. That means we need to properly establish the principles with which we should protect children there. We know that technology is morally neutral; it is the human beings who do the damage. In every other underlying, breakthrough technology, we have learned that we have needed to protect the most vulnerable, whether it was electricity when it first went into factories, toys when they were first distributed on the mass market, or social media, with the age-appropriate design code. I feel that it would be a huge mistake, on the third Bill where many of us have debated this subject matter, for us not to address the fact that, as of today, this is the biggest breakthrough technology of our lifetime. We should recognise that children will need protecting, as well as having the opportunity to benefit from it.

--- Later in debate ---
Moved by
95: Clause 77, page 91, line 16, leave out “to the extent that” and insert “when any one or more of the following is true”
Member’s explanatory statement
This amendment would clarify that only one condition under paragraph 5 must be present for paragraphs 1 to 4 to not apply.
Baroness Harding of Winscombe (Con)

My Lords, I was in such a hurry to apologise just now for missing Second Reading that I forgot to declare my interests and remind the Committee of my technology and, with regard to this group, charitable interests as set out in the register.

I shall speak to Amendments 95, 96, 98, 101, 102 and 104 in my name and those of the noble Lords, Lord Clement-Jones and Lord Stevenson of Balmacara, and my noble friend Lord Black of Brentwood, and Amendments 103 and 106 in my name and those of the noble Lords, Lord Clement-Jones and Lord Stevenson. I also support Amendment 162 in the name of the noble Lord, Lord Clement-Jones. I will speak only on the marketing amendments in my name and leave the noble Lord, Lord Clement-Jones, to do, I am sure, great justice to the charitable soft opt-in.

These amendments are nothing like as philosophical and emotive as the last amendment on children and AI. They aim to address a practical issue that we debated in the late spring on the Data Protection and Digital Information Bill. I will not rehearse the arguments that we made, not least because the Minister was the co-signatory of those amendments, so I know she is well versed in them.

Instead, I shall update the Committee on what has happened since then and draw noble Lords’ attention to a couple of issues that are very real and present now. It is strange that all Governments seem reluctant to restrict the new technology companies’ use of our data, yet extremely keen to get into the micro detail of restricting older uses of data that we have all long been accustomed to.

That is very much the case for the open electoral register. Some 63% of people opt out of being marketed to by indicating as much on the electoral register. This is a well known and well understood use of personal data. Yet, because of the tribunal ruling, it is increasingly the case that companies cannot use the open electoral register to target the 37% of people who have said that they are quite happy to receive marketing, unless the company lets every single one of those people know that it is about to market to them. The danger is that we create a new cookie problem—a physical cookie problem—where, if you want to use a data source that has been commonplace for 40 years, you have to send some marketing to tell people that you are about to use it. That of course means that you will not do so, which reduces the data available to a lot of small and medium-sized businesses to market their products and hands it straight to the very big tech companies, which are quite happy to scrape our data all over the place.

This is a strange one, where I find myself arguing that we should just allow something that is not broken not to need to be fixed. I appreciate that the Minister will probably tell us that the wording in these amendments is not appropriate. As I said earlier in the year—in April, in the previous incarnation—I very much hope that if the wording is incorrect we could, between Committee and Report, have a discussion and agree on some wording that achieves what seems just practical common sense.

The tribunal ruling that created this problem recognised that it was causing a problem. It stated that it accepted that the loophole it created would give one company, Experian, a sizeable competitive advantage. It is a slightly perverse one: Experian has to let only 5 million people know that it might be about to use the open electoral register, while its competitors have to let 22 million people know. That just does not pass the common-sense test of practical use of data. Given the prior support that the Minister has shown on this issue, I very much hope that we can resolve it between Committee and Report. I beg to move.

Lord Lucas (Con)

My Lords, I have a couple of amendments in this group, Amendments 158 and 161. Amendment 158 is largely self-evident; it tries to make sure that, where there is a legal requirement to communicate, that communication is not obstructed by the Bill. I would say much the same of Amendment 161; that, again, it is obvious that there ought to be easy communication where a person’s pension is concerned and the Bill should not obstruct it. I am not saying that these are the only ways to achieve these things, but they should be achieved.

I declare an interest on Amendment 160, in that I control the website of the Good Schools Guide, which has advertising on it. Advertising on the web is what enables people to see things for free; it is why the web has not closed down into subscription-only services. If people put advertisements on the web, they want to know that those advertisements are effective and have been seen, and to have some information about who has seen them. I moved a similar amendment to the previous Government’s Bill and encountered some difficulty. If the Government are of the same mind—that this requires us to be careful—I would very much welcome the opportunity of a meeting between now and Report, and I imagine others would too, to try to understand how best to make sure that advertising can flourish on the internet.

--- Later in debate ---
Baroness Jones of Whitchurch (Lab)

I thank the noble Baroness very much for that very helpful intervention. If she has any more information about the view of the Charity Commission, we would obviously like to engage with that because we need to get this right. We want to make sure that individuals welcome and appreciate the information given to them, rather than it being something that could have a negative impact.

I think I have covered all the issues. I hope those explanations have been of some reassurance to noble Lords and that, as such, they are content not to press their amendments.

Baroness Harding of Winscombe (Con)

May I just follow up by asking one quick question? I may be clutching at straws here but, in responding to the amendments in my name, the Minister stated what the ICO believes rather than what the Government believe. She also said that the ICO may think that further permission is required to ensure transparency. I understand from the Data & Marketing Association that users of this data have four different ways of ensuring transparency. Would the Minister agree to a follow-up meeting to see whether there is a meeting of minds on what the Government think, rather than the ICO?

Baroness Jones of Whitchurch (Lab)

I am very happy to talk to the noble Baroness about this issue. She asked what the Government’s view is; we are listening very carefully to the Information Commissioner and the advice that he is putting together on this issue.

Lord Lucas (Con)

My Lords, I am very grateful for the answers the noble Baroness gave to my amendments. I will study carefully what she said in Hansard, and if I have anything further to ask, I will write to her.

Baroness Harding of Winscombe (Con)

My Lords, in response—and very briefly, given the technical nature of all these amendments—I think that we should just note that there are a number of different issues in this group, all of which I think noble Lords in this debate will want to follow up. I thank the many noble Lords who have contributed both this time round and in the previous iterations, and ask that we follow up on each of the different issues, probably separately rather than in one group, as we will get ourselves quite tangled in the web of data if we are not careful. With that, I beg leave to withdraw the amendment.

Amendment 95 withdrawn.

Data (Use and Access) Bill [HL]
Baroness Kidron (CB)

My Lords, I rise to move Amendment 15 and to speak to Amendments 16, 20, 22, 27, 39, 45 and, briefly, government Amendment 40. Together, these amendments offer protections that children were afforded in the Data Protection Act 2018, which passed through this House, and they seek to fix some of the underperformance of the ICO in relation to children’s data.

Before we debate these amendments, it is perhaps worth the Government reflecting on the fact that survey after survey shows that the vast majority—indeed, almost all—of the UK population support stronger digital regulation in respect of children. In refusing to accept these amendments, or to replace them with their own amendments to the same effect, the Government are throwing away one of the successes of the UK Parliament with their newfound enthusiasm for tech with fewer safeguards.

I repeat my belief that lowering data protections for adults is a regressive step for all of us, but for children it is a tragedy that puts them at greater risk of harm—a harm that we in this House have a proud record of seeking to mitigate. The amendments in my name and variously in the names of the noble Lords, Lord Stevenson and Lord Clement-Jones, my noble friend Lord Russell and the noble Baroness, Lady Harding, are essential to preserving the UK’s commitment to child protection and privacy. As the House is well aware, there is cross-party support for child protection. While I will listen very carefully to the Minister, I too am prepared to test the opinion of the House if he has nothing to offer, and I will ask Labour colleagues to consider their responsibility to the nation’s children before they walk through the Lobby.

I will take the amendments out of numerical order, for the benefit of those who have not been following our proceedings. Amendment 22 creates a direct, unambiguous obligation on data processors and controllers to consider the central principles of the age-appropriate design code when processing children’s data. It acknowledges that children of different ages have different capacities and therefore may require different responses. Subsection (2) of the new clause it would insert addresses the concern expressed during the passage of the Bill and its predecessor that children should be shielded from the reduction in privacy protections that adults would experience under the Act when passed.

In the last few weeks, Meta has removed its moderators, and the once-lauded Twitter has become flooded with disinformation and abuse as a result of Elon Musk’s determined deregulation and support of untruth. We have seen the dial moved in Romania’s presidential election via TikTok, a rise in scams and the horror of sexually explicit deepfakes, which we will discuss in a later group.

Public trust in both tech and politics is catastrophically low. While we may disagree on the extent to which adults deserve privacy and protection, there are few in this House or the other place who do not believe it is a duty of government to protect children. Amendment 22 simply makes it a requirement that those who control and process children’s data are directly accountable for considering and prioritising their needs. Amendment 39 does the same job in relation to the ICO, highlighting the need to consider that high bar of privacy to which children are entitled, which should be a focus of the commissioner when exercising its regulatory functions, with a particular emphasis on their age and development stage.

Despite Dame Elizabeth Denham’s early success in drafting the age-appropriate design code, the ICO’s track record on enforcement is poor and the leadership has not championed children by robustly enforcing the ADC, or when faced with proposals that watered down child protections in this Bill and its predecessor. We will get to the question of the ICO next week, but I have been surprised by the amount of incoming mail dissatisfied with the regulator and calling on Parliament to demand more robust action. This amendment does exactly that in relation to children.

Government Amendment 40 would require the ICO, when exercising its functions, to consider the fact that children merit specific protections. I am grateful for and welcome this addition as far as it goes; but in light of the ICO’s disappointing track record, clearer and more robust guidance on its obligations is needed.

Moreover, the Government’s proposal is also insufficient because it creates a duty on the ICO only. It does nothing for the controllers and processors, as I have already set out in Amendment 22. It is essential that those who control and process children’s data are directly accountable for prioritising their needs. The consequences when they do not are visible in the anxiety, body dysmorphia and other developmental issues that children experience as a result of their time online.

The Government have usefully introduced an annual report of ICO activities and action. Amendment 45 simply requires the ICO to report the action it has taken specifically in relation to children, as a separate item. Creating better reporting is one of the advances the Government have made; making it possible to see what the ICO has done in regard to children is little more than housekeeping.

This group also includes clause-specific amendments, which are more targeted than Amendment 22. Amendment 15 excludes children from the impact of the proposal to widen the definition of scientific research in Clause 68. Given that we have just discussed this, I may reconsider that amendment. However, Amendment 16 excludes children from the “recognised legitimate interest” provisions in Clause 70. This means that data controllers would still be required to consider and protect children, as currently required under the legitimate interest basis for processing their data.

Amendment 20 excludes children from the new provisions in Clause 71 on purpose limitation. Purpose limitation is at the heart of the GDPR: if data is sought and consented to for a particular purpose, extending that purpose is problematic. Amendment 21 ensures that, for children at least, the status quo of data protection law stays the same: that is to say, their personal data can be used only for the purpose for which it was originally collected. If the controller wants to use it in a different way, it must go back to the child—or, if the child is under 13, their parent—to ask for further permission.

Finally, Amendment 27 ensures that significant decisions that impact children cannot be made during automated processes unless they are in a child’s best interest. This is a reasonable check and balance on the proposals in Clause 80.

In full, these amendments uphold our collective responsibility to support, protect and make allowances for children as they journey from infancy to adulthood. I met with the Minister and the Bill team, and I thank them for their time. They rightly made the point that children should be participants in the digital world, and I should not seek to exempt them. I suggest to the House that it is the other way round: I will not seek to exempt children if the Government do not seek to put them at risk.

Our responsibility to children is woven into the fabric of our laws, our culture and our behaviour. It has taken two decades to begin to weave childhood into the digital environment, and I am asking the House to make sure we do not take a single retrograde step. The Government have a decision to make. They can choose to please the CEOs of Silicon Valley in the hope that capitulation on regulatory standards will get us a data centre or two; or they can prioritise the best interests of UK children and agree to these amendments, which put children’s needs first. I beg to move.

Baroness Harding of Winscombe (Con)

My Lords, I rise to support all the amendments in this group. I have added my name to Amendments 15, 22, 27 and 45. The only reason my name is not on the other amendments is that others got there before me. As is always the case in our debates on this topic, I do not need to repeat the arguments of the noble Baroness, Lady Kidron. I would just like to make a very high-level point.

Data (Use and Access) Bill [HL]
Moved by
24: Clause 77, page 91, line 16, at end insert—
“(ia) after point (d), insert—“(e) the personal data is from the Open Electoral Register. When personal data from the Open Electoral Register is combined with personal data from other sources to build a profile for direct marketing then transparency obligations must be fulfilled at the point the individual first provides the additional personal data to a data provider. Additional transparency must be provided by organisations using the data for direct marketing via their privacy policy and by including a data notification in a direct mail pack.””
--- Later in debate ---
Baroness Harding of Winscombe (Con)

My Lords, I will speak to Amendment 24 in my name and in the names of the noble Lords, Lord Clement-Jones and Lord Stevenson, and my noble friend Lord Black of Brentwood, all of whom I want to thank for their support. I also welcome government Amendment 49.

Amendment 24 concerns the use of the open electoral register, an issue we debated last year in considering the Data Protection and Digital Information Bill, and through the course of this Bill. Noble Lords may think this a small, technical and unimportant issue—certainly at this time of the evening. I have taken it on because it is emblematic of the challenge we face in this country in growing our economy.

Everyone wants strong economic growth. We know that the Government do. We know that the Chancellor has been challenging all regulators to come up with ideas to create growth. This is an example of a regulator hampering growth, and we in this House have an opportunity to do something about it. Those of us who have run businesses know that often, it is in the detail of the regulation that the dead hand of the state does its greatest damage. Because each change is very detailed and affects only a tiny part of the economy, the changes get through the bureaucracy unnoticed and quietly stifle growth. This is one of those examples.

--- Later in debate ---
Lord Vallance of Balham (Lab)

My Lords, I now turn to government Amendment 49. I thank the noble Lord, Lord Clement-Jones, and other noble Lords for raising the concerns of the charity sector during earlier debates. The Government have also heard from charities and trade associations directly.

This amendment will permit charities to send marketing material—for example, promoting campaigns or fundraising activities—to people who have previously expressed an interest in their charitable purposes, without seeking express consent. Charities will have to provide individuals with a simple means of opting out of receiving direct marketing when their contact details are collected and with every subsequent message sent. The current soft opt-in rule for marketing products and services has similar requirements.

Turning to Amendment 24, I am grateful to the noble Baroness, Lady Harding, for our discussions on this matter. As was said in the debate in Grand Committee, the Government are committed to upholding the principles of transparency. I will try to outline some of that.

I understand that this amendment is about data brokers buying data from the open electoral register and combining it with data they have collected from other sources to build profiles on individuals with the intention of selling them for marketing. Despite what was said in the last debate on this, I am not convinced that all individuals registering on the open electoral register would reasonably expect this kind of profiling or invisible processing using their personal data. If individuals are unaware of the processing, this undermines their ability to exercise their other rights, such as to object to the processing. That point was well made by the noble Lord, Lord Davies.

With regard to the open electoral register, the Government absolutely agree that there are potential benefits to society through its use—indeed, economic growth has been mentioned. Notification is not necessary in all cases. There is, for example, an exemption if notifying the data subject would involve a disproportionate effort and the data was not collected directly from them. The impact on the data subject must be considered when assessing whether the effort is disproportionate. If notification is proportionate, the controller must notify.

The ICO considers that the use and sale of open electoral register data alone is unlikely to require notification. As was set out in Committee, the Government believe that controllers should continue to assess on a case-by-case basis whether cases meet the conditions for the existing disproportionate effort exemption. Moreover, I hope I can reassure the noble Baroness that in the event that the data subject already has the information—from another controller, for example—another exemption from notification applies.

The Government therefore do not see a case for a new exemption for this activity, but as requested by the noble Baroness, Lady Harding, I would be happy to facilitate further engagement between the industry and the ICO to improve a common understanding of how available exemptions are to be applied on a case-by-case basis. I understand that the ICO will use the Bill as an opportunity to take stock of how its guidance can address particular issues that organisations face.

Amendment 50, tabled by the noble Lord, Lord Clement-Jones, seeks to achieve a very similar thing to the government amendment and we studied it when designing our amendment. The key difference is that the government amendment defines which organisations can rely on the new measure and for what purposes, drawing on definitions of “charity” and “charitable purpose” in relevant charities legislation.

I trust that the noble Lord will be content with this government amendment and will feel able not to press his own.

Baroness Harding of Winscombe (Con)

Before the Minister sits down, can I follow up and ask a question about invisible processing? I wonder whether he considers that a better way of addressing potential concerns about invisible processing is improving the privacy notices when people originally sign up for the open electoral register. That would mean making it clear how your data could be used when you say you are happy to be on the open electoral register, rather than creating extra work and potentially confusing communication with people after that. Can the Minister confirm that that would be in scope of potential options and further discussions with the ICO?

Lord Vallance of Balham (Lab)

The further discussions with the ICO are exactly to try to get to these points about the right way to do it. It is important that people know what they are signing up for, and it is equally important that they are aware that they can withdraw at any point. Those points obviously need to be discussed with the industry to make sure that everyone is clear about the rules.

Baroness Harding of Winscombe (Con)

I thank noble Lords for having humoured me in the detail of this debate. I am very pleased to hear that response from the Minister and look forward to ongoing discussions with the ICO and the companies involved. As such, I beg leave to withdraw my amendment.

Amendment 24 withdrawn.
--- Later in debate ---
Baroness Kidron (CB)

My Lords, I shall speak to Amendment 41 in my name and in the names of my noble friend Lord Russell, the noble Baroness, Lady Harding, and the noble Lord, Lord Clement-Jones. The House can be forgiven if it is sensing a bit of déjà-vu, since I have proposed this clause once or twice before. However, since Committee, a couple of things have happened that make the argument for the code more urgent. We have now heard that the Prime Minister thinks that regulating AI is “leaning out” when we should be, as the tech industry likes to say, leaning in. We have had Matt Clifford’s review, which does not mention children even once. In the meantime, we have seen rollout of AI in almost all products and services that children use. In one of the companies—a household name that I will not mention—an employee was so concerned that they rang me to say that nothing had been checked except whether the platform would fall over.

Amendment 41 does not seek to solve what is a global issue of an industry arrogantly flying a little too close to the sun and it does not grasp how we could use this extraordinary technology and put it to use for humankind on a more equitable basis than the current extractive and winner-takes-all model; it is far more modest than that. It simply says that products and services that engage with kids should undertake a mandatory process that considers their specific vulnerabilities related to age. I want to stress this point. When we talk about AI, increasingly we imagine the spectre of diagnostic benefits or the multiple uses of generative models, but of course AI is not new nor confined to these uses. It is all around us and, in particular, it is all around children.

In 2021, Amazon’s AI voice assistant, Alexa, instructed a 10 year-old to touch a live electrical plug with a coin. Last year, Snapchat’s My AI gave adult researchers posing as a 13 year-old girl tips on how to lose her virginity with a 31 year-old. Researchers were also able to obtain tips on how to hide the smell of alcohol and weed and how to conceal Snapchat conversations from their parents. Meanwhile, character.ai is being sued by the mother of a 14 year-old boy in Florida who died by suicide after becoming emotionally attached to a companion bot that encouraged him to commit suicide.

In these cases, the companies in question responded by implementing safety measures after the fact, but how many children have to put their fingers in electrical sockets, injure themselves, take their own lives and so on before we say that those measures should be mandatory? That is all that the proposed code does. It asks that companies consider the ways in which their products may impact on children and, having considered them, take steps to mitigate known risk and put procedures in place to deal with emerging risks.

One of the frustrating things about being an advocate for children in the digital world is how much time I spend articulating avoidable harms. The sorts of solutions that come after the event, or suggestions that we ban children from products and services, take away from the fact that the vast majority of products and services could, with a little forethought, be places of education, entertainment and personal growth for children. However, children are by definition not fully mature, which puts them at risk. They chat with smart speakers, disclosing details that grown-ups might consider private. One study found that three to six year-olds believed that smart speakers have thoughts, feelings and social abilities and are more reliable than human beings when it came to answering fact-based questions.

I ask the Minister: should we ban children from the kitchen or living room in which the smart speaker lives, or demand, as we do of every other product and service, minimum standards of product safety based on the broad principle that we have a collective obligation to the safety and well-being of children? An AI code is not a stretch for the Bill. It is a bare minimum.

Baroness Harding of Winscombe (Con)

My Lords, I will speak very briefly, given the hour, just to reinforce three things that I have said as the wingman to the noble Baroness, Lady Kidron, many times, sadly, in this Chamber in child safety debates. The age-appropriate design code that we worked on together and which she championed a decade ago has driven real change. So we have evidence that setting in place codes of conduct that require technology companies to think in advance about the potential harms of their technologies genuinely drives change. That is point one.

Point two is that we all know that AI is a foundational technology which is already transforming the services that our children use. So we should be applying to this foundational technology the same principle that was so hard fought 10 years ago for non-AI digital. We know that, however well meaning, technology companies’ development stacks are always contended. They always have more good things that they think they can do to improve their products for their consumers, and that will make them money, than they have the resources to do. However much money they have, their stacks are simply contended. That is the nature of technology businesses. This means that they never get to the safety-by-design issues unless they are required to. It was no different 150 or 200 years ago as electricity was rolling through the factories of the mill towns in the north of England. It required health and safety legislation. AI requires health and safety legislation. You start with codes of conduct and then you move forward, and I really do not think that we can wait.

Viscount Camrose (Con)

My Lords, Amendment 41 aims to establish a code of practice for the use of children’s data in the development of AI technologies. In the face of rapidly advancing AI, it is, of course, crucial that we ensure children’s data is handled with the utmost care, prioritising their best interests and fundamental rights. We agree that AI systems that are likely to impact children should be designed to be safe and ethical by default. This code of practice will be instrumental in guiding data controllers to ensure that AI development and deployment reflect the specific needs and vulnerabilities of children.

However, although we support the intent behind the amendment, we have concerns, which echo concerns on amendments in a previous group, about the explicit reference to the UN Convention on the Rights of the Child and general comment 25. I will not rehearse my comments from earlier groups, except to say that it is so important that we do not have these explicit links to international frameworks, important as they are, in UK legislation.

In the light of this, although we firmly support the overall aim of safeguarding children’s data in AI, we believe this can be achieved more effectively by focusing on UK legal principles and ensuring that the code of practice is rooted in our domestic context.

Data (Use and Access) Bill [HL]

Finally, I just want to say to the House that I was, by chance, on a call with children from all across the world at the weekend, and their primary concern was that technology, including AI, was shaping their world for the worse. Children are asking that school be a place of security, safety and freedom, without the extractive or pushy qualities that characterise tech in the rest of their lives. I hope the Minister is willing to commit to that when he responds. I beg to move.
Baroness Harding of Winscombe (Con)

My Lords, I support the amendment in the name of the noble Baroness, Lady Kidron, to which I have added my name. I will speak briefly because I wish to associate myself with everything that she has said, as is normal on these topics.

Those of us who worked long and hard on the Online Safety Act had our fingers burnt quite badly when things were not written into the Bill. While I am pleased—and expect to be even more pleased in a few minutes—that the Government are in favour of some form of code of conduct for edtech, whether through the age-appropriate design code or not, I am nervous. As the noble Baroness, Lady Kidron, said, every day with Ofcom we are seeing the risk-aversion of our regulators in this digital space. Who can blame them when it appears to be the flavour of the month to say that, if only the regulators changed the way they behave, growth would magically come? We have to be really mindful that, if we ask the ICO to do this vaguely, we will not get what we need.

The noble Baroness, Lady Kidron, as ever, makes a very clear case for why it is needed. I would ask the Minister to be absolutely explicit about the Government’s intention, so that we are giving very clear directions from this House to the regulator.

Lord Clement-Jones (LD)

My Lords, it is a pleasure to follow the noble Baroness, Lady Harding. I have added a few further words to my speech in response, because she made an extremely good point. I pay tribute to the noble Baroness, Lady Kidron, and her tenacity in trying to make sure that we secure a code for children’s data and education, which is so needed. The education sector presents unique challenges for protecting children’s data.

Like the noble Baronesses, Lady Kidron and Lady Harding, I look forward to what the Minister has to say. I hope that whatever is agreed is explicit; I entirely agree with the noble Baroness, Lady Harding. I had my own conversation with the Minister about Ofcom’s approach to categorisation which, quite frankly, does not follow what we thought the Online Safety Act was going to imply. It is really important that we absolutely tie down what the Minister has to say.

The education sector is a complex environment. The existing regulatory environment does not adequately address the unique challenges posed by edtech, as we call it, and the increasing use of children’s data in education. I very much echo what the noble Baroness, Lady Kidron, said: children attend school for education, not to be exploited for data mining. Like her, I cross over into considering the issues related to the AI and IP consultation.

The worst-case scenario is an opt-in system that might incentivise learners or parents to consent, whether that is to state educational institutions, companies such as Pearson, exam boards or any other entity. I hope that, in the other part of the forest, so to speak, that will not take place to the detriment of children. In the meantime, I very much look forward to what the Minister has to say on Amendment 44.

--- Later in debate ---
The Earl of Clancarty (CB)

My Lords, I rise briefly in support of my noble friend Lady Kidron’s important amendments. I declare an interest as a visual artist.

I want to pick up on the language that Rachel Reeves used in conversation with Laura Kuenssberg on her Sunday programme, when she talked about getting the balance right. It needs to be emphasised that it is not a question of balance between the tech companies and the creative industries but a question about the use of data, and the consideration of the origin of that data should be central to a Bill about access to data. That is critical. It is perhaps ironic that at the heart of this there is a void, which is the lack of data about data, as my noble friend Lord Colville showed clearly in his speech. The creative industries themselves successfully use AI. As Paul McCartney pointed out on the same Laura Kuenssberg programme, in his case he did so by actively seeking and obtaining permission for the use of data, as everyone should. These amendments are wholly reasonable and do what the creative industries are asking for. If the Government do not accept them, I shall certainly vote for them.

Baroness Harding of Winscombe (Con)

My Lords, I also support these amendments so brilliantly introduced by the noble Baroness, Lady Kidron. As a just-finishing member of the Communications and Digital Committee, I, too, associate myself with everything that our departing chair has just said so ably.

I am a lover of the book Why Nations Fail, written by two Nobel laureates. It charts how countries succeed and fail in adopting technology. There are two important lessons in that book. The first is that one must not turn one’s back on the technology. As we consider this very difficult issue, it is important to say that those of us in favour of these amendments are not trying to be the German boatman sinking the first steamboat, the Ottoman Empire turning its back on the printing press or the hand knitters objecting to knitting machines in Elizabethan times. We embrace AI. It will transform society for the good. That is the first important point.

The second lesson that Why Nations Fail teaches us is that, even as one embraces technology, the rule of law, property rights and giving people certainty over what they create and own are among the essential ingredients of success in harnessing the benefits of technology. That is why this issue matters so much. I, too, rewrote my brief remarks overnight on the back of the DeepSeek launch yesterday. I was struck by the panic among those in Silicon Valley, who thought, “Oh, my God. Is it possible that the Chinese have stolen OpenAI’s IP in order to create a better product?” Gosh, has Silicon Valley for a moment begun to feel what creative copyright owners have been feeling for several years? Actually, the valley is learning that certainty of copyright is an important part of driving growth in the adoption of technology.

Another interesting thing happens when you ask DeepSeek what happened in Tiananmen Square in 1989. It will not tell you, so it is clear that these supposed black boxes can be quite specific about what they include and exclude. That gives me confidence, as a non-technologist, that if we give the technology companies the challenge of creating simple mechanisms for copyright owners, they will jolly well do it, because they can definitely do it when they want to exclude content from models today.