Moved by
24: Clause 77, page 91, line 16, at end insert—
“(ia) after point (d), insert—“(e) the personal data is from the Open Electoral Register. When personal data from the Open Electoral Register is combined with personal data from other sources to build a profile for direct marketing then transparency obligations must be fulfilled at the point the individual first provides the additional personal data to a data provider. Additional transparency must be provided by organisations using the data for direct marketing via their privacy policy and by including a data notification in a direct mail pack.””
--- Later in debate ---
Baroness Harding of Winscombe (Con)

My Lords, I will speak to Amendment 24 in my name and in the names of the noble Lords, Lord Clement-Jones and Lord Stevenson, and my noble friend Lord Black of Brentwood, all of whom I want to thank for their support. I also welcome government Amendment 49.

Amendment 24 concerns the use of the open electoral register, an issue we debated last year in considering the Data Protection and Digital Information Bill, and through the course of this Bill. Noble Lords may think this a small, technical and unimportant issue—certainly at this time of the evening. I have taken it on because it is emblematic of the challenge we face in this country in growing our economy.

Everyone wants strong economic growth. We know that the Government do. We know that the Chancellor has been challenging all regulators to come up with ideas to create growth. This is an example of a regulator hampering growth, and we in this House have an opportunity to do something about it. Those of us who have run businesses know that often, it is in the detail of the regulation that the dead hand of the state does its greatest damage. Because each change is very detailed and affects only a tiny part of the economy, the changes get through the bureaucracy unnoticed and quietly stifle growth. This is one of those examples.

--- Later in debate ---
Lord Vallance of Balham (Lab)

My Lords, I now turn to government Amendment 49. I thank the noble Lord, Lord Clement-Jones, and other noble Lords for raising the concerns of the charity sector during earlier debates. The Government have also heard from charities and trade associations directly.

This amendment will permit charities to send marketing material—for example, promoting campaigns or fundraising activities—to people who have previously expressed an interest in their charitable purposes, without seeking express consent. Charities will have to provide individuals with a simple means of opting out of receiving direct marketing when their contact details are collected and with every subsequent message sent. The current soft opt-in rule for marketing products and services has similar requirements.

Turning to Amendment 24, I am grateful to the noble Baroness, Lady Harding, for our discussions on this matter. As was said in the debate in Grand Committee, the Government are committed to upholding the principles of transparency. I will try to outline some of that.

I understand that this amendment is about data brokers buying data from the open electoral register and combining it with data they have collected from other sources to build profiles on individuals with the intention of selling them for marketing. Despite what was said in the last debate on this, I am not convinced that all individuals registering on the open electoral register would reasonably expect this kind of profiling or invisible processing using their personal data. If individuals are unaware of the processing, this undermines their ability to exercise their other rights, such as to object to the processing. That point was well made by the noble Lord, Lord Davies.

With regard to the open electoral register, the Government absolutely agree that there are potential benefits to society through its use—indeed, economic growth has been mentioned. Notification is not necessary in all cases. There is, for example, an exemption if notifying the data subject would involve a disproportionate effort and the data was not collected directly from them. The impact on the data subject must be considered when assessing whether the effort is disproportionate. If notification is proportionate, the controller must notify.

The ICO considers that the use and sale of open electoral register data alone is unlikely to require notification. As was set out in Committee, the Government believe that controllers should continue to assess on a case-by-case basis whether cases meet the conditions for the existing disproportionate effort exemption. Moreover, I hope I can reassure the noble Baroness that in the event that the data subject already has the information—from another controller, for example—another exemption from notification applies.

The Government therefore do not see a case for a new exemption for this activity, but as requested by the noble Baroness, Lady Harding, I would be happy to facilitate further engagement between the industry and the ICO to improve a common understanding of how available exemptions are to be applied on a case-by-case basis. I understand that the ICO will use the Bill as an opportunity to take stock of how its guidance can address particular issues that organisations face.

Amendment 50, tabled by the noble Lord, Lord Clement-Jones, seeks to achieve a very similar thing to the government amendment and we studied it when designing our amendment. The key difference is that the government amendment defines which organisations can rely on the new measure and for what purposes, drawing on definitions of “charity” and “charitable purpose” in relevant charities legislation.

I trust that the noble Lord will be content with this government amendment and will feel able not to press his own.

Baroness Harding of Winscombe (Con)

Before the Minister sits down, can I follow up and ask a question about invisible processing? I wonder whether he considers that a better way of addressing potential concerns about invisible processing is improving the privacy notices when people originally sign up for the open electoral register. That would mean making it clear how your data could be used when you say you are happy to be on the open electoral register, rather than creating extra work and potentially confusing communication with people after that. Can the Minister confirm that that would be in scope of potential options and further discussions with the ICO?

Lord Vallance of Balham (Lab)

The further discussions with the ICO are exactly to try to get to these points about the right way to do it. It is important that people know what they are signing up for, and it is equally important that they are aware that they can withdraw at any point. Those points obviously need to be discussed with the industry to make sure that everyone is clear about the rules.

Baroness Harding of Winscombe (Con)

I thank noble Lords for having humoured me in the detail of this debate. I am very pleased to hear that response from the Minister and look forward to ongoing discussions with the ICO and the companies involved. As such, I beg leave to withdraw my amendment.

Amendment 24 withdrawn.
--- Later in debate ---
Baroness Kidron (CB)

My Lords, I shall speak to Amendment 41 in my name and in the names of my noble friend Lord Russell, the noble Baroness, Lady Harding, and the noble Lord, Lord Clement-Jones. The House can be forgiven if it is sensing a bit of déjà-vu, since I have proposed this clause once or twice before. However, since Committee, a couple of things have happened that make the argument for the code more urgent. We have now heard that the Prime Minister thinks that regulating AI is “leaning out” when we should be, as the tech industry likes to say, leaning in. We have had Matt Clifford’s review, which does not mention children even once. In the meantime, we have seen rollout of AI in almost all products and services that children use. In one of the companies—a household name that I will not mention—an employee was so concerned that they rang me to say that nothing had been checked except whether the platform would fall over.

Amendment 41 does not seek to solve what is a global issue of an industry arrogantly flying a little too close to the sun and it does not grasp how we could use this extraordinary technology and put it to use for humankind on a more equitable basis than the current extractive and winner-takes-all model; it is far more modest than that. It simply says that products and services that engage with kids should undertake a mandatory process that considers their specific vulnerabilities related to age. I want to stress this point. When we talk about AI, increasingly we imagine the spectre of diagnostic benefits or the multiple uses of generative models, but of course AI is not new nor confined to these uses. It is all around us and, in particular, it is all around children.

In 2021, Amazon’s AI voice assistant, Alexa, instructed a 10 year-old to touch a live electrical plug with a coin. Last year, Snapchat’s My AI gave adult researchers posing as a 13 year-old girl tips on how to lose her virginity with a 31 year-old. Researchers were also able to obtain tips on how to hide the smell of alcohol and weed and how to conceal Snapchat conversations from their parents. Meanwhile, character.ai is being sued by the mother of a 14 year-old boy in Florida who died by suicide after becoming emotionally attached to a companion bot that encouraged him to commit suicide.

In these cases, the companies in question responded by implementing safety measures after the fact, but how many children have to put their fingers in electrical sockets, injure themselves, take their own lives and so on before we say that those measures should be mandatory? That is all that the proposed code does. It asks that companies consider the ways in which their products may impact on children and, having considered them, take steps to mitigate known risk and put procedures in place to deal with emerging risks.

One of the frustrating things about being an advocate for children in the digital world is how much time I spend articulating avoidable harms. The sorts of solutions that come after the event, or suggestions that we ban children from products and services, take away from the fact that the vast majority of products and services could, with a little forethought, be places of education, entertainment and personal growth for children. However, children are by definition not fully mature, which puts them at risk. They chat with smart speakers, disclosing details that grown-ups might consider private. One study found that three to six year-olds believed that smart speakers have thoughts, feelings and social abilities and are more reliable than human beings when it came to answering fact-based questions.

I ask the Minister: should we ban children from the kitchen or living room in which the smart speaker lives, or demand, as we do of every other product and service, minimum standards of product safety based on the broad principle that we have a collective obligation to the safety and well-being of children? An AI code is not a stretch for the Bill. It is a bare minimum.

Baroness Harding of Winscombe (Con)

My Lords, I will speak very briefly, given the hour, just to reinforce three things that I have said as the wingman to the noble Baroness, Lady Kidron, many times, sadly, in this Chamber in child safety debates. The age-appropriate design code that we worked on together and which she championed a decade ago has driven real change. So we have evidence that setting in place codes of conduct that require technology companies to think in advance about the potential harms of their technologies genuinely drives change. That is point one.

Point two is that we all know that AI is a foundational technology which is already transforming the services that our children use. So we should be applying that same principle that was so hard fought 10 years ago for non-AI digital to this foundational technology. We know that, however well meaning, technology companies' development stacks are always contended. They always have more good things that they think they can do to improve their products for their consumers, and that will make them money, than they have the resources to do. However much money they have, they are just contended. That is the nature of technology businesses. This means that they never get to the safety-by-design issues unless they are required to. It was no different 150 or 200 years ago as electricity was rolling through the factories of the mill towns in the north of England. It required health and safety legislation. AI requires health and safety legislation. You start with codes of conduct and then you move forward, and I really do not think that we can wait.

Viscount Camrose (Con)

My Lords, Amendment 41 aims to establish a code of practice for the use of children’s data in the development of AI technologies. In the face of rapidly advancing AI, it is, of course, crucial that we ensure children’s data is handled with the utmost care, prioritising their best interests and fundamental rights. We agree that AI systems that are likely to impact children should be designed to be safe and ethical by default. This code of practice will be instrumental in guiding data controllers to ensure that AI development and deployment reflect the specific needs and vulnerabilities of children.

However, although we support the intent behind the amendment, we have concerns, which echo concerns on amendments in a previous group, about the explicit reference to the UN Convention on the Rights of the Child and general comment 25. I will not rehearse my comments from earlier groups, except to say that it is so important that we do not have these explicit links to international frameworks, important as they are, in UK legislation.

In the light of this, although we firmly support the overall aim of safeguarding children’s data in AI, we believe this can be achieved more effectively by focusing on UK legal principles and ensuring that the code of practice is rooted in our domestic context.