My Lords, I add my congratulations to the noble Baroness, Lady Owen, for her skill and persistence in persuading the Government to address this noxious practice, which is causing so many women so much distress and humiliation. It is outrageous that this is still not unlawful.
I very much welcome what the Minister said, and I will press him on four matters. I hope that I understood him correctly when he said that the criminal offence will include solicitation in this country of the creation of these images abroad. I see that he is nodding—I am grateful. This is vital for this provision because, unless the criminal offence in this country covers such matters, the mischief will continue, as the Minister recognises. I can see no difficulty in terms of constitutional theory or practice or international law, because there are many offences in the criminal calendar where what is criminalised is conduct in this country, even though part of the matter that causes concern occurs abroad. I am very grateful to the Minister.
Secondly—and I hope I understood the Minister correctly—he said that the Government’s amendments will contain no intent element other than intent to create the image. That is very important. If the prosecution has to establish some other intent, that will enable defendants to come up with all sorts of spurious explanations such as, “It was not my intent” and “I didn’t realise that it would have this effect”, which would frustrate the purpose. I think that is what the Minister said, and I would welcome confirmation on that important point.
I would also welcome confirmation on another point. Another “intent”—intent to cause alarm, distress or humiliation—is in his Amendment 56A, which I of course appreciate will not be pursued in due course. Does the Minister’s statement that no other intent than intent to create the image will be required also cover the other element, which is in Amendment 56A? That also requires the prosecution to prove, as an alternative, the purpose for which these images are created. It has to be proved under Amendment 56A that the purpose is to obtain sexual gratification. The defendant will inevitably say that it is not their purpose. Could the Minister confirm that that will not be replicated in the amendment that will eventually be brought forward? I see the Minister nodding, and I am grateful to him.
Thirdly, the Minister referred to what will be in the amendment that will eventually be brought forward. If I understood him correctly, there will be a defence of reasonable excuse. The Minister confirms that that is what he said. I have great difficulty in understanding in what circumstances a defendant could have a reasonable excuse for creating or soliciting a fake image of a sexual nature without either the consent of the victim or, at the very least, a reasonable belief by the defendant that the victim had consented. Can the Minister give us an example of where the image has been created or solicited and the defendant does not believe that the woman has consented, or does not have a reasonable belief that the woman has consented, but there is nevertheless a reasonable excuse for this conduct? I cannot think of one. I am not expecting an answer from the Minister today, but if his amendment contains the reasonable excuse defence, I for one will be pressing him on it.
Fourthly and finally, I understood the Minister to give a commitment, not that the amendment will be ready in time necessarily for Third Reading, but that it will be ready and introduced during the passage of this Bill through Parliament. My understanding is that there is no question of this being kicked into the long grass. We have a commitment that the Government will propose legislation in the course of parliamentary consideration of this Bill. If I am right on that—again, I saw the Minister nodding—I very much hope that the noble Baroness, Lady Owen, will not feel it necessary to press her amendment this evening. She has made enormous progress on this, which is much welcomed around the House. It would be much better, would it not, to preserve and reserve her position for Third Reading, if she needs to bring the matter back then?
My Lords, it is such a pleasure briefly to follow my noble friend Lord Pannick; not for the first time I thought that, if I ever get in trouble, I know who I will go to.
I record my admiration for the noble Baroness, Lady Owen. She has fought a just and forensic fight and she has mastered the gift of the House of Lords very rapidly. I also thank the Minister, Sarah Sackman, for the meeting on this subject and for agreeing to look again and again at the issue of intent and consent, which is something that those of us who have been in the world of sexual offences really must insist on, so I was delighted to hear from the noble Baroness and the Minister that that is somewhat resolved.
My Lords, I will speak to both amendments in this group. Amendment 58, which is in my name and those of my noble friend Lord Tarassenko, and the noble Lords, Lord Stevenson and Lord Clement-Jones, seeks to ensure that the value of our publicly held large datasets is realised for the benefit of UK citizens. A full explanation of the amendment can be found at col. 162GC of Hansard. Amendment 71 is new and has a different approach to many of the same ends.
The speed at which the Government are giving access to our data is outpacing their plans to protect its financial or societal value. As we have seen over the last 24 hours, in which $1 trillion was wiped off the US AI sector and China provided a next-gen proposal for AI—at least for the moment—technology moves at pace, but data is still the gold dust on which it rests.
Amendment 58 would require the Government to act as custodian of that vital asset. For example, they would need to decide the criteria for assigning publicly held data sets as a sovereign data asset, secure a valuation for that asset, and then be accountable for the decisions they took to protect that value and generate a return, both financial and societal, on behalf of the British public.
When that idea was proposed at Second Reading, the Minister said the Government’s proposal for a national data library would address those concerns. However, despite requests for further information from noble Lords in Committee, the Minister did not elaborate. That is a source of considerable frustration, given that in the same period no fewer than seven recommendations for the national data library in Matt Clifford’s AI action plan were fully accepted and widely trumpeted by the Government—including giving away BBC assets for free without asking the BBC.
I am grateful to the noble Baroness, Lady Kidron, and the noble Lord, Lord Tarassenko, for Amendments 58 and 71, one of which we also considered in Committee. I suspect that we are about to enter an area of broad agreement here. This is a very active policy area, and noble Lords are of course asking exactly the right questions of us. They are right to emphasise the need for speed.
I agree that it is essential that we ensure that legal and policy frameworks are fit for purpose for the modern demands and uses of data. This Government have been clear that they want to maximise the societal benefits from public sector data assets. I said in the House very recently that we need to ensure good data collection, high-quality curation and security, interoperability and ways of valuing data that secure appropriate value returns to the public sector.
On Amendment 58, my officials are considering how we approach the increased demand and opportunity of data, not just public sector data but data across our economy. This is so that we can benefit from the productivity and growth gains of improvements to access to data, and harness the opportunities, which are often greater when different datasets are combined. As part of this, we sought public views on this area as part of the industrial strategy consultation last year. We are examining our current approach to data licensing, data valuation and the legal framework that governs data sharing in the public sector.
Given the complexity, we need to do this in a considered manner, but we of course need to move quickly. Crucially, we must not betray the trust of people or the trust of those responsible for managing and safeguarding these precious data assets. From my time as chair of the Natural History Museum, I am aware that museums and galleries are considering approaches to this very carefully. The noble Lord, Lord Lucas, may well be interested to see some of the work going on there on biodiversity datasets, where there are huge collections of great value against which we actually did put a value.
Of course, this issue cuts across the public sector, including colleagues from the Geospatial Commission, NHS, DHSC, National Archives, Department for Education, Ordnance Survey and Met Office, for example. My officials and I are very open to discussing the policy issues with noble Lords. I recently introduced the noble Lord, Lord Tarassenko, to officials from NHSE dealing with the data side of things there and linked him with the national data library to seek his input. As was referred to, yesterday, the noble Baroness, Lady Kidron, the noble Lords, Lord Clement-Jones, Lord Tarassenko and Lord Stevenson, and the noble Viscount, Lord Camrose, all met officials, and we remain open to continuing such in-depth conversations. I hope the noble Baroness appreciates that this is an area with active policy development and a key priority for the Government.
Turning to Amendment 71, also from the noble Baroness, I agree that the national data library represents an enormous opportunity for the United Kingdom to unlock the full value of our public data. I agree that the protection and care of our national data is essential. The scope of the national data library is not yet finalised, so it is not possible to confirm whether a new statutory body or specific statutory functions are the right way to do this. Our approach to the national data library will be guided by the principles of public law and the requirements of the UK’s data protection legislation, including the data protection principles and data subject rights. This will ensure that data sharing is fair, secure and preserves privacy. It will also ensure that we have clear mechanisms for both valuation and value capture. We have already sought, and continue to seek, advice from experts on these issues, including work from the independent Prime Minister’s Council for Science and Technology. The noble Lord, Lord Freyberg, also referred to the work that I was involved with previously at the Tony Blair Institute.
The NDL is still in the early stages of development. Establishing it on a statutory footing at this point would be inappropriate, as work on its design is currently under way. We will engage and consult with a broad range of stakeholders on the national data library in due course, including Members of both Houses.
The Government recognise that our data and its underpinning infrastructure is a strategic national asset. Indeed, it is for that reason that we started by designating the data centres as critical national infrastructure. As the subjects of these amendments remain an active area of policy development, I ask the noble Baroness to withdraw her amendment.
I am grateful for an outbreak of agreement at this time of night; that is delightful. I agree with everything that the Minister said, but one thing we have not mentioned is the incredible cost of managing the data and the investment required. I support the Government investing to get the value out, as I believe other noble Lords do, and I would just like to put that point on record.
We had a meeting yesterday and thought it was going to be about data assets, but it turned out to be about data communities, which we had debated the week before. Officials said that it was incredibly useful, and it might have been a lot quicker if they had had it earlier. In echoing what was said in the amendment of the noble Baroness, Lady Owen, there is considerable interest and expertise, and I would love to see the Government move faster, possibly with the help of noble Lords. With that, I beg leave to withdraw the amendment.
My Lords, I move Amendment 68 in my name and those of the noble Lords, Lord Arbuthnot, Lord Holmes and Lord Clement-Jones. This amendment has been debated several times within this Bill and its predecessor; however, this version differs slightly in approach. The objective remains the same: to overturn the common-law assumption in both civil and criminal law that computers are infallible.
This assumption has led to untold injustice. Innocent people have lost their lives, freedom and livelihoods because the law wrongly assumed that computers are never wrong. This of course is nonsense, as explained in detail in our last debate, at column GC 153 of Hansard. In summary, computer systems are very susceptible to both human and technological error. Indeed, the presence of bugs is normal, anticipated and routine in all contexts other than the court.
As with previous iterations of this amendment, Amendment 68 overturns that common-law assumption, but the drafting now closely mirrors provisions under the Electronic Trade Documents Act 2023, which was enacted in recognition that the majority of trade documents are now electronic.
The ETDA ensures and assures the integrity of electronic trade documents. It was put in place to protect those on both sides of the trade, so I am curious, at the very least, as to why we will be able to consider the efficacy of computer evidence in relation to trade but not in our legal system. I am also concerned that the MoJ, under several Governments, has been so slow to recognise the scale of the problem of this assumption, which one of my most experienced computer science colleagues described as “wicked nonsense”.
In brief, the amendment provides that the electronic evidence produced by or derived from a computer may be relied upon as evidence where that evidence is not challenged and where the court is satisfied that the evidence can be relied upon. The rest of the amendment is carefully drafted by legal experts and computer scientists with legal expertise to support the court in coming to a meaningful assessment of whether to be satisfied, or not, that the evidence can be relied upon.
This proposal has been tried and tested within our legal system. We know that it works, and I therefore see no reason why the Government should not simply accept it. However, rather than discuss it, the Government chose to announce, last week, a consultation on computer evidence. The call for evidence is a source of significant frustration for those of us who have championed this issue, as is the fact that the promised meeting with the MoJ did not happen before that announcement, in spite of repeated requests.
In her introductory remarks to the consultation, the Minister for Justice, Sarah Sackman, says that the purpose of the consultation is to help her department
“better understand how the current presumption concerning the admissibility of computer evidence is working in practice, and whether it is fit for purpose in the modern world”.
This is a backward step. The evidence that the presumption is not working and is not fit for purpose is overwhelming and decades long; what are needed now are solutions, one of which is before us tonight.
Moreover, the Government’s preference for doing everything behind closed doors has sunk their own consultation. Had experts been consulted, the first thing they would have pointed out is that the scope is insufficient because it does not address civil proceedings but only criminal proceedings, even though the presumption is the same for both. This means that, at best, the Government’s consultation can lead only to a partial solution.
We in this House have discussed this issue in the case of the postmasters; it is a case that is front of mind. This approach may have spared those postmasters who were subject to criminal prosecutions, but not those such as Lee Castleton who was subject to civil proceedings by the Post Office, which chased him to bankruptcy. He was also branded a thief, spat at and verbally abused in the street. He developed post-traumatic stress disorder. His wife developed epilepsy from stress, his daughter developed an eating disorder and his son remains so traumatised that he cannot be in a room where someone says the words “Post Office”. A solution that does not prevent the injustice done to Lee and his family from happening to others is not fit for purpose. If the MoJ had done us the courtesy of a meeting, this could have been avoided.
I am sure the Minister will assure us that the Government are acting, but for those whose lives have been ruined, those who have fought for too many years on this issue, the consultation creates the spectre of yet another battle and further delay when the solutions are here and at hand. I want nothing more than to be wrong on this, and for the Government to prove me wrong. But for past victims, for lawyers and experts who have given their time so generously, and for those whose lives will be ruined because the computer got it wrong, half a consultation on a matter so well-established and urgent is a pretty poor result. I beg to move.
My Lords, as so often, I listened with awe to the noble Baroness. Apart from saying that I agree with her wholeheartedly, which I do, there is really no need for me to add anything, so I will not.
Amendment 68 from the noble Baroness, Lady Kidron, aims to prevent future miscarriages of justice, such as the appalling Horizon scandal. I thank the noble Baroness and, of course, the noble Lord, Lord Arbuthnot, for the commitment to ensuring that this important issue is debated. The Government absolutely recognise that the law in this area needs to be reviewed. Noble Lords will of course be aware that any changes to the legal position would have significant ramifications for the whole justice system and are well beyond the scope of this Bill.
I am glad to be able to update the noble Baroness on this topic since Committee. On 21 January the Ministry of Justice launched a call for evidence on this subject. That will close on 15 April, and next steps will be set out immediately afterwards. That will ensure that any changes to the law are informed by expert evidence. I take the point that there is a lot of evidence already available, but input is also needed to address the concerns of the Serious Fraud Office and the Crown Prosecution Service, and I am sure they will consider the important issues raised in this amendment.
I hope the noble Baroness appreciates the steps that the Ministry of Justice has taken on this issue. The MoJ will certainly be willing to meet any noble Lords that wish to do so. As such, I hope she feels content to withdraw the amendment.
The Minister did not quite address my point that the consultation is not broad enough in scope, but I will accept the offer of a meeting. Although the noble Lord, Lord Arbuthnot, spoke very briefly, he is my partner in crime on this issue; indeed, he is a great campaigner for the postmasters and has done very much. So I say to the Minister: yes, I will have the meeting, but could it happen this time? With that, I beg leave to withdraw the amendment.
My Lords, Amendment 44 in my name and those of the noble Lords, Lord Russell and Lord Clement-Jones, and the noble Baroness, Lady Harding, proposes a statutory code of practice on children’s education to ensure that children benefit from heightened protections when their data is processed for purposes relating to education.
My understanding is that, when the Minister stands up, he will tell us that the Secretary of State is going to write to the ICO and require him to either write such a code or, if it is more practical, extend the AADC to cover educational settings. The either/or is because the Government say that the ICO is undertaking a consultation on edtech and DSIT is doing a consultation on AI, both of which have ramifications for children’s data at school.
Rather than make the argument for the amendment as written, I shall put on record for the department and the ICO the expectations of such a code. I hope that the Minister concurs with this list and that he will ensure that the ICO works with me and expert colleagues in the field to look at and respond to the evidence and ensure that the code addresses our concerns.
The code must apply to education provided in school settings but also outside the classroom—for example, when children use edtech products to complete homework set by school or for independent learning. The code must consider all aspects of the provision of education, including safeguarding and administration, as well as learning. The code should take as a starting point that children merit heightened protections and consider the needs of children at different ages and stages. The code should provide specific guidance on profiling, including predictions that may impact on children’s educational opportunities or outcomes. The code should require the ICO or the DfE to work with third parties to develop certification and accreditation schemes to support educators and parents in choosing products and services that are safe and private and improve learning outcomes. Lastly, in drawing up the code, the ICO must consult with children, parents, educators, devolved Governments and industry.
I also want to put on record that “school” means an entity that provides education to children in the UK. Importantly, that includes early-years providers, nursery schools, primary schools and so on, because often early years are left out of this equation.
May I ask for a commitment from the Dispatch Box that, when the order is complete and some of those conversations are being discussed, we can have a meeting with the ICO, the DfE and noble Lords who have fought for this since 2018?
I am very happy to give that commitment. That would be an important and useful meeting.
I thank the Minister and the Government. As I have just said, we have been fighting for this since 2018, so that is quite something. I forgot to say in my opening remarks that edtech does not, of course, have an absolute definition. However, in my mind—it is important for me to say this to the House—it includes management, safety and tech that is used for educational purposes. All those are in schools, and we have evidence of problems with all of them. I was absolutely delighted to hear the Government’s commitments, and I look forward to working with the ICO and the department. With that, I beg leave to withdraw.
In moving Amendment 44A, I shall also speak to Amendments 61 to 65 in my name and the names of the noble Lords, Lord Stevenson and Lord Clement-Jones, and my noble friend Lord Freyberg. I registered my interests in Committee, but I begin by restating that I am a copyright holder. I am married to a copyright holder, and I have deep connections with many in the creative communities who are impacted by this issue. I am also an adviser to the Institute for Ethics in AI at Oxford, and I have the pleasure and privilege of working alongside dozens of people whose businesses and academic interests relate solely to AI.
Until last night, I had a very technical argument about the amendments—about what they would do and how they would work. But I sat in the Gallery of the other place for several hours last night to listen to the debate on the creative industries, and I listened virtually to what I did not see from the Gallery, and it really made me reconsider my approach today. It was striking that, whether on the Green, DUP, Liberal Democrat or Conservative Benches, or the Government’s own Back Benches, the single biggest concern in a debate that ran for hours, with many speakers, was the question of copyright and AI. Indeed, it figured in all but two or three speeches. Moreover, as I sat in the Gallery and people started to read from the Times, the Mail, Politico and the tech blogs, an increasing flow of MPs, many on the Government’s own Benches and some actually in the Government, texted me to say that their leadership was wrong and they hoped this fight would be won for the UK’s creative industries.
It is a very great privilege to be on these Benches and never have to vote against the Whip. But I say to my friends and colleagues on all sides that, as we debate today, hundreds of organisations and many individual rights holders are watching. They are watching to see what this House will do in the face of a government proposal that will transfer their hard-earned property from them to another sector without compensation, and with it their possibility of a creative life, or a creative life for the next generation.
The Government are doing this not because the current law does not protect intellectual property rights, nor because they do not understand the devastation it will cause, but because they are hooked on the delusion that the UK’s best interests and economic future align with those of Silicon Valley. The Minister will say to the House that a consultation is ongoing and we should wait for the results. This was the same line the Minister in the other place, Chris Bryant, took last night; he said that his door and his mind were open. If that is the case, I would like to know why the honourable Dame Caroline Dinenage, the chair of the Commons Select Committee, said she felt “gaslit” by Ministers and the Secretary of State.
I would also like to know why the Minister in charge of the Bill in the other place has refused a meeting with me twice and why the creative industries say that they get blandishments from a junior Minister while the AI companies get the undivided attention of the Secretary of State. Most importantly, the assertion that the consultation is open and fair is critically undermined because it was launched with a preferred option. For the record, the Government’s preferred option is to give away the property rights of those who earned them on the promise of growth, growth, growth to the nation. Unfortunately, the Government cannot say to whom that growth will accrue or how much it will be. But the one thing they are absolutely sure of—Government, Opposition, AI companies and those whose property rights the Government are giving away—is that it will not accrue to the creative industries.
We have before us the most extraordinary sight of a Labour Government transferring wealth directly from 2.4 million individual creatives, SMEs and UK brands on the promise of vague riches in the future. Before I turn to the Opposition—which I will—I make it clear that there is a role in our economy for AI, there is a role in our economy for companies headquartered elsewhere, there is a role in our economy for new AI models and there is an opportunity of growth in the combination of AI and creative industries. But this forced marriage, on slave terms, is not it.
We have the Government, putting growth front and centre, stunting one of their most lucrative industries, and the equally extraordinary sight of the Conservative Opposition for the most part sitting on their hands, against the wishes of many in their tribe, putting party ahead of country because they prefer to have proof of the Government’s economic incompetence rather than protect the property rights of their citizens and creative industries.
Let me kill a few sacred cows. Judges, lawyers and academics all agree that the law on copyright is clear, and the ICO determination that copyright stands in spite of the advances of AI is also clear. Ministers choosing to mirror the tech lobbyist language of uncertainty rather than defending the property rights of citizens and wealth creators is bewildering. They are not quoting the law or the experts; they point at the number of court cases as proof of lack of clarity. But I am at a loss, since a person who has had their goods stolen relying on their legal rights seems to be a sign that the law is clear.
However, given the scale of the theft and the audacity of the robber barons, they should be able to turn to the Government for protection—rather than suggesting that we redefine the notion of theft. The Minister, the honourable Chris Bryant, said last night that change is needed and that we cannot do nothing, and the unified voices of the creative industries—from the biggest brands such as Sony and Disney to newspapers such as the Telegraph, the FT and the Guardian, and those who represent publishers, musicians or visual artists and the artists themselves—all agree. Nobody is saying that we should leave it as it is. They are saying, “Make the copyright regime fit for the age of AI”—which is exactly what the amendments do.
The amendments surface the names and owners of the crawlers that currently operate anonymously, record when, where and how IP is taken and, crucially, allow creators to understand what has been taken so that they can seek redress. This is not new, burdensome regulation—and it is certainly less regulation than the incredibly complex, costly and ultimately unworkable opt-outs or rights reservation mechanism of the preferred option of the consultation. All that creators are asking for is the enforcement of an existing property right. And when I say “creators”, I am not talking about 19th-century aristocrats occupying the time between lunch and dinner. In spite of the immense pleasure and extraordinary soft power that the creative industries bring, it is a hard-nosed, incredibly competitive and successful sector. It takes training, skill and talent to pursue what is often an insecure career, in which the copyright of career highs pays for the costs of a freelance life and the ongoing costs of making new work.
In the other place last night, the Minister talked about transparency without reference to the fact that the tech lobby is already on manoeuvres, saying that transparency must not be too detailed because it will impact on their IP. Creatives’ IP is being given away for literally nothing, but AI companies wish to hide behind the IP of products that are simply impossible to make without the raw material of that data. So will the Minister explain why the Government pay for software licences, why our NHS pays for drugs and why members of the Cabinet pay for branded clothes, yet the Government think that the creative industries should invent something for nothing?
The Government say that doing nothing is not an option. I agree—they could call a halt to the theft, instruct the ICO and the IPO now or even do an impact assessment of their preferred policy. This is the most extraordinary thing. They have a preferred way forward but, when I asked, they had to admit that they had not done an economic impact assessment, including of job displacement, even while acknowledging that job losses were inevitable. The Prime Minister cited an IMF report that claimed that, if fully realised, the gains from AI could be worth up to an average of £47 billion to the UK each year over a decade. He did not say that the very same report suggested that unemployment would increase by 5.5% over the same period. This is a big number—a lot of jobs and a very significant cost to the taxpayer. Nor does that £47 billion account for the transfer of funds from one sector to another. The creative industries contribute £126 billion per year to the economy. I do not understand the excitement about £47 billion when you are giving up £126 billion.
The Government have a preferred option, but they have no enforcement mechanism. They have a preferred option, but no protocols to make it work. They have a preferred option but, by their own admission, no idea how an individual artist could hope to chase down dozens, hundreds or maybe thousands of AI companies to opt out or trace their work and rights. They have a preferred option, which is to give away other people’s livings and their vast contribution to the Treasury, and with that the jobs, joy and soft power of our creative industries that the country relies on globally.
There are plenty of great ideas about how creativity could add GDP to the country, but that is not the demand that the Government have made of the sector. I will not quote most of those to whom I have spoken in the last week, because the language is unparliamentary. However, I will pass on the deep regret of Lord Lloyd-Webber that he is no longer in his place to stand by me today. I will also pass on the words of a Labour donor, who said that this was economically illiterate.
My Lords, I want to return to the moment just before the Front-Bench speeches of the Opposition and the Government, when there was absolute agreement around the House. There were fantastic speeches from all sides, which understood AI not as competition but as a fellow traveller of the creative industries. I want to make that really clear, as all colleagues did around the House. I thank all noble Lords on both Benches who are being whipped not to vote for this for saying that they will support it. As I said at the outset, there are many hundreds of people watching this, and they want to know what the House is going to do to protect their future.
I will not address my remarks to the noble Viscount, Lord Camrose; he knows what I think. For a Conservative Party not to act on the property rights of UK citizens is a crying shame. To the Government and the Minister—to whom I keep finding myself saying, “who I like very much”—I have to say that this is not good enough. The Minister used the word “premature” twice. There may be a dispute about 09 or 10, but we seem to be in agreement on the 17 over on our Benches. It is not premature to use the copyright law to protect the property rights of British citizens.
I also noticed the slight slide around the preferred option. I am sorry, but to say it is a preferred option and then suggest that it is an open consultation is simply not correct. I also want to talk about this business of the impact assessment—and I am going to revisit this. I was in a meeting with officials, and I asked for the impact assessment. They said, “Well, there was one, but I don’t think it will suffice for you, Lady Kidron”. The reason it did not suffice for the noble Baroness, Lady Kidron, is because this impact assessment of AI on companies was just eight bullet points. If just one of the bullet points concerns this point about job losses and loss of income, I do not call that an impact assessment. To have a preferred option that is so catastrophic for our country’s second most effective industry—£126 billion down the drain for this magical £47 billion—means that I, like other noble Lords, do not understand what we are doing here.
I can see that the Chamber is filling up. Finally, on this point about international law, we have heard it all before: we heard about data law, we heard about the OSA, we heard about competition law. I wonder whether, when they do an impact assessment, the Government might consider how many creative copyright owners might like to come to the UK to ply their trade when we have our copyright laws in full order. I remember one of the first reasons Canal+ gave for making its IPO in London was our copyright laws—and it has “Paddington”.
I thank all noble Lords for speaking. They made tremendous speeches, which were educated, thoughtful and non-hysterical. These are very modest amendments, and this House has a duty to those people outside to vote on them. I will add that, at a personal level, in the 12 years I have been in your Lordships’ House I have done so many deals with the Government of the day, whichever Government that was. I have always tried to avoid voting, and I have never called a vote that I did not know I was going to win. Because of the whipping arrangements, I believe I will lose today, but we will vote. I invite those people who want the creative industries to know that we have their back to follow me through the Lobby. I would like to test the opinion of the House.
My Lords, these amendments have to do with research access for online safety. Having sat on the Joint Committee of the draft Online Safety Bill back in 2021, I put on record that I am delighted that the Government have taken the issue of research access to data very seriously. It was a central plank of what we suggested and it is fantastic that they have done it.
Of the amendments in my name, Amendment 51 would simply ensure that the provisions of Clause 123 are acted on by removing the Government’s discretion as to whether they introduce regulations. It also introduces a deadline of 12 months for the Government to do so. Amendment 53 seeks to ensure that the regulators will enable independent researchers to research how online risks and harms impact different groups, especially vulnerable users, including children. Given the excitements we have already had this evening, I do not propose to press any of them, but I would like to hear from the Minister that he has heard me and that the Government will seek to enshrine the principle of different ages, different stages, different people, when he responds.
I note that the noble Lord, Lord Bethell, who has the other amendments in this group, to which I added my name, is not in his place, but I understand that he has sought—and got—reassurance on his amendments. So there is just one remaining matter on which I would like further reassurance: the scope of the legal privilege exception. A letter from the Minister on 10 January explains:
“The clause restates the existing law on legally privileged information as a reassurance that regulated services will not be asked to break the existing legislation on the disclosure of this type of data”.
It seems that the Minister has veered tantalisingly close to answering my question, but not in a manner that I can quite understand. So I would really love to understand—and I would be grateful to the Minister if he would try to explain to me—how the Government will prevent tech companies using legal privilege as a shield. Specifically, would CCing a lawyer on every email exchange, or having a lawyer in every team, allow companies to prevent legitimate scrutiny of their safety record? I have sat in Silicon Valley headquarters and each team came with its own lawyer—I would really appreciate clarity on this issue. I beg to move.
My Lords, I can only support what the noble Baroness, Lady Kidron, had to say. This is essentially unfinished business from the Online Safety Act, which we laboured in the vineyard to deliver some time ago. These amendments aim to strengthen Clause 123 and try to make sure that this actually happens and that we do not get the outcomes of the kind that the noble Baroness has mentioned.
I, too, have read the letter from the Minister to the noble Lord, Lord Bethell. It is hedged about with a number of qualifications, so I very much hope that the Minister will cut through it and give us some very clear assurances, because I must say that I veer back and forth when I read the paragraphs. I say, “There’s a win”, and then the next paragraph kind of qualifies it, so perhaps the Minister will give us true clarity when he responds.
I thank the noble Baroness, Lady Kidron, for the amendments on researchers’ access to data for online safety research, an incredibly important topic. It is clear from Committee that the Government’s proposals in this clause are broadly welcomed. They will ensure that researchers can access the vital data they need to undertake an analysis of online safety risks to UK users, informing future online safety interventions and keeping people safe online.
Amendment 51 would compel the Secretary of State to make regulations for a researcher access framework, and to do so within 12 months. While I am sympathetic to the spirit of the noble Baroness’s amendment, a fixed 12-month timescale and requirement to make regulations may risk compressing the time and options available to develop the most effective and appropriate solution, as my noble friend Lady Jones outlined in Committee. Getting this right is clearly important. While we are committed to introducing a framework as quickly as possible, we do not want to compromise its quality. We need adequate time to ensure that the framework is fit for purpose, appropriately safeguarded and future-proofed for a fast-evolving technological environment.
As required by the Online Safety Act, Ofcom is currently preparing a report into the ways in which researchers can access data and the barriers that they face, as well as exploring how additional access might be achieved. This report will be published in July of this year. We are also committed to conducting a thorough consultation on the issue prior to any enforceable requirements coming into force. The Government intend to consult on the framework as soon as practicable after the publication of Ofcom’s report this summer.
Sufficient time is required for a thorough consultation with the wide range of interested stakeholders in this area, including the research community, civil society and industry. I know that the noble Baroness raised a concern in Committee that the Government would rely on Ofcom’s report to set the framework for the regime, but I can assure her that a robust evidence-gathering process is already under way. The framework will be informed by collaboration with key stakeholders and formal consultation, as well as being guided by evidence from Ofcom’s report on the matter. Once all interested parties have had their say and the consultation is completed, the Government expect to make regulations to install the framework. It is right that the Government commit to a full consultation process and do not seek to prejudge the outcomes of that process by including a mandatory requirement for regulations now.
Amendment 53 would seek to expand the list of examples of the types of provision that the regulations might make. Clause 123 gives non-exhaustive examples of what may be included in future regulations; it certainly does not limit those regulations to the examples given. Given the central importance of protecting children and vulnerable users online, a key aim of any future regulations would be to support researchers to conduct research into the different ways that various groups of people experience online safety, without the need for this amendment. Indeed, a significant driving force for establishing this framework in the first place is to improve the quality of research that is possible to understand the risks to users online, particularly those faced by children. I acknowledge the point that the noble Baroness made about people of all ages. We would be keen to discuss this further with her as we consult on specific requirements as part of developing regulations.
I will touch on the point about legal privilege. We believe that routinely copying a lawyer on to all emails and documents is not likely to attract legal privilege. Legal privilege protects communications between legal advisers and their clients that are created for the purpose of giving or receiving legal advice, or for the sole or dominant purpose of litigation. It would not be satisfactory just to copy everyone on everything.
We are confident that we can draft regulations that will make it entirely clear that the legal right to data for research purposes cannot be avoided by tech companies seeking to rely on contractual provisions that purport to prevent the sharing of data for research purposes. Therefore, there is no need for a specific requirement in the Bill to override a terms of service.
I thank the Minister for his very full answer. My legal adviser on my right—the noble and learned Lord, Lord Thomas of Cwmgiedd—let me know that I was in a good place here. I particularly welcome the Minister’s invitation to discuss Ofcom’s review and the consultation. Perhaps he would not mind if I brought some of my researcher friends with me to that meeting. With that, I beg leave to withdraw the amendment.
My Lords, I support what the noble Baroness, Lady Freeman, said. Her maiden speech was a forewarning of how good her subsequent speeches would be and how dedicated she is to openness, which is absolutely crucial in this area. We are going to have to get used to a lot of automatic processes and come to consider that they are by and large fair. Unless we are able to challenge them, understand them and see that they have been properly looked after, we are not going to develop that degree of trust in them.
Anyone who has used current AI programs will know about the capacity of AI for hallucination. The noble Lord, Lord Clement-Jones, uses them a lot. I have been looking, with the noble Lord, Lord Saatchi, at how we could use them in this House to deal with the huge information flows we have and to help us understand the depths of some of the bigger problems and challenges we are asked to get a grip on. But AI can just invent things, leaping at an answer that is easier to find, ignoring two-thirds of the evidence and not understanding the difference between reliable and unreliable witnesses.
There is so much potential, but there is so much that needs to be done to make AI something we can comfortably rely on. The only way to get there is to be absolutely open and allow and encourage challenge. The direction pointed out by the noble Lord, Lord Clement-Jones, and, most particularly by the noble Baroness, Lady Freeman, is one that I very much think we should follow.
My Lords, I will very briefly speak to Amendment 30 in my name. Curiously, it was in the name of the noble Viscount, Lord Camrose, in Committee, but somehow it has jumped.
On the whole, I have always advocated for age-appropriate solutions. The amendment refers to preventing children consenting to special category data being used in automated decision-making, simply because there are some things that children should not be able to consent to.
I am not sure that this exact amendment is the answer. I hope that the previous conversation that we had before the dinner break will produce some thought about this issue—about how automatic decision-making affects children specifically—and we can deal with it in a slightly different way.
While I am on my feet, I want to say that I was very struck by the words of my noble friend Lady Freeman, particularly about efficacy. I have seen so many things that have purported to work in clinical conditions that have failed to work in the complexity of real life, and I want to associate myself with her words and, indeed, the amendments in her name and that of the noble Lord, Lord Clement-Jones.
I start with Amendment 26, tabled by the noble Viscount, Lord Camrose. As he said in Committee, a principles-based approach ensures that our rules remain fit in the face of fast-evolving technologies by avoiding being overly prescriptive. The data protection framework achieves this by requiring organisations to apply data protection principles when personal data is processed, regardless of the technology used.
I agree with the principles that are present for AI, which are useful in the context in which they were put together, but introducing separate principles for AI could cause confusion around how data protection principles are interpreted when using other technologies. I note the comment that there is a significant overlap between the principles, and the comment from the noble Viscount that there are situations in which one would catch things and another would not. I am unable to see what those particular examples are, and I hope that the noble Viscount will agree with the Government’s rationale for seeking to protect the framework’s technology-neutral set of principles, rather than having two separate sets.
Amendment 28 from the noble Lord, Lord Clement-Jones, would extend the existing safeguards for decisions based on solely automated processing to decisions based on predominantly automated processing. These safeguards protect people when there is no meaningful human involvement in the decision-making. The introduction of predominantly automated decision-making, which already includes meaningful human involvement—and I shall say a bit more about that in a minute—could create uncertainty over when the safeguards are required. This may deter controllers from using automated systems that have significant benefits for individuals and society at large. However, the Government agree with the noble Viscount on strengthening the protections for individuals, which is why we have introduced a definition for solely automated decision-making as one which lacks “meaningful human involvement”.
I thank noble Lords for Amendments 29 and 36 and the important points raised in Committee on the definition of “meaningful human involvement”. This terminology, introduced in the Bill, goes beyond the current UK GDPR wording to prevent cursory human involvement being used to rubber stamp decisions as not being solely automated. The point at which human involvement becomes meaningful is context specific, which is why we have not sought to be prescriptive in the Bill. The ICO sets out in its guidance its interpretation that meaningful human involvement must be active: someone must review the decision and have the discretion to alter it before the decision is applied. The Government’s introduction of “meaningful” into primary legislation does not change this definition, and we are supportive of the ICO’s guidance in this space.
As such, the Government agree on the importance of the ICO continuing to provide its views on the interpretation of terms used in the legislation. Our reforms do not remove the ICO’s ability to do this, or to advise Parliament or the Government if it considers that the law needs clarification. The Government also acknowledge that there may be a need to provide further legal certainty in future. That is why there are a number of regulation-making powers in Article 22D, including the power to describe meaningful human involvement or to add additional safeguards. These could be used, for example, to impose a timeline on controllers to provide human intervention upon the request of the data subject, if evidence suggested that this was not happening in a timely manner following implementation of these reforms. Any regulations must follow consultation with the ICO.
Amendment 30 from the noble Baroness, Lady Kidron, would prevent law enforcement agencies seeking the consent of a young person to the processing of their special category or sensitive personal data when using automated decision-making. I thank her for this amendment and agree about the importance of protecting the sensitive personal data of children and young adults. We believe that automated decision-making will continue to be rarely deployed in the context of law enforcement decision-making as a whole.
Likewise, consent is rarely used as a lawful basis for processing by law enforcement agencies, which are far more likely to process personal data for the performance of a task, such as questioning a suspect or gathering evidence, as part of a law enforcement process. Where consent is needed—for example, when asking a victim for fingerprints or something else—noble Lords will be aware that Clause 69 clearly defines consent under the law enforcement regime as
“freely given, specific, informed and unambiguous”
and
“as easy … to withdraw … as to give”.
So the tight restrictions on its use will be crystal clear to law enforcement agencies. In summary, I believe the taking of an automated decision based on a young person’s sensitive personal data, processed with their consent, to be an extremely rare scenario. Even when it happens, the safeguards that apply to all sensitive processing will still apply.
I thank the noble Viscount, Lord Camrose, for Amendments 31 and 32. Amendment 31 would require the Secretary of State to publish guidance specifying how law enforcement agencies should go about obtaining the consent of the data subject to process their data. To reiterate a point made by my noble friend Lady Jones in Committee, Clause 69 already provides a definition of “consent” and sets out the conditions for its use; they apply to all processing under the law enforcement regime, not just automated decision-making, so the Government believe this amendment is unnecessary.
Amendment 32 would require the person reviewing an automated decision to have sufficient competence and authority to amend the decision if required. In Committee, the noble Viscount also expressed the view that a person should be “suitably qualified”. Of course, I agree with him on that. However, as my noble friend Lady Jones said in Committee, the Information Commissioner’s Office has already issued guidance which makes it clear that the individual who reconsiders an automated decision must have the “authority and competence” to change it. Consequently, the Government do not feel that it is necessary to add further restrictions in the Bill as to the type of person who can carry out such a review.
The noble Baroness, Lady Freeman, raised extremely important points about the performance of automated decision-making. The Government already provide a range of products, but A Blueprint for Modern Digital Government, laid this morning, makes it clear that part of the new digital centre’s role will be to offer specialist assurance support, including, importantly in relation to this debate,
“a service to rigorously test models and products before release”.
That function will be in place and available to departments.
On Amendments 34 and 35, my noble friend Lady Jones previously advised the noble Lord, Lord Clement-Jones, that the Government would publish new algorithmic transparency recording standard records imminently. I am pleased to say that 14 new records were published on 17 December, with more to follow. I accept that these are not yet in the state in which we would wish them to be. Where these amendments seek to ensure that the efficacy of such systems is evaluated, A Blueprint for Modern Digital Government, as I have said, makes it clear that part of the digital centre’s role will be to offer such support, including this service. I hope that this provides reassurance.
My Lords, we have waited with bated breath for the Minister to share his hand, and I very much hope that he will reveal the nature of his bountiful offer of a code of practice on the use of automated decision-making.
I will wear it as a badge of pride to be accused of introducing an analogue concept by the noble Viscount, Lord Camrose. I am still keen to see the word “predominantly” inserted into the Bill in reference to automated decision-making.
As the Minister can see, there is considerable unhappiness with the nature of Clause 80. There is a view that it does not sufficiently protect the citizen in the face of automated decision-making, so I hope that he will be able to elaborate further on the nature of those protections.
I will not steal any of the thunder of the noble Baroness, Lady Kidron. For some unaccountable reason, Amendment 33 is grouped with Amendment 41. The groupings on this Bill have been rather peculiar and at this time of night I do not think any long speeches are in order, but it is important that we at least have some debate about the importance of a code of conduct for the use of AI in education, because it is something that a great many people in the education sector believe is necessary. I beg to move.
My Lords, I shall speak to Amendment 41 in my name and in the names of my noble friend Lord Russell, the noble Baroness, Lady Harding, and the noble Lord, Lord Clement-Jones. The House can be forgiven if it is sensing a bit of déjà-vu, since I have proposed this clause once or twice before. However, since Committee, a couple of things have happened that make the argument for the code more urgent. We have now heard that the Prime Minister thinks that regulating AI is “leaning out” when we should be, as the tech industry likes to say, leaning in. We have had Matt Clifford’s review, which does not mention children even once. In the meantime, we have seen rollout of AI in almost all products and services that children use. In one of the companies—a household name that I will not mention—an employee was so concerned that they rang me to say that nothing had been checked except whether the platform would fall over.
Amendment 41 does not seek to solve what is a global issue of an industry arrogantly flying a little too close to the sun, nor does it grasp how we could use this extraordinary technology and put it to use for humankind on a more equitable basis than the current extractive and winner-takes-all model; it is far more modest than that. It simply says that products and services that engage with kids should undertake a mandatory process that considers their specific vulnerabilities related to age. I want to stress this point. When we talk about AI, increasingly we imagine the spectre of diagnostic benefits or the multiple uses of generative models, but of course AI is not new nor confined to these uses. It is all around us and, in particular, it is all around children.
In 2021, Amazon’s AI voice assistant, Alexa, instructed a 10 year-old to touch a live electrical plug with a coin. Last year, Snapchat’s My AI gave adult researchers posing as a 13 year-old girl tips on how to lose her virginity with a 31 year-old. Researchers were also able to obtain tips on how to hide the smell of alcohol and weed and how to conceal Snapchat conversations from their parents. Meanwhile, character.ai is being sued by the mother of a 14 year-old boy in Florida who died by suicide after becoming emotionally attached to a companion bot that encouraged him to commit suicide.
In these cases, the companies in question responded by implementing safety measures after the fact, but how many children have to put their fingers in electrical sockets, injure themselves, take their own lives and so on before we say that those measures should be mandatory? That is all that the proposed code does. It asks that companies consider the ways in which their products may impact on children and, having considered them, take steps to mitigate known risk and put procedures in place to deal with emerging risks.
One of the frustrating things about being an advocate for children in the digital world is how much time I spend articulating avoidable harms. The sorts of solutions that come after the event, or suggestions that we ban children from products and services, take away from the fact that the vast majority of products and services could, with a little forethought, be places of education, entertainment and personal growth for children. However, children are by definition not fully mature, which puts them at risk. They chat with smart speakers, disclosing details that grown-ups might consider private. One study found that three to six year-olds believed that smart speakers have thoughts, feelings and social abilities and were more reliable than human beings when it came to answering fact-based questions.
I ask the Minister: should we ban children from the kitchen or living room in which the smart speaker lives, or demand, as we do of every other product and service, minimum standards of product safety based on the broad principle that we have a collective obligation to the safety and well-being of children? An AI code is not a stretch for the Bill. It is a bare minimum.
My Lords, I will speak very briefly, given the hour, just to reinforce three things that I have said as the wingman to the noble Baroness, Lady Kidron, many times, sadly, in this Chamber in child safety debates. The age-appropriate design code that we worked on together and which she championed a decade ago has driven real change. So we have evidence that setting in place codes of conduct that require technology companies to think in advance about the potential harms of their technologies genuinely drives change. That is point one.
Point two is that we all know that AI is a foundational technology which is already transforming the services that our children use. So we should be applying that same principle that was so hard fought 10 years ago for non-AI digital to this foundational technology. We know that, however well meaning, technology companies’ development stacks are always contended. They always have more good things that they think they can do to improve their products for their consumers, that will make them money, than they have the resources to do. However much money they have, they just are contended. That is the nature of technology businesses. This means that they never get to the safety-by-design issues unless they are required to. It was no different 150 or 200 years ago as electricity was rolling through the factories of the mill towns in the north of England. It required health and safety legislation. AI requires health and safety legislation. You start with codes of conduct and then you move forward, and I really do not think that we can wait.
I thank the noble Lord, Lord Clement-Jones, for Amendment 33, and the noble Baroness, Lady Kidron, for Amendment 41, and for their thoughtful comments on AI and automated decision-making throughout this Bill’s passage.
The Government have carefully considered these issues and agree that there is a need for greater guidance. I am pleased to say that we are committing to use our powers under the Data Protection Act to require the ICO to produce a code of practice on AI and solely automated decision-making through secondary legislation. This code will support controllers in complying with their data protection obligations through practical guidance. I reiterate that the Government are committed to this work as an early priority, following the Bill receiving Royal Assent. The secondary legislation will have to be approved by both Houses of Parliament, which means it will be scrutinised by Peers and parliamentarians.
I can also reassure the noble Baroness that the code of practice will include guidance about protecting data subjects, including children. The new ICO duties set out in the Bill will ensure that where children’s interests are relevant to any activity the ICO is carrying out, it should consider the specific protection of children. This includes when preparing codes of practice, such as the one the Government are committing to in this area.
I understand that noble Lords will be keen to discuss the specific contents of the code. The ICO, as the independent data protection regulator, will have views as to the scope of the code and the topics it should cover. We should allow it time to develop those thoughts. The Government are also committed to engaging with noble Lords and other stakeholders after Royal Assent to make sure that we get this right. I hope noble Lords will agree that working closely together to prepare the secondary legislation to request this code is the right approach instead of pre-empting the exact scope.
The noble Lord, Lord Clement-Jones, mentioned edtech. I should add—I am getting into a habit now—that it is discussed in a future group.
Before the Minister sits down, I welcome his words, which are absolutely what we want to hear. I understand that the ICO is an independent regulator, but it is often the case that the scope and some of Parliament’s concerns are delivered to it from this House—or, indeed, from the other place. I wonder whether we could find an opportunity to make sure that the ICO hears Parliament’s wish on the scope of the children’s code, at least. I am sure the noble Lord, Lord Clement-Jones, will say similar on his own behalf.
It will be clear to the ICO from the amendments that have been tabled and my comments that there is an expectation that it should take into account the discussion we have had on this Bill.
(1 week, 5 days ago)
Lords Chamber
My Lords, last week the Government published the AI Opportunities Action Plan and confirmed that they have accepted or partially accepted all 50 of the recommendations from the report’s author, Matt Clifford. Reading the report, there can be no doubting the Government’s commitment to making the UK a welcoming environment for AI companies. What is less clear is how creating the infrastructure and skills pool needed for AI companies to thrive will lead to economic and social benefits for UK citizens.
I am aware that the Government have already said that they will provide further details to flesh out the top-level commitments, including policy and legislative changes over the coming months. I reiterate the point made by many noble Lords in Committee that, if data is the ultimate fuel and infrastructure on which AI is built, why, given that we have a new Government, is the data Bill going through the House without all the strategic pieces in place? This is a Bill flying blind.
Amendment 1 is very modest and would ensure that information that traders were required to provide to customers on goods, services and digital content included information that had been created using AI to build a profile about them. This is necessary because the data that companies hold about us is already a combination of information proffered by us and information inferred, increasingly, by AI. This amendment would simply ensure that all customer data—our likes and dislikes, buying habits, product uses and so on—was disclosable, whether provided by us or a guesstimate by AI.
The Government’s recent statements have promised to “mainline AI into the veins” of the nation. If AI were a drug, its design and deployment would be subject to governance and oversight to ensure its safety and efficacy. Equally, they have said that they will “unleash” AI into our public services, communities and business. If the rhetoric also included commitments to understand and manage the well-established risks of AI, the public might feel more inclined to trust both AI and the Government.
The issue of how the data Bill fails to address AI— and how the AI Opportunities Action Plan, and the government response to it, fail to protect UK citizens, children, the creative industries and so on—will be a theme throughout Report. For now, I hope that the Government can find their way to agreeing that AI-generated content that forms part of a customer’s profile should be considered personal data for the purposes of defining business and customer data. I beg to move.
My Lords, this is clearly box-office material, as ever.
I support Amendment 1 tabled by the noble Baroness, Lady Kidron, on inferred data. Like her, I regret that we do not have this Bill flying in tandem with an AI Bill. As she said, data and AI go together, and we need to see the two together in context. However, inferred data has its own dangers: inaccuracy and what are called junk inferences; discrimination and unfair treatment; invasions of privacy; a lack of transparency; security risks; predatory targeting; and a loss of anonymity. These dangers highlight the need for strong data privacy protection for consumers in smart data schemes and more transparent data collection practices.
Noble Lords will remember that Cambridge Analytica dealt extensively with inferred data. That company used various data sources to create detailed psychological profiles of individuals going far beyond the information that users explicitly provided. I will not go into the complete history, but, frankly, we do not want to repeat that. Without safeguards, the development of AI technologies could lead to a lack of public trust, as the noble Baroness said, and indeed to a backlash against the use of AI, which could hinder the Government’s ambitions to make the UK an AI superpower. I do not like that kind of boosterish language—some of the Government’s statements perhaps could have been written by Boris Johnson—nevertheless the ambition to put the UK on the AI map, and to keep it there, is a worthy one. This kind of safeguard is therefore extremely important in that context.
I thank the noble Baroness, Lady Kidron, and the noble Viscount, Lord Camrose, for their proposed amendments and continued interest in Part 1 of this Bill. I hope I can reassure the noble Baroness that the definition of customer data is purposefully broad. It encompasses information relating to a customer or a trader and the Government consider that this would indeed include inferred data. The specific data to be disclosed under a smart data scheme will be determined in the context of that scheme and I reassure the noble Baroness that there will be appropriate consultation before a smart data scheme is introduced.
I turn to Amendment 5. Clause 13 provides statutory authority for the Secretary of State or the Treasury to give financial assistance to decision-makers, enforcers and others for the purpose of meeting any expense in the exercise of their functions in the smart data schemes. Existing and trusted bodies such as sector regulators will likely lead the delivery of new schemes. These bodies will act as decision-makers and enforcers. It is intended that smart data schemes will be self-financing through the fees and levies provided for by Clauses 11 and 12. However, because of the nature of the bodies that are involved, it is deemed appropriate for there to be a statutory spending authority as a backstop provision if that is necessary. Any commitment of resources will, of course, be subject to the usual estimates process and to existing public sector spending controls and transparency requirements.
I hope that with this brief explanation of the types of bodies involved, and the other explanations, the noble Baroness will be content to withdraw Amendment 1 and that noble Lords will not press Amendment 5.
I thank the Minister for his reassurance, particularly that we will have an opportunity for a consultation on exactly how the smart data scheme works. I look forward to such agreement throughout the afternoon. With that, I beg leave to withdraw my amendment.
My Lords, in moving Amendment 2 I will speak to Amendments 3, 4, 25, 42 and 43, all of which are in my name and that of the noble Lord, Lord Clement-Jones. The very detailed arguments for Amendments 25, 42 and 43 were made during the DPDI Bill and can be found at col. GC 89 of vol. 837 of Hansard, and the subsequent arguments for their inclusion in this Bill were made in Committee at col. GC 454. For that reason, I do not propose to make them in full again. I simply say that these amendments for data communities represent a more ambitious and optimistic view of the Bill that would empower citizens to use data law to benefit those with common interests. The example I gave last time was of gig workers assigning their data rights to an expert third party to see whether they were being fairly compensated. That is not something that any individual data subject can easily do alone.
The new Amendments 2, 3 and 4 demonstrate how the concept of data communities might work in relation to the Government’s smart data scheme. Amendment 2 would add enacting data rights to the list of actions that the Secretary of State or the Treasury can enable an authorised person to take on behalf of customers. Amendment 3 requires the Secretary of State or the Treasury to include data communities in the list of those who would be able to activate rights, including data rights on a customer’s behalf. Amendment 4 provides a definition of “data communities”.
Data communities are a process by which one data holder can assign their rights for a given purpose to a community of people who agree with that purpose. I share the Government’s desire to empower consumers and to promote innovation, and these amendments would do just that. Allowing the sharing of data rights of individuals, as opposed to specific categories of data, would strengthen the existing proposal and provide economic and social benefit to the UK and its citizens, rather than imagining that the third party is always a commercial entity.
In response to these amendments in Committee, the then Minister said two things. The first was that the UK GDPR does not prevent data subjects authorising third parties to exercise certain rights on their behalf. She also warmly said that something of this kind was being planned by government and invited me and other noble Lords to discuss this area further. I made it clear that I would like such a meeting, but it has only just been scheduled and is planned for next week, which clearly does not meet the needs of the House, since we are discussing this today. I would be grateful if the current Minister could undertake to bring something on this subject back at Third Reading if we are not reassured by what we hear at the meeting.
While the UK GDPR does not prevent data subjects authorising third parties to exercise certain rights on their behalf, in the example I gave the Minister in Committee it took many years and a bespoke agreement between the ICO and Uber for the 300-plus drivers to combine their data. Under equivalent GDPR provisions in European law, it required a Court of Appeal judgment in Norway before Uber conceded that the third party was entitled to the data on the drivers’ behalf. A right that cannot be activated without legal action and years of effort is not a right fully given; the UK GDPR is not sufficient in these circumstances.
I want to stress that these amendments are not simply about contesting wrongs. Introducing the concept of data communities would facilitate innovation and promote fairness, which is surely an aim of the legislation.
My Lords, I rise to speak to Amendments 2, 3, 4, 25, 42 and 43. I thank the noble Baroness, Lady Kidron, and the noble Lord, Lord Clement-Jones, for these amendments on data communities, which were previously tabled in Committee, and for the new clauses linking these with the Bill’s clauses on smart data.
As my noble friend Lady Jones noted in Committee, the Government support giving individuals greater agency over their data. The Government are strongly supportive of a robust regime of data subject rights and believe strongly in the opportunity presented by data for innovation and economic growth. UK GDPR does not prevent data subjects authorising third parties to exercise certain rights on their behalf. Stakeholders have, however, said that there may be barriers to this in practice.
I reassure noble Lords that the Government are actively exploring how we can support data intermediaries while maintaining the highest data protection standards. It is our intention to publish a call for evidence in the coming weeks on the activities of data intermediaries and the exercise of data subject rights by third parties. This will enable us to ensure that the policy settings on this topic are right.
In the context of smart data specifically, Part 1 of the Bill does not limit who the regulations may allow customers to authorise. Bearing in mind the IT and security-related requirements inherent in smart data schemes, provisions on who a customer may authorise are best determined in the context of a specific scheme, when the regulations are made following appropriate consultation. I hope to provide some additional reassurance that the exercise of the smart data powers is subject to data protection legislation and does not displace data rights under that legislation.
There will be appropriate consultation, including with the Information Commissioner’s Office, before smart data schemes are introduced. This year, the Department for Business and Trade will be publishing a strategy on future uses of these powers.
While the smart data schemes and digital verification services are initial examples of government action to facilitate data portability and innovative uses of data, my noble friend Lady Jones previously offered a meeting with officials and the noble Baroness, Lady Kidron, to discuss these proposals, which I know my officials have arranged for next week—as the noble Baroness indicated earlier. I hope she is therefore content to withdraw her amendment.
Before the Minister sits down, may I ask whether there is a definition of “customer” and whether that includes a user in the broader sense, or means worker or any citizen? Is it a customer relationship?
My understanding is that “customer” reflects an individual, but I am sure that the Minister will give a better explanation at the meeting with officials next week.
I thank the noble Lord for that request, and I am sure my officials would be willing to do that.
My Lords, I do not intend to detain the House on this for very long, but I want to say that holding meetings after the discussion on Report is not adequate. “Certain rights” and “customer” are exactly the sort of terms that I am trying to address here. To the noble Viscount—and my noble friend—Lord Camrose, I say that it is not adequate, and we have an academic history going back a long way. I hope that the meeting next week is fruitful and that the Government’s enthusiasm for this benefits workers, citizens and customers. I beg leave to withdraw the amendment.
My Lords, I thank my noble friend Lady Kidron and the noble Viscount, Lord Camrose, for adding their signatures to my Amendment 14. I withdrew this amendment in Committee, but I am now asking the Minister to consider once again the definition of “scientific research” in the Bill. If he cannot satisfy me in his speech this evening, I will seek the opinion of the House.
I have been worried about the safeguards for defining scientific research since the Bill was published. This amendment will require that the research should be in “the public interest”, which I am sure most noble Lords will agree is a laudable aim and an important safeguard. This amendment has been looked at in the context of the Government’s recent announcements on turning this country into an AI superpower. I am very much a supporter of this endeavour, but across the country there are many people who are worried about the need to set up safeguards for their data. They fear data safety is threatened by this explosion of AI and its inexorable development by the big tech companies. This amendment will go some way to building public trust in the AI revolution.
The vision of Donald Trump surrounded at his inauguration yesterday by tech billionaires, most of whom have until recently been Democrats, puts the fear of God into me. I fear their companies are coming for our data. We have some of the best data in the world, and it needs to be safeguarded. The AI companies are spending billions of dollars developing their foundation models, and they are beholden to their shareholders to minimise the cost of developing these models.
Clause 67 gives a huge fillip to the scientific research community. It exempts research which falls within the definition of scientific research as laid out in the Bill from having to gain new consent from data subjects to reuse millions of points of data.
It costs time and money for the tech companies to get renewed consent from data holders before reusing their data. This is an issue we will discuss further when we debate amendments on scraping data from creatives without copyright licensing. It is clear from our debates in Committee that many noble Lords fear that AI companies will do what they can to avoid either obtaining consent for or licensing the data they scrape. Defining their research as scientific will allow them to escape these constraints. I could not be a greater supporter of the wonderful scientific research that is carried out in this country, but I want the Bill to ensure that it really is scientific research and not AI development camouflaged as scientific research.
The line between product development and scientific research is often blurred. Many developers present efforts to increase model capabilities or efficiency, or indeed the study of their risks, as scientific research. A balance has to be struck between allowing this country to become an AI superpower and not exploiting its data subjects. I contend that this amendment will go far to allay public fears about the use and abuse of their data to further the profits and goals of huge AI companies, most of which are based in the United States.
Noble Lords have only to look at the outrage last year at Meta’s use of Instagram users’ data without their consent to train the datasets for its new Llama AI model to understand the levels of concern. There were complaints to regulators, and the ICO posted that Meta
“responded to our request to pause and review plans to use Facebook and Instagram user data to train generative AI”.
However, so far, there has been no official change to Meta’s privacy policy that would legally bind it to stop processing data without consent for the development of its AI technologies, and the ICO has not issued a binding order to stop Meta’s plans to scrape users’ data to train its AI systems. Meanwhile, Meta has resumed reusing subjects’ data without their consent.
I thank the Minister for meeting me and talking through Amendment 14. I understand his concerns that, at a public interest threshold, the definition of scientific research will create a heavy burden on researchers, but I think it is worth the risk in the name of safety. Some noble Lords are concerned about the difficulty of defining “public interest”. However, the ICO has very clear guidelines about what public interest consists of. It states that
“you should broadly interpret public interest in the research context to include any clear and positive public benefit likely to arise from that research”.
It continues:
“The public interest covers a wide range of values and principles about the public good, or what is in society’s best interests. In making the case that your research is in the public interest, it is not enough to point to your own private interests”.
The guidance even includes further examples of research in the public interest, such as
“the advancement of academic knowledge in a given field … the preservation of art, culture and knowledge for the enrichment of society … or … the provision of more efficient or more effective products and services for the public”.
This guidance is already being applied in the Bill to sensitive data and public health data. I contend that if these carefully thought-through guidelines are good enough for health data, they should be good enough for all scientific data.
This view is supported in the EU, where
“the special data protection regime for scientific research is understood to apply where … the research is carried out with the aim of growing society’s collective knowledge and wellbeing, as opposed to serving primarily one or several private interests.”
The Minister will tell the House that the data exempted to be used for scientific research is well protected—that it has both the lawfulness test, as set out in the UK GDPR, and a reasonableness test. I am concerned that the reasonableness test in this Bill references
“processing for the purposes of any research that can reasonably be described as scientific, whether publicly or privately funded and whether carried out as a commercial or non-commercial activity”.
Normally, a reasonableness test requires an expert in the context of that research to decide whether it is reasonable to consider it scientific. However, in this Bill, “reasonable” just means that an ordinary person in the street can decide whether it is reasonable for the research to be considered scientific. This must be a broadening of the threshold of the definition.
It seems “reasonable” in the current climate to ask the Government to include a public interest test before giving the AI companies extensive scope to reuse our data, without getting renewed consent, on the pretext that the work is for scientific research. In the light of possible deregulation of the sector by the new regime in America, it is incumbent on this country to ensure that our scientific research is dynamic, but safe. If the Government can provide this reassurance, they will increase trust in Britain’s AI revolution for millions of people in this country. I beg to move.
My Lords, I support my noble friend Lord Colville. He has made an excellent argument, and I ask noble Lords on the Government Benches to think about it very carefully. If it is good enough for health data, it is good enough for the rest of science. In the interest of time, I will give an example of one of the issues, rather than repeat the excellent argument made by my noble friend.
In Committee, I asked the Government three times whether the cover of scientific research could be used, for example, to market-test ways to hack human responses to dopamine in order to keep children online. In the Minister’s letter, written during Committee, she could not say that the A/B testing of millions of children to make services more sticky—that is, more addictive—would not be considered scientific, but rather that the regulator, the ICO, could decide on a case-by-case basis. That is not good enough.
There is no greater argument for my noble friend Lord Colville’s amendment than the fact that the Government are unable to say if hacking children’s attention for commercial gain is scientific or not. We will come to children and child protection in the Bill in the next group, but it is alarming that the Government feel able to put in writing that this is an open question. That is not what Labour believed in opposition, and it is beyond disappointing that, now in government, Labour has forgotten what it then believed. I will be following my noble friend through the Lobby.
My Lords, it is almost impossible to better the arguments put forward by the noble Viscount, Lord Colville, and the noble Baroness, Lady Kidron, so I am not even going to try.
The inclusion of a public interest requirement would ensure that the use of data for scientific research would serve a genuine societal benefit, rather than primarily benefiting private interests. This would help safeguard against the misuse of data for purely commercial purposes under the guise of research. The debate in Committee highlighted the need for further clarity and stronger safeguards in the Bill, to ensure that data for scientific research genuinely serves the public interest, particularly concerning the sensitive data of children. The call for a public interest requirement reflects the desire to ensure a balance between promoting research and innovation and upholding the rights and interests of data subjects. I very much hope that the House will support this amendment.
My Lords, I rise to move Amendment 15 and to speak to Amendments 16, 20, 22, 27, 39, 45 and, briefly, government Amendment 40. Together, these amendments offer protections that children were afforded in the Data Protection Act 2018, which passed through this House, and they seek to fix some of the underperformance of the ICO in relation to children’s data.
Before we debate these amendments, it is perhaps worth the Government reflecting on the fact that survey after survey shows that the vast majority—indeed, almost all—of the UK population support stronger digital regulation in respect of children. In refusing to accept these amendments, or, indeed, to replace them with their own amendments to the same effect, the Government are throwing away one of the successes of the UK Parliament with their newfound enthusiasm for tech with fewer safeguards.
I repeat my belief that lowering data protections for adults is a regressive step for all of us, but for children it is a tragedy that puts them at greater risk of harm—a harm that we in this House have a proud record of seeking to mitigate. The amendments in my name and variously in the names of the noble Lords, Lord Stevenson and Lord Clement-Jones, my noble friend Lord Russell and the noble Baroness, Lady Harding, are essential to preserving the UK’s commitment to child protection and privacy. As the House is well aware, there is cross-party support for child protection. While I will listen very carefully to the Minister, I too am prepared to test the opinion of the House if he has nothing to offer, and I will ask Labour colleagues to consider their responsibility to the nation’s children before they walk through the Lobby.
I will take the amendments out of numerical order, for the benefit of those who have not been following our proceedings. Amendment 22 creates a direct, unambiguous obligation on data processors and controllers to consider the central principles of the age-appropriate design code when processing children’s data. It acknowledges that children of different ages have different capacities and therefore may require different responses. Subsection (2) of the new clause it would insert addresses the concern expressed during the passage of the Bill and its predecessor that children should be shielded from the reduction in privacy protections that adults would experience under the Act when passed.
In the last few weeks, Meta has removed its moderators, and the once-lauded Twitter has become flooded with disinformation and abuse as a result of Elon Musk’s determined deregulation and support of untruth. We have seen the dial move on elections in Romania’s presidential election via TikTok, a rise in scams and the horror of sexually explicit deepfakes, which we will discuss in a later group.
Public trust in both tech and politics is catastrophically low. While we may disagree on the extent to which adults deserve privacy and protection, there are few in this House or the other place who do not believe it is a duty of government to protect children. Amendment 22 simply makes it a requirement that those who control and process children’s data are directly accountable for considering and prioritising their needs. Amendment 39 does the same job in relation to the ICO, highlighting the need to consider that high bar of privacy to which children are entitled, which should be a focus of the commissioner when exercising its regulatory functions, with a particular emphasis on their age and development stage.
Despite Dame Elizabeth Denham’s early success in drafting the age-appropriate design code, the ICO’s track record on enforcement is poor, and the leadership has not championed children, whether by robustly enforcing the ADC or when faced with proposals that watered down child protections in this Bill and its predecessor. We will get to the question of the ICO next week, but I have been surprised by the amount of incoming mail dissatisfied with the regulator and calling on Parliament to demand more robust action. This amendment does exactly that in relation to children.
Government Amendment 40 would require the ICO, when exercising its functions, to consider the fact that children merit specific protections. I am grateful for and welcome this addition as far as it goes; but in light of the ICO’s disappointing track record, clearer and more robust guidance on its obligations is needed.
Moreover, the Government’s proposal is also insufficient because it creates a duty on the ICO only. It does nothing for the controllers and processors, as I have already set out in Amendment 22. It is essential that those who control and process children’s data are directly accountable for prioritising their needs. The consequences when they do not are visible in the anxiety, body dysmorphia and other developmental issues that children experience as a result of their time online.
The Government have usefully introduced an annual report of ICO activities and action. Amendment 45 simply requires the ICO to report the action it has taken specifically in relation to children, as a separate item. Creating better reporting is one of the advances the Government have made; making it possible to see what the ICO has done in regard to children is little more than housekeeping.
This group also includes clause-specific amendments, which are more targeted than Amendment 22. Amendment 15 excludes children from the impact of the proposal to widen the definition of scientific research in Clause 68. Given that we have just discussed this, I may reconsider that amendment. However, Amendment 16 excludes children from the “recognised legitimate interest” provisions in Clause 70. This means that data controllers would still be required to consider and protect children, as currently required under the legitimate interest basis for processing their data.
Amendment 20 excludes children from the new provisions in Clause 71 on purpose limitation. Purpose limitation is at the heart of GDPR. If consent is sought and given for a particular purpose, extending that purpose is problematic. The amendment ensures that, for children at least, the status quo of data protection law stays the same: that is to say, their personal data can be used only for the purpose for which it was originally collected. If the controller wants to use it in a different way, it must go back to the child—or, if they are under 13, their parent—to ask for further permission.
Finally, Amendment 27 ensures that significant decisions that impact children cannot be made during automated processes unless they are in a child’s best interest. This is a reasonable check and balance on the proposals in Clause 80.
In full, these amendments uphold our collective responsibility to support, protect and make allowances for children as they journey from infancy to adulthood. I met with the Minister and the Bill team, and I thank them for their time. They rightly made the point that children should be participants in the digital world, and I should not seek to exempt them. I suggest to the House that it is the other way round: I will not seek to exempt children if the Government do not seek to put them at risk.
Our responsibility to children is woven into the fabric of our laws, our culture and our behaviour. It has taken two decades to begin to weave childhood into the digital environment, and I am asking the House to make sure we do not take a single retrograde step. The Government have a decision to make. They can choose to please the CEOs of Silicon Valley in the hope that capitulation on regulatory standards will get us a data centre or two; or they can prioritise the best interests of UK children and agree to these amendments, which put children’s needs first. I beg to move.
My Lords, I rise to support all the amendments in this group. I have added my name to Amendments 15, 22, 27 and 45. The only reason my name is not on the other amendments is that others got there before me. As is always the case in our debates on this topic, I do not need to repeat the arguments of the noble Baroness, Lady Kidron. I would just like to make a very high-level point.
I will speak first to government Amendment 40, tabled in my name, concerning the ICO’s duty relating to children’s personal data. Before that, though, I thank the noble Lords, Lord Stevenson and Lord Russell, the noble Baroness, Lady Harding, and in particular the noble Baroness, Lady Kidron, for such considered debates on this incredibly important issue, both in today’s discussion in the House and in the meetings we have had together. Everyone here wants this to be effective and recognises that we must protect children.
The Government are firmly committed to maintaining high standards of protection for children, which is why they decided not to proceed with measures in the previous Data Protection and Digital Information Bill that would have reduced requirements for data protection impact assessments, prior consultation with the ICO and the designation of data protection officers. The ICO guidance is clear that organisations must complete an impact assessment in relation to any processing activity that uses children’s or other vulnerable people’s data for marketing purposes, profiling or other automated decision-making, or for offering online services directly to children.
The Government also expect organisations which provide online services likely to be accessed by children to continue to follow the standards on age-appropriate design set out in the children’s code. The noble Baroness, Lady Kidron, worked tirelessly to include those provisions in the Data Protection Act 2018 and the code continues to provide essential guidance for relevant online services on how to comply with the data protection principles in respect of children’s data. In addition to these existing provisions, Clause 90 already includes a requirement for the ICO to consider the rights and interests of children when carrying out its functions.
I appreciate the point that the noble Baroness made in Committee about the omission of the first 10 words of recital 38 from these provisions. As such, I am very happy to rectify this through government Amendment 40. The changes we are making to Clause 90 will require the Information Commissioner to consider, where relevant, when carrying out its regulatory functions the fact that children merit special protection with regard to their personal data. I hope noble Lords will support this government amendment.
Turning to Amendment 15 from the noble Baroness, Lady Kidron, which excludes children’s data from Clause 68, I reassure her that neither the protections for adults nor for children are being lowered. Clause 68 faithfully transposes the existing concept of giving consent to processing for an area of scientific research from the current recital. This must be freely given and be fully revokable at any point. While the research purpose initially identified may become more specific as the research progresses, this clause does not permit researchers to use the data for research that lies outside the original consent. As has been highlighted by the noble Viscount, Lord Camrose, excluding children from Clause 68 could have a detrimental effect on health research in children and could unfairly disadvantage them. This is already an area of research that is difficult and underrepresented.
I know that the noble Baroness, Lady Kidron, cares deeply about this, but the fact is that if we start to make research in children more difficult, that cannot be right for children. For example, if research on children with a particular type of cancer found something in those children that was relevant to another cancer, excluding children from this clause would preclude the use of that data. It is a risk to move to exempt children from this part of the Bill.
Amendment 16 would prevent data controllers from processing children’s data under the new recognised legitimate interests lawful ground. However, one of the main reasons this ground was introduced was to encourage organisations to process personal data speedily when there is a pressing need to do so for important purposes. This could be where there is a need to report a safeguarding concern or to prevent a crime being committed against a child. Excluding children’s data from the scope of the provision could therefore delay action being taken to protect some children—a point also made in the debate.
Amendment 20 aims to prohibit further processing of children’s personal data when it was collected under the consent lawful basis. The Government believe an individual’s consent should not be undermined, whether they are an adult or a child. This is why the Bill sets out that personal data should be used only for the purpose a person has consented to, apart from situations that are in the public interest and authorised by law or to comply with the UK GDPR principles. Safeguarding children or vulnerable individuals is one of these situations. There may be cases where a child’s data is processed under consent by a social media company and information provided by the child raises serious safeguarding concerns. The social media company must be able to further process the child’s data to make safeguarding referrals when necessary. It is also important to note that these public interest exceptions apply only when the controller cannot reasonably be expected to obtain consent.
I know the noble Baroness, Lady Kidron, hoped that the Government might also introduce amendments to require data controllers to apply a higher standard of protection to children’s data than to adults’. The Government have considered Amendment 22 carefully, but requiring all data controllers to identify whether any of the personal data they hold relates to children, and to apply a higher standard to it, would place disproportionate burdens on small businesses and other organisations that currently have no way of differentiating age groups.
Although we cannot pursue this amendment as drafted, my understanding of the very helpful conversations that I have had with the noble Baroness, Lady Kidron, is that she intended for this amendment to be aimed at online services directed at or likely to be accessed by children, not to every public body, business or third sector organisation that might process children’s data from time to time.
I reassure noble Lords that the Government are open to exploring a more targeted approach that focuses on those services that the noble Baroness is most concerned about. The age-appropriate design code already applies to such services and we are very open to exploring what further measures could be beneficial to strengthen protection for children’s data. This point was eloquently raised by the noble Baronesses, Lady Harding and Lady Kidron, and the noble Lord, Lord Stevenson, and is one that we would like to continue. Combined with the steps we are taking in relation to the new ICO duty, which will influence the support and guidance it provides for organisations, we believe this could drive better rates of compliance. I would be very pleased to work with all noble Lords who have spoken on this to try to get this into the right place.
I turn to Amendment 27, tabled by the noble Baroness, Lady Kidron. I agree with her on the importance of protecting children’s rights and interests when undertaking solely automated decision-making. However, we think this amendment, as currently drafted, would cause operational confusion as to when solely automated decision-making can be carried out. Compliance with the reformed Article 22 and the wider data protection legislation will ensure high standards of protection for adults and children alike, and that is what we should pursue.
I now turn to Amendment 39, which would replace the ICO’s children’s duty, and for which I again thank the noble Baroness, Lady Kidron, and the noble Lord, Lord Russell. As a public body, the ICO must adhere to the UK’s commitment to the UN Convention on the Rights of the Child, and we respectfully submit that it is unnecessary to add further wording of this nature to the ICO’s duty. We believe that government Amendment 40, coupled with the ICO’s principal objective to secure an appropriate level of protection, takes account of the fact that the needs of children might not always look the same.
Finally, to address Amendment 45, the Government believe that the Bill already delivers on this aim. While the new annual regulatory action report in Clause 101 will not break down the activity that relates to children, it does cover all the ICO’s regulatory activity, including that taken to uphold the rights of children. This will deliver greater transparency and accountability on the ICO’s actions. Furthermore, Clause 90 requires the ICO to set out in its annual report how it has complied with its statutory duties. This includes the new duty relating to children.
To conclude, I hope that the amendment we tabled today and the responses I have set out reassure noble Lords of our commitment to protect children’s data. I ask noble Lords to support the amendment tabled in my name, and hope that the noble Baroness, Lady Kidron, feels content to withdraw her own.
Before the Minister sits down, I have some things to say about his words. I did not hear: “agree to bring forward a government amendment at Third Reading”. Those are the magic words that would help us get out of this situation. I have tried to suggest several times that the Government bring forward their own amendment at Third Reading, drafted in a manner that would satisfy the whole House, with the words of the noble Viscount, Lord Camrose, incorporated and the things that are fundamental.
I very much admire the Minister and enjoy seeing him in his place but I say to him that we have been round this a few times now and a lot of those amendments, while rather nerdy in their obsession, are based on lived experience of trying to hold the regulator and the companies to account for the law that we have already passed. I am seeking those magic words before the Minister sits down.
I have likewise enjoyed working with the noble Baroness. As has been said several times, we are all working towards the same thing, which is to protect children. The age-appropriate design code has been a success in that regard. That is why we are open to exploring what further measures can be put in place in relation to the ICO duty, which can help influence and support the guidance to get that into the right place. That is what I would be more than happy to work on with the noble Baroness and others to make sure that we get it right.
I thank the Minister for that very generous offer. I also thank the noble Lord, Lord Stevenson, for his incredible support. I note that, coming from the Government Benches, that is a very difficult thing to do, and I really appreciate it. On the basis that we are to have an amendment at Third Reading, whether written by me with government and opposition help or by the Government, that will address these fundamental concerns set out by noble Lords, I will not press this amendment today.
These are not small matters. The implementation of the age-appropriate design code depends on some of the things being resolved in the Bill. There is no equality of arms here. A child, whether five or 15, is no match for the billions of dollars spent hijacking their attention, their self-esteem and their body. We have to, in these moments as a House, choose David over Goliath. I thank the Minister and all the supporters in this House —the “Lords tech team”, as we have been called in the press. With that, I beg leave to withdraw the amendment.
(6 months, 1 week ago)
Lords Chamber
I welcome the new Ministers and commend the noble Lord, Lord Vallance, on his maiden speech. Indeed, I wish the new Government well in their ambition for growth and their commitment to creativity in education, without which we squander both joy and one of our most valuable industries. I am encouraged by early statements about skills and innovation.
I will use my time to raise vital unfinished business that was abandoned as the snap election was called. In doing so, I declare my interests in the register, particularly as chair of 5Rights Foundation and adviser to the Oxford Institute for Ethics in AI. Top of the list was the measure to give coroners access to company data in cases where a child has died. We have campaigned long and hard for this and I am grateful to the Secretary of State, Peter Kyle, for committing to carry it forward in the data Bill. Can the Minister say when the Bill is anticipated and confirm that it will not undermine any existing protections for children’s data privacy?
Similarly promised and equally urgent is the new criminal offence of training, distributing or sharing digital files that create AI-generated child sexual abuse. The offence was agreed in principle with the Home Office and the irrefutable reasons for it are recorded in Hansard on 24 April at col. 588GC. Can the Minister please also commit to this measure?
Other agreed measures, all supported by the Labour Front Bench when in opposition, include data access for independent academic researchers. Access to data is an essential part of the innovation supply chain, and therefore the growth agenda.
There is a scandal brewing as the edtech sector oversells and underdelivers in our schools. The DfE had agreed to a review to establish criteria for efficacy, safety, security and privacy, so that children are as well protected inside the classroom as on the bus to school. A trusted edtech sector is yet to be developed anywhere in the world. It is a necessity and an opportunity.
The new Secretary of State has committed to strengthening the Online Safety Act. The children’s coalition has set out its concerns with Ofcom’s draft codes, which I will forward to Ministers. The gaps that it has identified are as mission-critical to the published codes as they will be to tackling violence against women and girls. It would mean a lot if the Secretary of State’s commitment, made in the media, to look again were repeated at the Dispatch Box today.
Finally on unfinished business, current UK law presumes that computer information is always reliable, which is nonsense and has contributed to multiple injustices, most notably Horizon. The previous Lord Chancellor looked at how to rectify this. I was delighted to see the new Attorney-General introduced today. This must be a priority for him.
This is not an arbitrary list but part of a broader view that we need to live with and alongside technology to build a future that many cannot yet imagine and access, as the noble Lord, Lord Vallance, said. Technology will play an enormous part in our economy, but it is also fundamental to our security, self-worth, well-being, happiness, confidence in the future, and Britain’s place in the world, all of which are essential for growth.
Like the noble Lord, Lord Clement-Jones, I am concerned by the absence of a more comprehensive AI Bill, and I pray that the incoming Government have not already blinked in the face of tech lobbying. An AI Bill to establish minimum standards for the design and deployment of AI systems, manage risk, build necessary digital infrastructure and distribute the benefits more equitably is essential. As the noble Lord, Lord Clement-Jones, said, innovation should not be unconditional and regulation need not be the enemy of innovation.
Our response to digital transformation has been poor, largely due to a gap between the expertise of policymakers and those we seek to regulate. A permanent Joint Committee of both Houses on digital regulation is often asked for and could address this. In the meantime, I invite the Minister to meet the cross-party Peers informally referred to as the Lords tech team—of which the noble Baroness, Lady Jones of Whitchurch, was once part—to take forward the issues I have raised and work towards a model of innovation that serves the public as well as the Government’s growth agenda.
(9 months, 1 week ago)
Grand Committee
That is one of the questions that I can now answer. The power will allow this, in so far as it pertains to helping the Secretary of State establish whether the benefits are being paid properly, as with paragraph 1(2) of new Schedule 3B. Rules around living together are relevant only to some benefits. That is a very short answer, but I could expand on it.
May I add to the very long letter? I have been sitting here worrying about this idea that one of the “signals” will be excess capital and then there are matching accounts. If the matching account has more capital—for example, the person who has a connected account is over the £16,000 or £6,000 limit—does that signal trigger some sort of investigation?
That is a very fair question, and I hope that I understand it correctly. I can say that the limit for the DWP is that it can gain only from what the third party produces. Whatever goes on behind the doors of the third party is for them and not us. Whether there is a related account and how best to operate is a matter for the bank to decide. We may therefore end up getting very limited information, in terms of the limits of our powers. I hope that helps, but I will add some more detail in the letter.
My Lords, having listened carefully to representations from across the House at Second Reading, I am introducing this amendment to address concerns about the data preservation powers established in the Bill. The amendment provides for coroners, and procurators fiscal in Scotland, to initiate the data preservation process when they decide it is necessary and appropriate to support their investigations into a child’s death, irrespective of the suspected cause of death.
This amendment demonstrates our commitment to ensuring that coroners and procurators fiscal can access the online data they may need to support their investigation into a child’s death. It is important to emphasise that coroners and procurators fiscal, as independent judges, have discretion about whether to trigger the data preservation process. We are grateful to the families, Peers and coroners whom we spoke to in developing these measures. In particular, I thank the noble Baroness, Lady Kidron, who is in her place. I beg to move.
My Lords, it is an unusual pleasure to support the Minister and to say that this is a very welcome amendment to address a terrible error of judgment made when the Government first added the measure to the Bill in the other place and excluded data access for coroners in respect of children who died by means other than suicide. I shall not replay here the reasons why it was wrong, but I am extremely glad that the Government have put it right. I wish to take this opportunity to pay tribute to those past and present at 5Rights and the NSPCC for their support and to those journalists who understood why data access for coroners is a central plank of online safety.
I too recognise the role of the Bereaved Families for Online Safety. They bear the pain of losing a child and, as their testimony has repeatedly attested, not knowing the circumstances surrounding that death is a particularly cruel revictimisation for families, who never lose their grief but simply learn to live with it. We owe them a debt of gratitude for putting their grief to work for the benefit of other families and other children.
My Lords, Amendment 251 is also in the names of the noble Lords, Lord Arbuthnot and Lord Clement-Jones, and the noble Baroness, Lady Jones. I commend the noble Lord, Lord Arbuthnot, for his staunch support of the sub-postmasters over many years. I am grateful to him for adding his name to this amendment.
This amendment overturns a previous intervention in the law that has had and will continue to have far-reaching consequences if left in place: the notion that computer evidence should in law be presumed to be reliable. This error, made by the Government and the Law Commission at the turn of the century and reinforced by the courts over decades, has, as we now know, cost innocent people their reputations, their livelihoods and, in some cases, their lives.
Previously, Section 69 of the Police and Criminal Evidence Act 1984 required prosecutors in criminal cases relying on information from computers to confirm, before that evidence could be submitted, that the computer was operating correctly and could not have been tampered with. As the volume of evidence from computers increased, this requirement came to be viewed as burdensome.
In 1997, the Law Commission published a paper, Evidence in Criminal Proceedings: Hearsay and Related Topics, in which it concluded that Section 69
“fails to serve any useful purpose”.
As a result, it was repealed. The effect of this repeal was to create a common law presumption, in both criminal and civil proceedings, of the proper functioning of machines—that is to say, the computer is always right. In principle, there is a low threshold for rebutting this presumption but, in practice, as the Post Office prosecutions all too tragically show, a person challenging evidence derived from a computer will typically have no visibility of the system in question or the ways in which it could or did fail. As a result, they will not know what records of failures exist, should be disclosed to them or might be asked for.
This situation was illustrated in the Post Office prosecution of sub-postmaster Mrs Seema Misra. Paul Marshall, Mrs Misra’s defence lawyer, describes how she was
“taunted by the prosecution for being unable to point to any … identifiable … problem”,
while they hid behind the presumption that the Horizon system was “reliable” under the law. On four occasions during her prosecution, Mrs Misra applied for court orders requiring the Post Office to disclose Horizon error records. Three different judges dismissed her applications. Mrs Misra went to prison. She was eight weeks pregnant, and it was her son’s 10th birthday. On being sentenced, she collapsed.
The repeal of Section 69 of PACE 1984 reflects the Law Commission’s flawed belief that most computer errors were “down to the operator” or “apparent to the operator”, and that you could
“take as read that computer evidence is reliable unless a person can say otherwise”.
In the words of a colleague of mine from the University of Oxford, a professor of computing with a side consultancy specialising in finding bugs for global tech firms ahead of rollout, this assumption is “eye-wateringly mistaken”. He recently wrote to me and said:
“I have been asking fellow computer scientists for evidence that computers make mistakes, and have found that they are bewildered at the question since it is self-evident”.
There is an injustice in being told that a machine will always work as expected, and a further injustice in being told that the only way you can prove that it does not work is to ask by name for something that you do not know exists. That is to say, Mrs Misra did not have the magic word.
In discussions, the Government assert that the harm caused by Horizon was due to the egregious failures of corporate governance at the Post Office. That there has been a historic miscarriage of justice is beyond question, and the outcome is urgently awaited. But the actions of the Post Office were made possible in part because of a flaw in our legal and judicial processes. What happened at the Post Office is not an isolated incident but potentially the tip of an iceberg, where the safety of an unknown number of criminal convictions and civil judgments is called into question.
For example, an online English language test commissioned by the Home Office and run by the Educational Testing Service wrongly determined that 97% of the students who took it were cheating, a determination that cost the students their right to stay in the UK and/or their ability to graduate, forfeiting thousands of pounds in student fees. The Guardian conducted interviews with dozens of the students, who described the painful consequences. One man was held in UK immigration detention centres for 11 months. Others described being forced into destitution, becoming homeless and reliant on food banks as they attempted to challenge the accusation. Others became depressed and suicidal when confronted with the wasted tuition fees and the difficulty of shaking off an allegation of dishonesty.
The widespread coverage of the Horizon scandal has made many victims of the Home Office scandal renew their efforts to clear their names and seek redress. In another case, at the Princess of Wales Hospital in 2012, nurses were wrongly accused of falsifying patient records because of discrepancies found with computer records. Some of the nurses were subjected to criminal prosecution, suffering years of legal action before the trial collapsed, when it emerged that a visit by an engineer to fix a bug had eradicated all the data that the nurses were accused of failing to gather. That vital piece of information could easily have been discovered and disclosed, had computer evidence not been automatically deemed to be reliable.
It may have already done so, but I will certainly pass that on.
I thank everyone who spoke and the Minister for the offer of a meeting alongside his colleagues from the MoJ. I believe he will have a very busy diary between Committee and Report, based on the number of meetings we have agreed to.
However, I want to be very clear here. We have all recognised that the story of the Post Office sub-postmasters makes this issue clear, but it is not about the sub-postmasters. I commend the Government for what they are doing. We await the inquiry with urgent interest, and I am sure I speak for everyone in wishing the sub-postmasters a fair settlement—that is not in question. What is in question is the fact that we do not have unlimited Lord Arbuthnots to be heroic about all the other things that are about to happen. I took it seriously when he said not one moment longer: it could be tomorrow.
My Lords, I rise somewhat reluctantly to speak to Amendment 291 in my name. It could hardly be more important or necessary, but I am reluctant because I really think that the Minister, alongside his colleagues in DSIT and the Home Office, should have taken this issue up. I am quite taken aback that, despite my repeated efforts with both of those departments, they have not done so.
The purpose of the amendment is simple. It is already illegal in the UK to possess or distribute child sexual abuse material, including AI-generated or computer-generated child sexual abuse material. However, while the content is clearly covered by existing law, the mechanism that enables its creation—the files trained on or trained to create child sexual abuse material—is not. This amendment closes that gap.
Some time ago, I hosted an event at which members of OCCIT—the online child sexual exploitation and abuse covert intelligence team—gave a presentation to parliamentarians. For context, OCCIT is a law enforcement unit of the National Police Chiefs’ Council that uses covert police tactics to track down offender behaviour, with a view to identifying emerging risks in the form of new technologies, behaviours and environments. The presentation its officers gave concerned AI-generated abuse scenarios in virtual reality, and it was absolutely shattering for almost everyone who was present.
A few weeks later, the team contacted me and said that what it had shown then was already out of date. What it was now seeing was being supercharged by the ease with which criminals can train models that, when combined with general-purpose image-creation software, enable those with a sexual interest in children to generate CSAM images and videos at volume and—importantly—to order. Those building and distributing this software were operating with impunity, because current laws are insufficient to enable the police to take action against them.
In the scenarios that they are now facing, a picture of any child can be blended with existing child sexual abuse imagery, pornography or violent sexual scenarios. Images of several children can be honed into a fictitious child and used similarly or, as I will return to in a moment, a picture of an adult can be made to look younger and then used to create child sexual abuse. Among this catalogue of horrors are the made-to-order models trained using images of a child known to the perpetrator—a neighbour’s child or a family member—to create bespoke CSAM content. In short, the police were finding that the scale, sophistication and horror of violent child sexual abuse had hit a new level.
The laws that the police use to enforce against CSAM are Section 1 of the Protection of Children Act 1978 and Section 160 of the Criminal Justice Act 1988, both of which create offences in respect of indecent photographs or pseudophotographs of a child. AI content depicting child sexual abuse in the scenarios that I have just described is also illegal under the law, but creating and distributing the software models needed to generate it is not.
There are many services that allow anyone to take any public image and put it in a false situation. Although I have argued elsewhere that AI images should carry a mark of provenance, these services are not the subject of this amendment. This amendment is laser focused on criminalising AI models that are trained on or trained to create child sexual abuse material. They are specific, specialist and being traded with impunity. These models blend images of children—known children, stock photos, images scraped from social media or synthetic, fabricated AI depictions of children—with existing CSAM or pornography, and they allow paedophiles to generate bespoke CSAM scenarios.
I thank the noble Baroness, Lady Kidron, for tabling Amendment 291, which would create several new criminal offences relating to the use of AI to collect, collate and distribute child abuse images or to possess such images after they have been created. Nobody can dispute the intention behind this amendment.
We recognise the importance of this area. We will continue to assess whether and what new offences are needed to further bolster the legislation relating to child sexual abuse and AI, as part of our wider ongoing review of how our laws need to adapt to AI risks and opportunities. We need to get the answers to these complex questions right, and we need to ensure that we are equipping law enforcement with the capabilities and the powers needed to combat child sexual abuse. Perhaps, when I meet the noble Baroness, Lady Kidron, on the previous group, we can also discuss this important matter.
However, for now, I reassure noble Lords that any child sex abuse material, whether AI generated or not, is already illegal in the UK, as has been said. The criminal law is comprehensive with regard to the production and distribution of this material. For example, it is already an offence to produce, store or share any material that contains or depicts child sexual abuse, regardless of whether the material depicts a real child or not. This prohibition includes AI-generated child sexual abuse material and other pseudo imagery that may have been AI or computer generated.
We are committed to bringing to justice offenders who deliberately misuse AI to generate child sexual abuse material. We demonstrated this as part of the road to the AI Safety Summit, where we secured agreement from NGO, industry and international partners to take action to tackle AI-enabled child sexual abuse. The strongest protections in the Online Safety Act are for children, and all companies in scope of the legislation will need to tackle child sexual abuse material as a priority. Applications that use artificial intelligence will not be exempt and must incorporate robust guard-rails and safety measures to ensure that AI models and technology cannot be manipulated for child sexual abuse purposes.
Furthermore, I reassure noble Lords that the offence of taking, making, distributing and possessing with a view to distribution any indecent photograph or pseudophotograph of a child under the age of 18 carries a maximum sentence of 10 years’ imprisonment. Possession alone of indecent photographs or pseudophotographs of children can carry a maximum sentence of up to five years’ imprisonment.
However, I am not able to accept the amendment, as the current drafting would capture legitimate AI models that have been deliberately misused by offenders without the knowledge or intent of their creators to produce child sexual abuse material. It would also inadvertently criminalise individual users who possess perfectly legal digital files with no criminal intent, due to the fact that they could, when combined, enable the creation of child sexual abuse material.
I therefore ask the noble Baroness to withdraw the amendment, while recognising the strength of feeling and the strong arguments made on this issue and reiterating my offer to meet with her to discuss this ahead of Report.
I do not know how to express in parliamentary terms the depth of my disappointment, so I will leave that. Whoever helped the noble Viscount draft his response should be ashamed. We do not have a comprehensive system, and the police do not have the capability; they came to me after months of trying to get the Home Office to act, so that assertion is an untruth: the police do not have the capability.
I remind the noble Viscount that in previous debates his response on the bigger picture of AI has been to wait and see, but this is a here and now problem. As the noble Baroness, Lady Jones, set out, this would give purpose and reason—and here it is in front of us; we can act.
(9 months, 2 weeks ago)
Grand CommitteeMy Lords, I start today with probably the most innocuous of the amendments, which is that Clause 44 should not stand part. Others are more significant, but its purpose, if one can describe it as such, is as a probing clause stand part, to see whether the Minister can explain the real motive and impact of new Section 164A, which is inserted by Clause 44. As the explanatory statement says, it appears to hinder
“data subjects’ right to lodge complaints, and extends the scope of orders under Section 166 of the Data Protection Act to the appropriateness of the Commissioner’s response to a complaint.”
I am looking to the Minister to see whether he can unpack the reasons for that and what the impact is on data subjects’ rights.
More fundamental is Amendment 153, which relates to Clause 45. This provision inserts new Section 165A into the Data Protection Act, according to which the commissioner would have the discretion to refuse to act on a complaint if the complainant did not try to resolve the infringement of their rights with the relevant organisation and at least 45 days have passed since then. The right to an effective remedy constitutes a core element of data protection—most individuals will not pursue cases before a court, because of the lengthy, time-consuming and costly nature of judicial proceedings—and acts as a deterrent against data protection violations, in so far as victims can obtain meaningful redress. Administrative remedies are particularly useful, because they focus on addressing malpractice and obtaining meaningful changes in how personal data is handled in practice.
However, the ICO indicates that in 2021-22 it did not serve a single GDPR enforcement notice, secured no criminal convictions and issued only four GDPR fines, totalling just £633,000, despite the fact that it received over 40,000 data subject complaints. Moreover, avenues to challenge ICO inaction are extremely limited. Scrutiny by the information tribunal has been restricted to matters of a purely procedural, as opposed to substantive, nature. It was narrowed even further by the Administrative Court decision, which found that the ICO was not obliged to investigate each and every complaint.
Amendment 153 would remove Clause 45. The ICO already enjoys a wide margin of discretion and little accountability for how it handles complaints. In light of its poor performance, it does not seem appropriate to expand the discretion of the new information commission even further. It would also extend the scope of orders under Section 166 of the Data Protection Act to the appropriateness of the commissioner’s response to a complaint. This would allow individuals to seek judicial scrutiny of decisions that have a fundamental impact on how laws are enforced in practice, and it would increase the overall accountability of the new information commission.
We have signed Amendment 154, in the name of the noble Baroness, Lady Jones, and I look forward to hearing what she says on that. I apologise for the late tabling of Amendments 154A to 154F, which are all related to Amendments 155 and 175. Clause 47 sets out changes in procedure in the courts, in relation to the right of information of a data subject under the 2018 Act, but there are other issues that need resolving around the jurisdiction of the courts and the Upper Tribunal in data protection cases. That is the reason for tabling these amendments.
The High Court’s judgment in the Delo v ICO case held that part of the reasoning in Killock and Veale about the relative jurisdiction of the courts and tribunals was wrong. The Court of Appeal’s decision in the Delo case underlines concerns, but does not properly address the jurisdictional limits in Sections 166 and 167 of the 2018 Act, regarding the distinction between determining procedural failings or the merits of decisions by the ICO. Surely jurisdiction under these sections should be in either the courts or the tribunals, not both. In the view of many, including me, it should be in the tribunals. That is what these amendments seek.
It is clear from these two judgments that there was disagreement on the extent of the jurisdiction of tribunals and courts, notably between Mrs Justice Farbey and Mr Justice Mostyn. The commissioner made very different submissions to the Upper Tribunal, the High Court and the Court of Appeal in relation to the extent and limits of Sections 166 and 167. It is not at all clear what Parliament’s intentions were, when passing the 2018 Act, on the extents and limits of the powers in these sections and whether the appropriate source of redress is a court or tribunal.
This has resulted in jurisdictional confusion. A large number of claims have been brought in either the courts or the tribunals, under either Section 166 or Section 167, and the respective court or tribunal has frequently ruled that the claim should have been made under the other section and it therefore does not have jurisdiction, so that the claim is struck out. The Bill offers a prime opportunity to resolve this issue.
Clause 45(5), which creates new Section 166A, would only blur the lines even more and fortify the reasoning for the claim to be put into the tribunals, rather than the courts. These amendments would give certainty to the courts and tribunals as to their powers and would be much less confusing for litigants in person, most of whom do not have the luxury of paying hundreds of thousands in court fees. This itself is another reason for this to remain in the tribunals, which do not charge fees to issue proceedings.
The proposed new clause inserted by Amendment 287 would require the Secretary of State to exercise powers under Section 190 of the 2018 Act to allow public interest organisations to raise data protection complaints on behalf of individuals generally, without the need to obtain the authorisation of each individual being represented. It would therefore implement Article 80(2) of the GDPR, which provides:
“Member States may provide that any body, organisation or association referred to in paragraph 1 of this Article, independently of a data subject’s mandate, has the right to lodge, in that Member State, a complaint with the supervisory authority which is competent pursuant to Article 77 and to exercise the rights referred to in Articles 78 and 79 if it considers that the rights of a data subject under this Regulation have been infringed as a result of the processing”.
The intention behind Article 80(2) is to allow appropriately constituted organisations to bring proceedings concerning infringements of the data protection regulations in the absence of the data subject. That is to ensure that proceedings may be brought in response to an infringement, rather than on the specific facts of an individual’s case. As a result, data subjects are, in theory, offered greater and more effective protection of their rights. Actions under Article 80(2) could address systemic infringements that arise by design, rather than requiring an individual to evidence the breaches and the specific effects to them.
At present, an affected individual—a data subject—is always required to bring a claim or complaint to a supervisory authority. Whether through direct action or under Section 187 of the 2018 Act, a data subject will have to be named and engaged. In practice, a data subject is not always identifiable or willing to bring action to address even the most egregious conduct.
Article 80(2) would fill a gap that Article 80(1) and Section 187 of the Data Protection Act are not intended to fill. Individuals can be unwilling to seek justice, exercise their rights and lodge data protection complaints on their own, either for fear of retaliation from a powerful organisation or because of the stigma that may be associated with the matter where a data protection violation occurred. Even a motivated data subject may be unwilling to take action due to the risks involved. For instance, it would be reasonable for that data subject not to want to become involved in a lengthy, costly legal process that may be disproportionate to the loss suffered or remedy available. This is particularly pressing where the infringement concerns systemic concerns rather than where an individual has suffered material or non-material damage as a result of the infringement.
Civil society organisations have long helped complainants navigate justice systems in seeking remedies in the data protection area, providing a valuable addition to the enforcement of UK data protection laws. My Amendment 287 would allow public interest organisations to lodge representative complaints, even without the mandate of data subjects, to encourage the filing of well-argued, strategically important cases with the potential to improve significantly the data subject landscape as a whole. This Bill is the ideal opportunity for the Government to implement fully Article 80(2) of the GDPR into UK law and plug a significant gap in the protection of UK citizens’ privacy.
In effect, this is unfinished business from our debates on the 2018 Act, when we made several attempts to persuade the Government of the merits of introducing the rights under Article 80(2). I hope that the Government will think again. These are extremely important rights and are available in many other countries governed by a similar GDPR. I beg to move.
My Lords, as a veteran of the 2018 arguments on Article 80(2), I rise in support of Amendment 287, which would see its implementation.
Understanding and exercising personal data rights is not straightforward. Even when the rights are being infringed, it is rare that an individual data subject has the time, knowledge or ability to make a complaint to the ICO. This is particularly true for vulnerable groups, including children and the elderly, disadvantaged groups and other groups of people, such as domestic abuse survivors or members of the LGBTQ community, who may have specific reasons for not identifying themselves in relation to a complaint. It is a principle in law that a right that cannot be activated is not fully given.
A data subject’s ability to claim protection is constrained by a range of factors, none of which relates to the validity of their complaint or the level of harm experienced. Rather, the vast majority are prevented from making a complaint by a lack of expertise, capacity, time and money; by the fact that they are not aware that they have data rights; or by the fact that they understand neither that their rights have been infringed nor how to make a complaint about them.
I have considerable experience of this. I remind the Committee that I am chair of the 5Rights Foundation, which has raised important and systemic issues of non-compliance with the AADC. It has done this primarily by raising concerns with the ICO, which has then undertaken around 40 investigations based on detailed submissions. However, because the information is not part of a formalised process, the ICO has no obligation to respond to the 5Rights Foundation team, the three-month time limit for complaints does not apply and, even though forensic work by the 5Rights Foundation identified the problem, its team is not consulted or updated on progress or the outcome—all of which would be possible had it submitted the information as a formal complaint. I remind the Committee that in these cases we are talking about complaints involving children.
I thank the noble Lord; that is an important point. The question is: how does the Sorting Hat operate to distribute cases between the various tribunals and the court system? We believe that the courts have an important role to play in this but it is about how, in the early stages of a complaint, the case is allocated to a tribunal or a court. I can see that more detail is needed there; I would be happy to write to noble Lords.
Before we come to the end of this debate, I just want to raise something. I am grateful to the Minister for offering to bring forward the 2021 consultation on Article 80(2)—that will be interesting—but I wonder whether, as we look at the consultation and seek to understand the objections, the Government would be willing to listen to our experiences over the past two or three years. I know I said this on our previous day in Committee but there is, I hope, some point in ironing out some of the problems of the data regime that we are experiencing in action. I could bring forward a number of colleagues on that issue and on why it is a blind spot for both the ICO and the specialist organisations that are trying to bring systemic issues to its attention. It is very resource-heavy. I want a bit of goose and gander here: if we are trying to sort out some of the resourcing and administrative nightmares in dealing with the data regime, from a user perspective, perhaps a bit of kindness could be shown to that problem as well as to the problem of business.
I would be very happy to participate in that discussion, absolutely.
(9 months, 2 weeks ago)
Grand CommitteeMy Lords, I support Amendment 135 in the name of the noble Lord, Lord Bethell, to which I have added my name. He set out our struggle during the passage of the Online Safety Bill, when we made several attempts to get something along these lines into the Bill. It is worth actually quoting the Minister, Paul Scully, who said at the Dispatch Box in the other place:
“we have made a commitment to explore this … further and report back to the House in due course on whether further measures to support researcher access to data are required and, if so, whether they could also be implemented through other legislation such as the Data Protection and Digital Information Bill”.—[Official Report, Commons, 12/9/23; col. 806.]
When the Minister responds, perhaps he could update the House on that commitment and explain why the Government decided not to address it in the Bill. Although the Bill proposes a lessening of the protections on the use of personal data for research done by commercial companies, including the development of products and marketing, it does nothing to enable public interest research.
I would like to add to the list that the noble Lord, Lord Bethell, started: as well as Melanie Dawes, the CEO of Ofcom, the United States National Academy of Sciences, the Lancet commission, the UN advisory body on AI, the US Surgeon General, the Broadband Commission and the Australian eSafety Commissioner have all in the last few months called for greater data access for independent research.
I ask the noble Viscount to explain the Government’s thinking in detail, and I really do hope that we do not get more “wait and see”, because it does not meet the need. We have already passed online safety legislation that requires evidence and, by denying access to independent researchers, we have created a perverse situation in which the regulator has to turn to the companies it is regulating for the evidence to create its codes, which, as the noble Viscount will appreciate, is a formula for the tech companies to control the flow of evidence and unduly temper the intent of the legislation. I wish to make most of my remarks on that subject.
In Ofcom’s consultation on its illegal harms code, the disparity between the harms identified and Ofcom’s proposed code caused deep concern. Volume 4 states the following at paragraph 14.12 in relation to content moderation:
“We are not proposing to recommend some measures which may be effective in reducing risks of harm. This is principally due to currently limited evidence”.
Further reading of volume 4 confirms that the lack of evidence is the given reason for failing to recommend measures across a number of harms. Ofcom has identified harms for which it does not require mitigation. This is not what Parliament intended and spectacularly fails to deliver on the promises made by Ministers. Ofcom can use its information-gathering powers to build evidence on the efficacy required to take a bolder approach to measures but, although that is welcome, it is unsatisfactory for many reasons.
First, given the interconnectedness between privacy, safety, security and competition, regulatory standards cannot be developed in silo. We have a thriving academic community that can work across different risks and identify solutions across different parts of the tech ecosystem.
Secondly, a regulatory framework in which standards are determined exclusively through private dialogue between the regulator and the regulated does not have the necessary transparency and accountability to win public trust.
Thirdly, regulators are overstretched and under-resourced. Our academics stand ready and willing to work in the public interest and in accordance with the highest ethical standards in order to scrutinise and understand the data held so very closely by tech companies, but they need a legal basis to demand access.
Fourthly, if we are to maintain our academic institutions in a post-Brexit world, we need to offer UK academics the same support as those in Europe. Article 40(4) of the European Union’s Digital Services Act requires platforms to
“provide access to data to vetted researchers”
seeking to carry out
“research that contributes to the detection, identification and understanding of systemic risks in the Union, as set out pursuant to Article 34(1), and to the assessment of the adequacy, efficiency and impacts of the risk mitigation measures pursuant to Article 35”.
It will be a considerable loss to the UK academic sector if its European colleagues have access to data that it does not.
Fifthly, by insisting on evidence but not creating a critical pathway to secure it, the Government have created a situation in which the lack of evidence could mean that Ofcom’s codes are fixed at what the tech companies tell it is possible in spring 2024, and will always be backward-looking. There is considerable whistleblower evidence revealing measures that the companies could have taken but chose not to.
I have considerable personal experience of this. For example, it was nearly a decade ago that I told Facebook that direct messaging on children’s accounts was dangerous, yet only now are we beginning to see regulation reflecting that blindingly obvious fact. That is nearly a decade in which something could have been done by the company but was not, and of which the regulator will have no evidence.
Finally, as we discussed on day one in Committee, the Government have made it easier for commercial companies to use personal data for research by lowering the bar for the collection of data and expanding the concept of research, further building the asymmetry that has been mentioned in every group of amendments we have debated thus far. It may not be very parliamentary language, but it is crazy to pass legislation and then obstruct its implementation by insisting on evidence that you have made it impossible to gather.
I would be grateful if the Minister could answer the following questions when he responds. Is it the Government’s intention that Ofcom codes be based entirely on the current practice of tech companies and that the regulator can demand only mitigations that exist currently, as evidenced by those companies? Do the Government agree that whistleblowers, NGO experts and evidence from user experience can be taken by regulators as evidence of what could or should be done? What route do the Government advise Ofcom to take to mitigate identified risks for which there are no current measures in place? For example, should Ofcom describe the required outcome and leave it to the companies to determine how they mitigate the risk, should it suggest mitigations that have been developed but not tried—or is the real outcome of the OSA to identify risk and leave that risk in place?
Do the Government accept that EU research done under the auspices of the DSA should be automatically considered as an adequate basis for UK regulators where the concerns overlap with UK law? Will the new measures announced for testing and sandboxing of AI models allow for independent research, in which academics, independent of government or tech, will have access to data? Finally, what measures will the Government take to mitigate the impact on universities of a brain drain of academics to Europe, if we do not provide equivalent legislative support to enable them to access the data required to study online safety and privacy? If the Minister is unable to answer me from the Dispatch Box, perhaps he will agree to write to me and place his letter in the Library for other noble Lords to read.
My Lords, there is little for me to say. The noble Lord, Lord Bethell, and the noble Baroness, Lady Kidron, have left no stone unturned in this debate. They introduced this amendment superbly, and I pay tribute to them and to Reset, which was with us all the way through the discussions on online harms at the Joint Committee on the draft Online Safety Bill, advocating for these important provisions.
As the noble Lord, Lord Bethell, said, there is a strong body of opinion out there. Insight from what might be called approved independent researchers would enable policy-making and regulatory innovation to keep pace with emerging trends and threats, which can span individual harms, matters of public safety and even national security. We have seen the kinds of harms taking place in social media, and it is absolutely vital that we understand what is happening under the bonnet of social media. It is crucial in detecting, identifying and understanding the systemic risks of online harms and non-compliance with law.
When we discussed the Online Safety Bill, it was a question of not just content but functionality. That was one of the key things. An awful lot of this research relates to that: how algorithms operate in amplifying content and some of the harms taking place on social media. The noble Lord, Lord Bethell, referred to X closing its API for researchers and Meta’s move to shut CrowdTangle. We are going into reverse, whereas we should be moving forward in a much more positive way. When the Online Safety Bill was discussed, we got the review from Ofcom, but we did not get the backup—the legislative power for Ofcom or the ICO to be able to authorise and accredit researchers to carry out the necessary research.
The Government’s response to date has been extremely disappointing, given the history behind this and the pressure and importance of this issue. This dates from discussions some way back, even before the Joint Committee met and heard the case for this kind of researcher access. This Bill is now the best vehicle by which to introduce a proper regime on access for researchers. As the noble Baroness, Lady Kidron, asked, why, having had ministerial assurances, are we not seeing further progress? Are we just going to wait until Ofcom produces its review, which will be at the tail end of a huge programme of work which it has to carry out in order to implement the Online Safety Act?
My Lords, as ever, many thanks to all noble Lords who spoke in the debate.
Amendment 135, tabled by my noble friend Lord Bethell, would enable researchers to access data from data controllers and processors in relation to systemic risks to the UK and non-compliance with regulatory law. The regime would be overseen by the ICO. Let me take this opportunity to thank both my noble friend for the ongoing discussions we have had and the honourable Members in the other place who are also interested in this measure.
Following debates during the passage of the Online Safety Act, the Government have been undertaking further work in relation to access to data for online safety researchers. This work is ongoing and, as my noble friend Lord Bethell will be aware, the Government are having ongoing conversations on this issue. As he knows, the online safety regime is very broad and covers issues that have an impact on national security and fraud. I intend to write to the Committee with an update on this matter, setting out our progress ahead of Report, which should move us forward.
While we recognise the benefits of improving researchers’ access to data—for example, using data to better understand the impact of social media on users—this is a highly complex issue with several risks that are not currently well understood. Further analysis has reiterated the complexities of the issue. My noble friend will agree that it is vital that we get this right and that any policy interventions are grounded in the evidence base. For example, there are risks in relation to personal data protection, user consent and the disclosure of commercially sensitive information. Introducing a framework to give researchers access to data without better understanding these risks could have significant consequences for data security and commercially sensitive information, and could potentially destabilise any data access regime as it is implemented.
In the meantime, the Online Safety Act will improve the information available to researchers by empowering Ofcom to require major providers to publish a broad range of online safety information through annual transparency reports. Ofcom will also be able to appoint a skilled person to undertake a report to assess compliance or to develop its understanding of the risk of non-compliance and how to mitigate it. This may include the appointment of independent researchers as skilled persons. Further, Ofcom is required to conduct research into online harms and has the power to require companies to provide information to support this research activity.
Moving on to the amendment specifically, it is significantly broader than online safety and the EU’s parallel Digital Services Act regime. Any data controllers and processors would be in scope if they have more than 1 million UK users or customers, if there is a large concentration of child users or if the service is high-risk. This would include not just social media platforms but any organisation, including those in financial services, broadcasting and telecoms as well as any other large businesses. Although we are carefully considering international approaches to this issue, it is worth noting that much of the detail about how the data access provisions in the Digital Services Act will work in practice is yet to be determined. Any policy interventions in this space should be predicated on a robust evidence base, which we are in the process of developing.
The amendment would also enable researchers to access data to research systemic risks to compliance with any UK regulatory law that is upheld by the ICO, Ofcom, the Competition and Markets Authority, and the Financial Conduct Authority. The benefits and risks of such a broad regime are not understood and are likely to vary across sectors. It is also likely to be inappropriate for the ICO to be the sole regulator tasked with vetting researchers across the remits of the other regulators. The ICO may not have the necessary expertise to make this determination about areas of law that it does not regulate.
Ofcom already has the power to gather information that it requires for the purpose of exercising its online safety functions. This power applies to companies in scope of the duties and, where necessary, to other organisations or persons who may have relevant information. Ofcom can also issue information request notices to overseas companies as well as to UK-based companies. The amendment is also not clear about the different types of information that a researcher may want to access. It refers to data controllers and processors—concepts that relate to the processing of personal data under data protection law—yet researchers may also be interested in other kinds of data, such as information about a service’s systems and processes.
Although the Government continue to consider this issue—I look forward to setting out our progress between now and Report—for the reasons I have set out, I am not able to accept this amendment. I will certainly write to the Committee on this matter and to the noble Baroness, Lady Kidron, with a more detailed response to her questions—there were more than four of them, I think—in particular those about Ofcom.
Perhaps I could encourage the Minister to say at least whether he is concerned that a lack of evidence might be impacting on the codes and powers that we have given to Ofcom in order to create the regime. I share his slight regret that Ofcom does not have this provision that is in front of us. It may be that more than one regulator needs access to research data but it is the independents that we are talking about. We are not talking about Ofcom doing things and the ICO doing things. We are talking about independent researchers doing things so that the evidence exists. I would like to hear just a little concern that the regime is suffering from a lack of evidence.
I am thinking very carefully about how best to answer. Yes, I do share that concern. I will set this out in more detail when I write to the noble Baroness and will place that letter in the House of Lords Library. In the meantime, I hope that my noble friend will withdraw his amendment.
My Lords, I will speak to Amendments 142, 143 and 150 in my name, and I thank other noble Lords for their support.
We have spent considerable time across the digital Bills—the online safety, digital markets and data Bills—talking about the speed at which industry moves and the corresponding need for a more agile regulatory system. Sadly, we have not really got to the root of what that might look like. In the meantime, we have to make sure that regulators and Governments are asked to fulfil their duties in a timely manner.
Amendment 142 sets a timeframe of 18 months for the creation of codes under the Act. Data protection is a mature area of regulatory oversight, and 18 months is a long time for people to wait for the benefits that accrue to them under legislation. Similarly, Amendment 143 ensures that the transition period from the code being set to it being implemented is no more than 12 months. Together, that amounts to two and a half years. In future legislation on digital matters, I would like to see a very different approach that starts with the outcome and gives companies 12 months to comply, in any way they like, to ensure that outcome. But while we remain in the world of statutory code creation, it must be bound by a timeframe.
I have seen time and again, after the passage of a Bill, Parliament and civil society move on, including Ministers and key officials—as well as those who work at the regulator—and codes lose their champions. It would be wonderful to imagine that matters progress as intended, but they do not. In the absence of champions, and without ongoing parliamentary scrutiny, codes can languish in the inboxes of people who have many calls on their time. Amendments 142 and 143 simply mirror what the Government agreed to in the OSA—it is a piece of good housekeeping to ensure continuity of attention.
I am conscious that I have spent most of my time highlighting areas where the Bill falls short, so I will take a moment to welcome the reporting provisions that the Government have put forward. Transparency is a critical aspect of effective oversight, and the introduction of an annual report on regulatory action would be a valuable source of information for all stakeholders with an interest in understanding the work of the ICO and its impact.
Amendment 150 proposes that those reporting obligations also include a requirement to provide details of all activities carried out by the Information Commissioner to support, strengthen and uphold the age-appropriate design code. It also proposes that, when meeting its general reporting obligations, it should provide the information separately for children. The ICO published an evaluation of the AADC as a one-off in March 2023 and its code strategy on 3 April this year. I recognise the effort that the commissioner has made towards transparency, and the timing of his report indicates that reporting on children specifically is something that the ICO sees as relevant and useful. However, neither of those is sufficient in terms of the level of detail provided, the reporting cadence or the focus on impact rather than the efforts that the ICO has made.
There are many frustrations for those of us who spend our time advocating for children’s privacy and safety. Among them is having to try to extrapolate child-specific data from generalised reporting. When it is not reported separately, it is usually to hide inadequacies in the level of protection afforded to children. For example, none of the community guidelines enforcement reports published for Instagram, YouTube, TikTok or Snap provides a breakdown of the violation rate data by age group, even though this would provide valuable information for academics, Governments, legislators and NGOs. Amendment 150 would go some way to addressing this gap by ensuring that the ICO is required to break down its reporting for children.
Having been momentarily positive, I would like to put on the record my concerns about the following extract from the email that accompanied the ICO’s children’s code strategy of 2 April. Having set out the very major changes to companies that the code has ushered in and explained how the Information Commissioner would spend the next few months looking at default settings, geolocation, profiling, targeting children and protecting under-13s, the email goes on to say:
“With the ongoing passage of the bill, our strategy deliberately focusses in the near term on compliance with the current code. However, once we have more clarity on the final version of the bill we will of course look to publicly signal intentions about our work on implementation and children’s privacy into the rest of the year and beyond”.
The use of the phrase “current code”, and the fact that the ICO has decided it is necessary to put its long-term enforcement strategy on hold, contradict government assurances that standards will remain the same.
The email from the ICO arrived in my inbox on the same day as a report from the US Institute of Digital Media and Child Development, which was accompanied by an impact assessment on the UK’s age-appropriate design code. It stated:
“The Institute’s review identifies an unprecedented wave of … changes made across leading social media and digital platforms, including YouTube, TikTok, Snapchat, Instagram, Amazon Marketplace, and Google Search. The changes, aimed at fostering a safer, more secure, and age-appropriate online environment, underscore the crucial role of regulation in improving the digital landscape for children and teens”.
In June, the Digital Futures Commission will be publishing a similar report written by the ex-Deputy Information Commissioner, Steve Wood, which has similarly positive but much more detailed findings. Meanwhile, we hear the steady drumbeat of adoption of the code in South America, Australia and Asia, and in additional US states following California’s lead. Experts in both the US and here in the UK evidence that this is a regulation that works to make digital services safer and better for children.
I therefore have to ask the Minister once again why the Government are downgrading child protection. If he, or those in the Box advising him, are even slightly tempted to say that they are not, I ask that they reread the debates from the last two days in Committee, in which the Government removed the balancing test for automated decision-making and the Secretary of State’s powers were changed to have regard to children rather than to mandate child protections. The data impact assessment provisions have also been downgraded, among the other sleights of hand that diminish the AADC.
The ICO has gone on record to say that it has put its medium to long-term enforcement strategy on hold, and the Minister’s letter sent on the last day before recess says that the AADC will be updated to reflect the Bill. I would like nothing more than a proposal from the Government to put the AADC back on a firm footing. I echo the words said earlier by the noble Baroness, Lady Jones, that it is time to start talking and stop writing. I am afraid that, otherwise, I will be tabling amendments on Report that will test the appetite of the House for protecting children online. In the meantime, I hope the Minister will welcome and accept the very modest proposals in this group.
My Lords, as is so often the case on this subject, I support the noble Baroness, Lady Kidron, and the three amendments that I have added my name to: Amendments 142, 143 and 150. I will speak first to Amendments 142 and 143, and highlight a couple of issues that the noble Baroness, Lady Kidron, has already covered.
I thank the noble Lord, Lord Clement-Jones, the noble Baroness, Lady Kidron, and other noble Lords who have tabled and signed amendments in this group. I also observe what a pleasure it is to be on a Committee with Batman and Robin—which I was not expecting to say, and which may be Hansard’s first mention of those two.
The reforms to the Information Commissioner’s Office within the Bill introduce a strategic framework of objectives and duties to provide context and clarity on the commissioner’s overarching objectives. The reforms also put best regulatory practice on to a statutory footing and bring the ICO’s responsibilities into line with that of other regulators.
With regard to Amendment 138, the principal objective upholds data protection in an outcomes-focused manner that highlights the discretion of the Information Commissioner in securing those objectives, while reinforcing the primacy of data protection. The requirement to promote trust and confidence in the use of data will encourage innovation across current and emerging technologies.
I turn now to the question of Clause 32 standing part. As part of our further reforms, the Secretary of State can prepare a statement of strategic priorities for data protection, which positions these aims within its wider policy agenda, thereby giving the commissioner helpful context for its activities. While the commissioner must take the statement into account when carrying out functions, they are not required to act in accordance with it. This means that the statement will not be used in a way to direct what the commissioner may and may not do when carrying out their functions.
Turning to Amendment 140, we believe that the commissioner should have full discretion to enforce data protection in an independent, flexible, risk-based and proportionate manner. This amendment would tie the hands of the regulator and force them to give binding advice and proactive assurance without necessarily full knowledge of the facts, undermining their regulatory enforcement role.
In response to the amendments concerning Clauses 33 to 35 standing part, I can say that we are introducing a series of measures to increase accountability, robustness and transparency in the codes of practice process, while safeguarding the Information Commissioner’s role. The requirements for impact assessments and a panel of experts mean that the codes will consider the application to, and impact on, all potential use cases. Given that the codes will have the force of law, the Secretary of State must have the ability to give her or his comments. The Information Commissioner is required to consider but not to act on those comments, preserving the commissioner’s independence. It remains for Parliament to give approval for any statutory code produced.
Amendments 142 and 143 impose a requirement on the ICO to prepare codes and for the Secretary of State to lay them in Parliament as quickly as practicable. They also limit the time that transitional provisions can be in place to a maximum of 12 months. This could mean that drafting processes are truncated or valid concerns are overlooked to hit a statutory deadline, rather than the codes being considered properly to reflect the relevant perspectives.
Given the importance of ensuring that any new codes are robust, comprehensive and considered, we do not consider imposing time limits on the production of codes to be a useful tool.
Finally, Amendment 150—
We had this debate during the passage of the Online Safety Act. In the end, we all agreed—the House, including the Government, came to the view—that two and a half years, which is 18 months plus a transition period, was an almost egregious amount of time considering the rate at which the digital world moves. So, to consider that more than two and a half years might be required seems a little bit strange.
I absolutely recognise the need for speed, and my noble friend Lady Harding made this point very powerfully as well, but what we are trying to do is juggle that need with the need to go through the process properly to design these things well. Let me take it away and think about it more, to make sure that we have the right balancing point. I very much see the need; it is a question of the machinery that produces the right outcome in the right timing.
Before the Minister sits down, I would very much welcome a meeting, as the noble Baroness, Lady Harding, suggested. I do not think it is useful for me to keep standing up and saying, “You are watering down the code”, and for the Minister to stand up and say, “Oh no, we’re not”. We are not in panto here, we are in Parliament, and it would be a fantastic use of all our time to sit down and work it out. I would like to believe that the Government are committed to data protection for children, because they have brought forward important legislation in this area. I would also like to believe that the Government are proud of a piece of legislation that has spread so far and wide—and been so impactful—and that they would not want to undermine it. On that basis, I ask the Minister to accede to the noble Baroness’s request.
I am very happy to try to find a way forward on this. Let me think about how best to take this forward.
My Lords, Amendment 146 is in my name and those of the noble Lord, Lord Clement-Jones, and the noble Baronesses, Lady Harding and Lady Jones; I thank them all for their support. Before I set out the amendment that would provide a code of practice for edtech and why it is so urgently required, I thank the noble Baroness, Lady Barran, and officials in the Department for Education for their engagement on this issue. I hope the Minister can approach this issue with the same desire they have shown to fill the gap that it seeks to address.
A child does not have a choice about whether they go to school. Apart from the minority who are homeschooled or who, for reasons of health or development, fall outside the education system, it is compulsory. The reason I make this point at the outset is that, if school is compulsory, it must follow that a child should enjoy the same level of privacy and safety at school as they do in any other environment. Yet we have allowed a gap in our data legislation, meaning that a child's data is unprotected at school, while at the same time we have invested in an unregulated and uncertified edtech market whose promises of learning outcomes range from the unsubstantiated to the false.
Schools are keen to adopt new technologies and say that they feel pressure to do so. In both cases, they lack the knowledge and time to assess the privacy and safety risks of the technology products that they are being sold. Amendment 146 would enable children and schools to benefit from emerging technologies. It would reduce the burden on schools in ensuring compliance so that they can get on with the job of teaching our children in a safe, developmentally appropriate and rights-respecting environment, and it would deal with companies that fail to provide evidence for their products and routinely exploit the complexity of data protection law to children’s detriment. In sum, the amendment brings forward a code of conduct for edtech.
Subsections (1) and (2) would require the ICO to bring forward a data code for edtech and tech used in education settings. In doing so, the commissioner would be required to consider children's fundamental rights, as set out in the Convention on the Rights of the Child, and their relevance to the digital world, as adopted by the Committee on the Rights of the Child in general comment 25 in 2021. The commissioner would have to consider the fact that children are legally entitled to a higher standard of protection in respect of their personal data than adults. In keeping with other data codes, the amendment also sets out whom the ICO must consult when preparing the code, including children, parents and teachers, as well as edtech companies.
Subsection (3) would require edtech companies to provide schools with transparent information about their data-processing practices and their impact on children. This is of particular importance because the department’s own consultation showed that schools are struggling to understand the implications of being a data controller and most often accept the default settings of products and services. Having a code of conduct would allow the Information Commissioner not only to set the standards in subsections (1) and (2) but to insist on the way that information is given in order to support schools to make the right choices for their pupils.
Subsection (4) would allow schools to use edtech providers’ adherence to the code as proof of fulfilling their own data protection duties. Once again, this would alleviate the burden on teachers and school leaders.
Subsection (5) would simply give the commissioner a role in supporting a certification scheme to enable the industry to demonstrate both the compliance of edtech services and products with the UK GDPR and conformity with the age-appropriate design code of practice and the edtech code of practice. The IEEE Standards Association and For Humanity have published certification standards for the AADC, but these have not yet been approved by the ICO or UKAS. Subsection (5) would act as a catalyst, ensuring that the ICO and the certification partners work together efficiently. Ultimately, schools will respond better to certification than to pure data law.
If the edtech sector were formally in scope of the AADC and it were robustly applied, that would do some, though not all, of what the amendment seeks to do. But in 2018, Her Majesty's Government, as they were then, made the decision that schools are responsible for children and that the AADC would be confusing. I am not sure whether the Government of the day did not understand the AADC. It requires companies to offer children privacy by design and default. Nothing in the code would have infringed—or will infringe—on a school's safeguarding duties, but leaving schools out of scope leaves teachers or school data protection officers with vast responsibilities for wilfully leaky products that simply should not fall to them. Many in this House thought that the Government were wrong, and since then we have seen grand abuse of the gap that was created. This is an opportunity to put that error right.
I am grateful, as ever, to the noble Baroness, Lady Kidron, for both Amendment 146 and her continued work in championing the protection of children.
Let me start by saying that the Government strongly agree with the noble Baroness that all providers of edtech services must comply with the law when collecting and making decisions about the use of children’s data throughout the duration of their processing activities. That said, I respectfully submit that this amendment is not necessary, for the reasons I shall set out.
The ICO already has existing codes and guidance for children and has set out guidance about how the children’s code, data protection and e-privacy legislation apply to edtech providers. Although the Government recognise the value that ICO codes can have in promoting good practice and improving compliance, they do not consider that it would be appropriate to add these provisions to the Bill without further detailed consultation with the ICO and the organisations likely to be affected by them.
The guidance covers broad topics, including choosing a lawful basis for the processing; rules around information society services; targeting children with marketing; profiling children or making automated decisions about them; data sharing; children’s data rights; and exemptions relating to children’s data. Separately, as we have discussed throughout this debate, the age-appropriate design code deals specifically with the provision of online services likely to be accessed by children in the UK; this includes online edtech services. I am pleased to say that the Department for Education has begun discussions with commercial specialists to look at strengthening the contractual clauses relating to the procurement of edtech resources to ensure that they comply with the standards set out in the UK GDPR and the age-appropriate design code.
On the subject of requiring the ICO to develop a report with the edtech sector, with a view to creating a certification scheme and assessing compliance and conformity with data protection, we believe that such an approach should be at the discretion of the independent regulator.
The issues that have been raised in this very good, short debate are deeply important. Edtech is an issue that the Government are considering carefully—especially the Department for Education, given the increasing time spent online for education. I note that the DPA 2018 already contains a power for the Secretary of State to request new codes of practice, which could include one on edtech if the evidence warranted it. I would be happy to return to this in future but consider the amendment unnecessary at this time. For the reasons I have set out, I am not able to accept the amendment and hope that the noble Baroness will withdraw it.
I thank everyone who spoke, particularly for making it absolutely clear that not one of us, including myself, is against edtech. We just want it to be fair and want the rules to be adequate.
I am particularly grateful to the noble Baroness, Lady Jones, for detailing what education data includes. It might feel as though it is just about someone's exam results or something that might already be public, but it can include things such as how often they go to see the nurse, what their parents' immigration status is or whether they are late. There is a lot of information here quite apart from the personalised education provision to which the noble Baroness referred. In fact, we have a great deal of emerging evidence that much of that provision has no pedagogical basis. There is also the question of huge investment right across the sector in products that we do not fully understand. I thank the noble Baroness for that.
As to the Minister’s response, I hope that he will forgive me for being disappointed. I am grateful to him for reminding us that the Secretary of State has that power under the DPA 2018. I would love for her to use that power but, so far, it has not been forthcoming. The evidence we saw from the freedom of information request is that the scheme the department wanted to put in place has been totally retracted—and clearly for resource reasons rather than because it is not needed. I find it quite surprising that the Minister can suggest that it is all gung ho here in the UK but that Germany, Holland, France, et cetera are being hysterical in regard to this issue. Each one of them has found it to be egregious.
Finally, the AADC applies only to information society services; there is an exception for education. Where they are joint controllers, they are outsourcing the problems to the schools, which have no level of expertise in this and just take default settings. It is not good enough, I am afraid. I feel bound to say this: I understand the needs of parliamentary business, which puts just a handful of us in this Room to discuss things out of sight, but, if the Government are not willing to protect children's data at school, when schools are in loco parentis to our children, I am really bewildered as to what this Bill is for. Education is widely understood to be a social good, but we are downgrading the data protections for children and rejecting every single positive move that anybody has made in Committee. I beg leave to withdraw my amendment but I will bring this back on Report.
(10 months ago)
Grand Committee
My Lords, once more into the trenches we go before Easter. In moving Amendment 53, I will also speak to Amendments 54, 55, 57, 69, 70, 71 and 72 and the Clause 14 stand part notice.
The Bill contains a number of wide delegated powers, giving the Secretary of State the power to amend the UK GDPR via statutory instrument. The Government have said that the UK GDPR’s key elements remain sound and that they want to continue to offer a high level of protection for the public’s data, but that is no guarantee against significant reforms being brought in through a process that eludes full parliamentary scrutiny through primary legislation. Proposed changes to the UK GDPR should be contained in the Bill, where they can be debated and scrutinised properly via the primary legislation process. As it stands, key provisions of the UK GDPR can subsequently be amended via statutory instrument, which, in this case, is an inappropriate legislative process that affords much less scrutiny and debate, if debates are held at all.
The UK GDPR treats a solely automated decision as one without "meaningful human involvement". The public are protected from being subject to solely automated decision-making where the decision has a legal or "similarly significant effect". Clause 14(1) inserts new Article 22D(1) into the UK GDPR, which allows the Secretary of State to make regulations that deem a decision to have involved "meaningful human involvement", even if there was no active review by a human decision-maker. New Article 22D(2) similarly allows the Secretary of State to make regulations to determine whether a decision made had a "similarly significant effect" to a legal effect. For example, in summer 2020 there was the A-level algorithm grading scandal. If something like that were to reoccur, under this new power a Minister could lay regulations stating that the decision to use an algorithm in grading A-levels was not a decision with a "similarly significant effect".
New Article 22D(4) also allows the Secretary of State to add or remove, via regulations, any of the listed safeguards for automated decision-making. If the Government wish to amend or remove safeguards on automated decision-making, that should also be specified in the Bill and not left to delegated legislation. Amendments 53 to 55 and 69 to 72 would limit the Secretary of State’s power, so that they may add safeguards but cannot vary or remove those in the new Article 22D, as they stand, when the legislation comes into force.
If the clause is to be retained, we support Amendment 59A in the name of the noble Lord, Lord Holmes, which requires the Information Commissioner’s Office to develop guidance on the interpretation of the safeguards in new Article 22C and on important terms such as “similarly significant effect” and “meaningful human involvement”. It is within the Information Commissioner’s Office’s duties to issue guidance and to harmonise the interpretation of the law. As the dedicated regulator, the ICO is best placed and equipped to publish guidance and ensure consistency of application.
As a way to increase protections and incorporate more participation from those affected, Amendment 59A would add a new paragraph (7) to new Article 22D, which specifies that the Secretary of State needs to consult with the Information Commissioner’s Office if developing regulations. It also includes an obligation for the Secretary of State to consult with data subjects or their representatives, such as trade union or civil society organisations, at least every two years from the commencement of the Bill.
Our preference is for Clause 14 not to stand part of the Bill. The deployment of automated decision-making under Clause 14 risks automating harm, including discrimination, without adequate safeguards. Clause 14 creates a new starting point for all ADM using personal, but not special category, data. It is allowed, including for profiling, provided that certain safeguards are in place. The Minister said those safeguards are “appropriate” and “robust” and provide “certainty”, but I preferred what the noble Lord, Lord Bassam, said about the clause:
“We need more safeguards. We have moved from one clear position to another, which can be described as watering down or shifting the goalposts”.—[Official Report, 25/3/24; col. GC 150.]
That is very much my feeling about the clause as well.
I refer back to the impact assessment, which we discussed at some point during our discussions about Clause 9. It is very interesting that, in table 15 of the impact assessment, the savings on compliance costs are something like £7.3 million as regards AI and machine learning, which does not seem a very big number compared with the total savings on compliance costs, which the Government have put rather optimistically at £295 million.
In passing, I should say that, when I look at the savings regarding subject access requests, I see that the figure is £153 million, which is half of those so-called savings on compliance costs. I cannot square that at all with what the Minister says about the total savings on compliance costs for subject access requests being 1%. I do not know quite where those figures come from, but it is a far more significant percentage: it is 50% of what the Government believe the savings on compliance costs will be. I know that it is not part of this group, but I would be very grateful if the Minister could write to clarify that issue in due course.
Although the Minister has called these adequate, we believe that they are inadequate for three reasons. First, they shift the burden to the individual. Secondly, there is no obligation to provide any safeguards before the decision is made. Neither the Bill nor any of the material associated with it indicates what the content of this information is expected to be, nor the timescales in which that information is to be given. There is nothing to say when representations or contest may be heard, when human intervention may be sought or the level of that intervention. Thirdly, the Secretary of State has delegated powers to vary the safeguards by regulations.
Article 22 is currently one of the strongest prohibitions in the GDPR. As we know, the current starting point is that using solely automated decision-making is prohibited unless certain exemptions apply. The exemptions are limited. Now, as a result of the Government’s changes, you can use solely automated decision-making in an employment context in the UK, which you cannot do in the EU. That is a clear watering down of the restriction. The Minister keeps returning to the safeguards, but I have referred to those. We know that they are not being applied in practice even now and that hiring and firing is taking place without any kind of human review.
There is therefore an entirely inadequate basis on which we can be satisfied that the Bill will safeguard individuals from harmful automated decision-making before it is too late. In fact, the effect of the Bill will be to do the opposite: to permit unfair and unsafe ADM to occur, including discriminatory profiling ADM, which causes harm to individuals. It then places the burden on the individual to complain, without providing for any adequate safeguards to guarantee their ability to do so before the harm is already incurred. While I beg to move Amendment 53, our preference would be that Clause 14 is deleted from the Bill entirely.
My Lords, I will speak to Amendment 57 in my name, Amendment 59 in the name of the noble Baroness, Lady Jones, and the Clause 14 stand part notice from the noble Lord, Lord Clement-Jones. In doing so, I register my support for Amendment 59A in the name of the noble Lord, Lord Holmes.
The Government assert that there is no diminution of rights in the Bill, yet Clause 14 removes the right not to be subject to an automated decision and replaces that right with inadequate safeguards, as the noble Lord, Lord Clement-Jones, said. On the previous day in Committee, the Minister made the argument that:
“These reforms clarify and simplify the rules related to solely automated decision-making without watering down any of the protections for data subjects or the fundamental data protection principles”,—[Official Report, 25/3/24; col. GC 146.]
but I hope he will at least accept that safeguards do not constitute a right. The fact that the Secretary of State has delegated powers to change the safeguards at will undermines his argument that UK citizens have lost nothing at all; they have lost the right not to be subject to an automated decision.
The fact that the Government have left some guard-rails for special category data is in itself an indication that they know they are downgrading UK data rights, because the safeguards in place are not adequate. If they were adequate, it would be unnecessary to separate out SPC data in this way. I hammer the point home by asking the Minister to explain how the protections will work in practice in an era of AI when risks can come from inference and data analytics that do not use special category data but will still have a profound impact on the work lives, health, finances and opportunities of data subjects. If it is the case that data about your neighbourhood, shopping habits, search results, steps or entertainment choices is used to infer an important decision, how would a data subject activate their rights in that case?
As an illustration of this point, the daughter of a colleague of mine, who, as it happens, has deep expertise in data law, this year undertook a video-based interview for a Russell Group university with no human contact. It was not yet an ADM system, but we are inching ever closer to it. Removing the right, as the Government propose, would place the onus on students to complain or intervene—in a non-vexatious manner, of course. Will the Minister set out how UK citizens will be protected from life-changing decisions after government changes to Article 22, particularly as, in conjunction with other changes such as subject access requests and data impact assessments, UK citizens are about to have fewer routes to justice and less transparency of what is happening to their data?
I would also be grateful if the Minister could speak to whether he believes that the granularity and precision of current profiling deployed by AI and machine learning is sufficiently guaranteed to take this fundamental right away. Similarly, I hope that the known concerns about bias and fairness in ADM will be resolved over time, but we are not there yet, so why is it that the Government have a wait-and-see policy on regulation but are not offering the same “wait and see” in relation to data rights?
On Amendment 59 in the name of the noble Baroness, Lady Jones, the number of workers anticipated to be impacted by AI is simply eye-watering. In last Friday’s debate on AI, it was said to be 300 million worldwide, and one in four across Europe. But how workers work with AI is not simply a scary vision of the near future; it is here now.
I have a family member who last year left an otherwise well-paid and socially useful job when they introduced surveillance on to his computer during his working from home. At the time, he said that the way in which it impacted on both his self-esteem and autonomy was so devastating that he felt like
“a cog in a machine or an Amazon worker with no agency or creativity”.
He was an exemplary employee: top of the bonus list and in all measurable ways the right person in the right job. Efficiency in work has a vital role but it is not the whole picture. We know that, if able and skilled workers lose their will to work, it comes at a considerable cost to the well-being of the nation and the public purse. Most jobs in future will involve working with or even collaborating with technology; ensuring that work is dignified and fair to the human components of this arrangement is not a drag on productivity but a necessity if society is to benefit from changes to technology.
Certainly. Being prescriptive and applying one-size-fits-all measures for all processes covered by the Bill encourages organisations to follow a process, but focusing on outcomes encourages organisations to take better ownership of the outcomes and pursue the optimal privacy and safety mechanisms for those organisations. That is a message that came out very strongly in the Data: A New Direction consultation. Indeed, in the debate on a later group we will discuss the use of senior responsible individuals rather than data protection officers, which is a good example of removing prescriptiveness to enhance adherence to the overall framework and enhance safety.
This seems like a very good moment to ask whether, if the variation is based on outcome and necessity, the Minister agrees that the higher bar of safety for children should be specifically required as an outcome.
I absolutely agree about the outcome of higher safety for children. We will come to debate whether the mechanism for determining or specifying that outcome is writing that down specifically, as suggested.
I am sure the Minister knew I was going to stand up to say that, if it is not part of the regulatory instruction, it will not be part of the outcome. The point of regulation is to determine a floor—never a ceiling—below which people cannot go. Therefore, if we wish to safeguard children, we must have that floor as part of the regulatory instruction.
Indeed. That may well be the case, but how that regulatory instruction is expressed can be done in multiple ways. Let me continue; otherwise, I will run out of time.
Let me make the broad point that there is no single list of outcomes for the whole Bill but, as we go through clause by clause, I hope the philosophy behind it, of being less prescriptive about process and more prescriptive about the results of the process that we desire, should emerge—not just on Clause 14 but as the overall philosophy underlying the Bill. Regulation-making powers can also be used to vary the existing safeguards, add additional safeguards and remove additional safeguards added at a later date.
On the point about having regard, it is important that the law is drafted in a way that allows it to adapt as technology advances. Including prescriptive requirements in the legislation reduces this flexibility and undermines the purpose of this clause and these powers to provide additional legal clarity when it is deemed necessary and appropriate in the light of the fast-moving advances in and adoption of technologies relevant to automated decision-making. I would like to reassure noble Lords that the powers can be used only to vary the existing safeguards, add additional safeguards and remove additional safeguards added later by regulation. They cannot remove any of the safeguards written into the legislation.
Amendments 53 to 55 and 69 to 71 concern the Secretary of State powers relating to the terms “significant decisions” and “meaningful human involvement”. These powers enable the Secretary of State to provide a description of decisions that do or do not have a significant effect on data subjects, and describe cases that can be taken to have, or not to have, meaningful human involvement. As technology adoption grows and new technologies emerge, these powers will enable the Government to provide legal clarity, if and when deemed necessary, to ensure that people are protected and have access to safeguards when they matter most. In respect of Amendment 59A, Clause 50 already provides for an overarching requirement for the Secretary of State to consult the ICO and other persons the Secretary of State considers appropriate before making regulations under the UK GDPR, including for the measures within Article 22.
Also, as has been observed—I take the point about the limitations of this, but I would like to make the point anyway—any changes to the regulations are subject to the affirmative procedure and so must be approved by both Houses. As with other provisions of the Bill, the ICO will seek to provide organisations with timely guidance and support to assist them in interpreting and applying the legislation. As such, I would ask the noble Lord, Lord Clement-Jones, and my noble friend Lord Holmes—were he here—not to press their amendments.
Amendment 57 in the name of the noble Baroness, Lady Kidron, seeks to ensure that, when exercising regulation-making powers in relation to the safeguards in Article 22 of the UK GDPR, the Secretary of State should uphold the level of protection that children are entitled to in the Data Protection Act 2018. As I have said before, Clause 50 requires the Secretary of State to consult the ICO and other persons he or she considers appropriate. The digital landscape and its technologies evolve rapidly, presenting new challenges in safeguarding children. Regular consultations with the ICO and stakeholders ensure that regulations remain relevant and responsive to emerging risks associated with solely automated decision-making. The ICO has a robust position on the protection of children, as evidenced through its guidance and, in particular, the age-appropriate design code. As such, I ask the noble Baroness not to press her amendment.
Amendments 58, 72 and 73 seek to prevent the Secretary of State varying any of the safeguards mentioned in the reformed clauses. As I assured noble Lords earlier, the powers in this provision can be used only to vary the existing safeguards, add additional safeguards and remove additional safeguards added by regulation in future; there is not a power to remove any of the safeguards.
I apologise for breaking the Minister’s flow, especially as he had moved on a little, but I have a number of questions. Given the time, perhaps he can write to me to answer them specifically. They are all designed to show the difference between what children now have and what they will have under the Bill.
I have to put on the record that I do not accept what the Minister just said—that, without instruction, the ICO can use its old instruction to uphold the current safety for children—if the Government are taking the instruction out of the Bill and leaving it with the old regulator. I ask the Minister to tell the Committee whether it is envisaged that the ICO will have to rewrite the age-appropriate design code to marry it with the new Bill, rather than it being the reason why it is upheld. I do not think the Government can have it both ways where, on the one hand, the ICO is the keeper of the children, and, on the other, they take out things that allow the ICO to be the keeper of the children in this Bill.
I absolutely recognise the seriousness and importance of the points made by the noble Baroness. Of course, I would be happy to write to her and meet her, as I would be for any Member in the Committee, to give—I hope—more satisfactory answers on these important points.
As an initial clarification before I write, it is perhaps worth my saying that the ICO has a responsibility to keep its guidance up to date but, because it is an independent regulator, it is not for the Government to prescribe this; it is for the Government only to allow the ICO the flexibility to do so. As I say, I will write and set out that important point in more detail.
Amendment 59 relates to workplace rights. I reiterate that the existing data protection legislation and our proposed reforms—
My Lords, I speak to Amendment 144 in my name, which is supported by the noble Baronesses, Lady Harding and Lady Jones, and the noble Lord, Lord Clement-Jones. The amendment would introduce a code of practice on children and AI. Before I speak to it, I declare an interest: I am working with academic NGO colleagues in the UK, EU and US on such a code, and I am part of the UN Secretary-General’s AI advisory body’s expert group, which is currently working on sections on both AI and children and AI and education.
AI drives the recommender systems that determine all aspects of a child’s digital experience, including the videos they watch, their learning opportunities, people they follow and products they buy. But it no longer concerns simply the elective parts of life where, arguably, a child—or a parent on their behalf—can choose to avoid certain products and services. AI is invisibly and ubiquitously present in all areas of their lives, and its advances and impact are particularly evident in the education and health sectors—the first of which is compulsory and the second of which is necessary.
The proposed code has three parts. The first requires the ICO to create the code and sets out expectations of its scope. The second considers who and what should be consulted and considered, including experts, children and the frameworks that codify children’s existing rights. The third defines elements of the process, including risk assessment, defines language and puts the principles to which the code must adhere in the Bill.
I am going to get my defence in early. I anticipate that the Minister will say that the ICO has published guidance, that we do not want to exclude children from the benefits of AI and that we are in a time of “wait and see”. He might even ask why children need something different or why the AADC, which I mention so frequently, is not sufficient. Let me take each of those in turn.
On the sufficiency of the current guidance, the ICO's non-binding Guidance on AI and Data Protection, which was last updated on 15 March 2023, has a single mention of a child in its 140 pages, in a case study about child benefits. The accompanying AI and data protection toolkit makes no mention of children, nor does the ICO's advice to developers on generative AI, issued on 3 April 2023. There are hundreds of pages of guidance, but they fail entirely to consider the specific needs of children, their rights, their developmental vulnerabilities or the fact that their lives will be entirely dominated by AI systems in a way that is still unimaginable to those in this Room. Similarly, there is little mention of children in the Government's own White Paper on AI. The only such references are limited to AI-generated child sexual abuse material; we will come to that later when we discuss Amendment 291. Even the AI summit had no main-stage event relating to children.
Of course we do not want to exclude children from the benefits of AI. A code on the use of children’s data in the development and deployment of AI technology increases their prospects of enjoying the benefits of AI while ensuring that they are protected from the pitfalls. Last week’s debate in the name of the noble Lord, Lord Holmes, showed the broad welcome of the benefits while urgently speaking to the need for certain principles and fundamental protections to be mandatory.
As for saying, “We are in a time of ‘wait and see’”, that is not good enough. In the course of this Committee, we will explore edtech that has only advertising and no learning content, children being left out of classrooms because their parents will not accept the data leaks of Google Classroom, social media being scraped to create AI-generated CSAM and how rapid advances in generative AI capabilities mark a new stage in its evolution. Some of the consequences of that include ready access to models that create illegal and abusive material at scale and chatbots that offer illegal or dangerous advice. Long before we get on to the existential threat, we have “here and now” issues. Childhood is a very short period of life. The impacts of AI are here and now in our homes, our classrooms, our universities and our hospitals. We cannot afford to wait and see.
Children are different for three reasons. First, as has been established over decades, there are ages and stages at which children are developmentally able to do certain things, such as walk, talk, understand risk and irony, and learn different social skills. This means that, equally, there are ages and stages at which they cannot do that. The long-established consensus is that family, social groups and society more broadly—including government—step in to support that journey.
Secondly, children have less voice and less choice about how and where they spend their time, so the places and spaces that they inhabit have to be fit for childhood.
Thirdly, we have a responsibility towards children that extends even beyond our responsibilities to each other; this means that it is not okay for us to legitimise profit at their expense, whether it is allowing an unregulated edtech market that exploits their data and teaches them nothing or the untrammelled use of their pictures to create child sexual abuse material.
Finally, what about the AADC? I hope that, in the course of our deliberations, we will put that on a more secure footing. The AADC addresses recommender systems in standard 12. However, the code published in August 2020 does not address generative AI, which, as we have repeatedly heard, is a game-changer. Moreover, the AADC is currently restricted to information society services, which leaves a gaping hole. This amendment would address this gap.
There is an argument that the proposed code could be combined with the AADC as an update to its provisions. However, unless and until we sort out the status of the AADC in relation to the Bill, an AI kids code would be better formed as a stand-alone code. A UK code of practice on children and AI would ensure that data processors consider the fundamental rights and freedoms of children, including their safety, as they develop their products and perhaps even give innovators the appetite to innovate with children in mind.
As I pointed out at the beginning, there are many people globally working on this agenda. I hope that as we are the birthplace of the AADC and the Online Safety Act, the Government will adopt this suggestion and again be a forerunner in child privacy and safety. If, however, the Minister once again says that protections for children are not necessary, let me assure him that they will be put in place by others, and we will be a rule taker not a rule maker.
My Lords, I rise with the advantage over the noble Lord, Lord Clement-Jones, in that I will speak to only one amendment in this group; I therefore have the right page in front of me and can note that I will speak to Amendment 252, tabled by the noble Lord, Lord Clement-Jones, and signed by me and the noble Lords, Lord Watson of Wyre Forest and Lord Maude of Horsham.
I apologise that I was not with the Committee earlier today, but I was chairing a meeting about the microbiome, which was curiously related to this Committee. One issue that came up in that meeting was data and data management and the great uncertainties that remain. For example, if a part of your microbiome is sampled and the data is put into a database, who owns that data about your microbiome? In fact, there is no legal framework at the moment to cover this. There is a legal framework about your genome, but not your microbiome. That is a useful illustration of how fast this whole area is moving and how fast technology, science and society are changing. I will actually say that I do not blame the Government for this gaping hole, as it is an international one. It is a demonstration of how we need to race to catch up as legislators and regulators to deal with the problem.
This relates to Amendment 252 in the sense that perhaps this is an issue that has arisen over time, kind of accidentally. However, I want to credit a number of campaigners, among them James O'Malley, who was the man who drew my attention to this issue, as well as Peter Wells, Anna Powell-Smith and Hadley Beeman. They are people who have seen a really simple and basic problem in the way that regulation is working and are reaching out, including, I am sure, to many noble Lords in this Committee. This is a great demonstration of how campaigning has at least gone part of the way to working. I very much hope that, if not today, then some time soon, we can see this working.
What we are talking about here, as the noble Lord, Lord Clement-Jones, said, is the postal address file. It is held as a piece of private property by Royal Mail. It is important to stress that this is not people’s private information or who lives at what address; it is about where the address is. As the noble Lord, Lord Clement-Jones, set out, all kinds of companies have to pay Royal Mail to have access to this basic information about society, basic information that is assembled by society, for society.
The noble Lord mentioned Amazon having to pay for the file. I must admit that I feel absolutely no sympathy there. I am no fan of the great parasite. It is an interesting contrast to think of Amazon paying, but also to think of an innovative new start-up company, which wants to be able to access and reach people to deliver things to their homes. For this company, the cost of acquiring this file could be prohibitive. It could stop it getting started and competing against Amazon.
I believe that the AADC already has statutory standing.
On that point, I think that the Minister said—forgive me if I am misquoting him —risk, rules and rights, or some list to that effect. While the intention of what he said was that we have to be careful where children are using it, and the ICO has to make them aware of the risks, the purpose of a code—whether it is part of the AADC or stand-alone—is to put those responsibilities on the designers of service products and so on by default. It is upstream where we need the action, not downstream, where the children are.
Yes, I entirely agree with that, but I add that we need it upstream and downstream.
For the reasons I have set out, the Government do not believe that it would be appropriate to add these provisions to the Bill at this time without further detailed consultation with the ICO and the other organisations involved in regulating AI in the United Kingdom. Clause 33—
Can we agree that there will be some discussions with the ICO between now and Report? If those take place, I will not bring this point back on Report unnecessarily.
Yes, I am happy to commit to that. As I said, we look forward to talking with the noble Baroness and others who take an interest in this important area.
Clause 33 already includes a measure that would allow the Secretary of State to request the ICO to publish a code on any matter that she sees fit, so this is an issue that we could return to in the future, if the evidence supports it, but, as I said, we consider the amendments unnecessary at this time.
Finally, Amendment 252 would place a legislative obligation on the Secretary of State regularly to publish address data maintained by local authorities under open terms—that is, accessible by anyone for any purpose and for free. High-quality, authoritative address data for the UK is currently used by more than 50,000 public and private sector organisations, which demonstrates that current licensing arrangements are not prohibitive. This data is already accessible for a reasonable fee from local authorities and Royal Mail, with prices starting at 1.68p per address or £95 for national coverage.
My Lords, I will speak to a number of amendments in this group—Amendments 79, 83, 85, 86, 96, 97, 105 and 107.
Amendment 79 proposes an addition to the amendments to Article 28 of the UK GDPR in Clause 15(4). Article 28 sets out the obligations on processors when processing personal data on behalf of controllers. Currently, paragraph 3(c) requires processors to comply with Article 32 of the UK GDPR, which relates to data security. Amendment 79 adds the requirement for processors also to comply with the privacy-by-design provision in Article 25. Article 25 requires controllers to
“at the time of the processing itself, implement appropriate technical and organisational measures, such as pseudonymisation, which are designed to implement data-protection principles, such as data minimisation, in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of this Regulation and protect the rights of data subjects”.
I am not proposing an abdication of responsibility by the controller when it instructs a processor to act on its behalf but, in practice, it is hard for a controller to meet this responsibility at the time of processing if it has delegated the processing to a third party that is not bound by the same requirement. I am not normally associated with the edtech sector, but the amendment is of particular importance to it, since schools are the controllers while it is children's data that is being processed.
The amendment ensures that processors would be contractually committed to complying with Article 25. It is particularly relevant to situations where controllers procure AI systems, including facial recognition technology and edtech products. It would be helpful in both the public and private sectors and would address the power asymmetry between controller and processor when the processor is a multinational and solutions are often presented on a take-it-or-leave-it basis.
I hope noble Lords will forgive me if I take Amendment 97 out of turn, as all the others in my name relate to children’s data, whereas Amendment 97, like Amendment 79, applies to all data subjects. Amendment 97 would require public bodies to publish risk assessments to create transparency and accountability. This would also place in statute a provision that is already contained in the ICO’s freedom of information publication scheme guidance. The amendment would also require the Cabinet Office to create and maintain an accessible register of public sector risk assessments to improve accountability.
In the last group, we heard that the way in which public bodies collect and process personal data has far-reaching consequences for all of us. I was moved to lay this amendment after witnessing some egregious examples from the education system. The public have a right to know how bodies such as health authorities, schools, universities, police forces, local authorities and government departments comply with their obligations under UK data law. This amendment is simply about creating trust.
The child-related amendments in this group are in my name and those of the noble Lord, Lord Clement-Jones, and the noble Baronesses, Lady Harding and Lady Jones. Clause 17 sets out the obligations for the newly created role of “senior responsible individual”, which replaces the GDPR requirement to appoint a data protection officer. The two roles are not equivalent: a DPO is an independent adviser to senior management, while a senior responsible individual would be a member of senior management. Amendment 83 would ensure that those appointed senior responsible individuals have an understanding of the heightened risks and the protections to which children are entitled.
Over the years, I have had many conversations with senior executives at major tech companies and, beyond the lines prepared by their public affairs teams, their understanding of children’s protection is often superficial and their grasp of key issues very limited. In fact, if I had a dollar for every time a tech leader, government affairs person or engineer has said, “I never thought of it that way before”, I would be sitting on quite a fortune.
Amendment 83 would simply ensure that a senior leader who is tasked with overseeing compliance with UK data law knows what he or she is talking about when it comes to children’s privacy, and that it informs the decisions they make. It is a modest proposal, and I hope the Minister will find a way to accept it.
Amendments 85 and 86 would require a controller to consider children's right to higher standards of privacy than adults for their personal data when carrying out its record-keeping duties. Specifically, Amendment 85 sets out what is appropriate when maintaining records of high-risk processing and Amendment 86 relates to processing that is non-high risk. Creating an express requirement to include consideration of these rights in a data controller's processing record-keeping obligation is a simple but effective way of ensuring that systems and processes are designed with the needs and rights of children front of mind.
Clause 20 is one of the many fault lines where the gap between the assurance given that children will be just as safe and the words on the page becomes clear. I make clear that the amendments to Clause 18 that I put forward are, as the noble Lord, Lord Clement-Jones, said on Monday, belt and braces. They do not reach the standard of protection that children currently enjoy under the risk-assessment provisions in Article 35 of the UK GDPR and the age-appropriate design code.
A comparison of what controllers must include in a data protection impact assessment under Article 35(7) and what they would need to cover in an assessment of high-risk processing under Clause 20(3)(d) shows the inadequacies of the latter. Instead of a controller having to include
“a systematic description of the envisaged processing operations and the purposes of the processing, including, where applicable, the legitimate interest pursued by the controller”,
under the Bill, the controller needs to include only
“a summary of the purposes of the processing”.
They need to include no systematic description—just a summary. There is no obligation to include information about the processing operations or to explain when and how the controller has determined they are entitled to rely on the legitimate interest purpose. Instead of
“an assessment of the necessity and proportionality of the processing operations in relation to the purposes”,
under the Bill, a controller needs to assess only necessity, not proportionality. Instead of
“an assessment of the risks to the rights and freedoms of data subjects”,
under the Bill, a controller does not need to consider rights and freedoms.
As an aside, I note that this conflicts with the proposed amendments to Section 64 of the Data Protection Act 2018 in Clause 20(7)(d), which retains the “rights and freedoms” wording but otherwise mirrors the new downgraded requirements in Clause 20(3)(d). I would be grateful for clarification from the Minister on this point.
Instead of requiring the controller to include information about
“the measures envisaged to address the risks, including safeguards, security measures and mechanisms to ensure the protection of personal data and to demonstrate compliance with this Regulation taking into account the rights and legitimate interests of data subjects and other persons concerned”,
as currently prescribed in Article 35, under the Bill, the controller needs to provide only
“a description of how the controller proposes to mitigate those risks”.
The granularity of what is currently required is replaced by a generalised reference to “a description”. These are not the same bar. My argument throughout Committee is that we need to maintain the bar for processing children’s data.
My Lords, just for clarification, because a number of questions were raised, if the Committee feels that it would like to hear more from the Minister, it can. It is for the mood of the Committee to decide.
I apologise for going over. I will try to be as quick as possible.
I turn now to the amendments on the new provisions on assessments of high-risk processing in Clause 20. Amendments 87, 88, 89, 91, 92, 93, 94, 95, 97, 98 and 101 seek to reinstate requirements from Article 35 of the UK GDPR on data protection impact assessments and, in some areas, make them even more onerous for public authorities. Amendment 90 seeks to reintroduce a list of high-risk processing activities drawn from Article 35, with a view to helping data controllers comply with the new requirements on carrying out assessments of high-risk processing.
Amendment 96, tabled by the noble Baroness, Lady Kidron, seeks to amend Clause 20, so that, where an internet service is likely to be accessed by children, the processing is automatically classed as high risk and the controller must do a children’s data protection impact assessment. Of course, I fully understand why the noble Baroness would like those measures to apply automatically to organisations processing children’s data, and particularly to internet services likely to be accessed by children. It is highly likely that many of the internet services that she is most concerned about will be undertaking high-risk activities, and they would therefore need to undertake a risk assessment.
Under the current provisions in Clause 20, organisations will still have to undertake risk assessments where their processing activities are likely to pose high risks to individuals, but they should have the ability to assess the level of risk based on the specific nature, scale and context of their own processing activities. Data controllers do not need to be directed by government or Parliament about every processing activity that will likely require a risk assessment, but the amendments would reintroduce a level of prescriptiveness that we were seeking to remove.
Clause 20 requires the ICO to publish a list of examples of the types of processing activities that it considers would pose high risks for the purposes of these provisions, which will help controllers to determine whether a risk assessment is needed. This will provide organisations with more contemporary and practical help than a fixed list of examples in primary legislation could. The ICO will be required to publish a document with a list of examples that it considers to be high-risk processing activities, and we fully expect the vulnerability and age of data subjects to be a feature of that. The commissioner's current guidance on data protection impact assessments already describes the use of the personal data of children or other vulnerable individuals for marketing purposes, profiling or offering internet services directly to children as examples of high-risk processing, although the Government cannot of course tell the ICO what to include in its new guidance.
Similarly, in relation to Amendments 99, 100 and 102 from the noble Baroness, Lady Jones, it should not be necessary for this clause to specifically require organisations to consider risks associated with automated decision-making or obligations under equalities legislation. That is because the existing clause already requires controllers to consider any risks to individuals and to describe
“how the controller proposes to mitigate those risks”.
I am being asked to wrap up and so, in the interests of time, I shall write with my remaining comments. I have no doubt that noble Lords are sick of the sound of my voice by now.
My Lords, I hope that no noble Lord expects me to pull all that together. However, I will mention a couple of things.
With this group, the Minister finally has said all the reasons why everything will be different and less. Those responsible for writing the Minister's speeches should be more transparent about the Government's intention, because "organisations are best placed to determine what is high-risk"—not the ICO, not Parliament, not existing data law. Organisations also act in their own interests. They are "best placed to decide on their representation", whether it is here or there and whether it speaks English or not, and they "get to decide whether they have a DPO or a senior responsible individual". Those are three quotes from the Minister's speech. If organisations are in charge of the bar of data protection and the definition of data protection, I do believe that this is a weakening of the data protection regime. He also said that organisations are responsible for the quality of their risk assessment. Those are four places in this group alone.
At the beginning, the noble Baroness, Lady Harding, talked about the trust of consumers and citizens. I do not think that this engenders trust. The architecture is so keen to get rid of ways of accessing rights that some organisations may have to have a DPO and a DPIA—a doubling rather than a reduction of the burden. Very early on—it feels a long time ago—a number of noble Lords talked about the granular detail. I tried in my own contribution to show how very different it is in detail. So I ask the Minister to reflect on the assertion that you can take out the detail and have the same outcome. All the burden being removed is on one side of the equation, just as we enter into a world in which AI, which is built on people's data, is coming in the other direction.
I will of course withdraw my amendment, but I believe that Clauses 20, 18 and the other clauses we just discussed are deregulation measures. That should be made clear from the Dispatch Box, and that is a choice that the House will have to make.
Before I sit down, I do want to recognise one thing, which is that the Minister said that he would work alongside us between now and Report; I thank him for that, and I accept that. I also noted that he said that it was a responsibility to take care of children by default. I agree with him; I would like to see that in the Bill. I beg leave to withdraw my amendment.
My Lords, I am somewhat disappointed to be talking to these amendments in the dying hours of our Committee before we take a break because many noble Lords—indeed, many people outside the House—have contacted me about them. I particularly want to record the regret of the noble Lord, Lord Black, who is a signatory to these amendments, that he is unable to be with us today.
The battle between rights-holders and the tech sector is nothing new. Many noble Lords will remember the arrival and demise of the file-sharing platform Napster and the subsequent settlement between the sector and the giant creative industries. Napster argued that it was merely providing a platform for users to share files and was not responsible for the actions of its users; the courts sided with the music industry, and Napster was ordered to shut down its operations in 2001. The "mere conduit" argument was debunked two decades ago. To the frustration of many of us, the lawsuits led to the perverse outcome that violent bullying or sexually explicit content would be left up for days, weeks or forever, while a birthday video with the temerity to have music in the background would be deleted almost immediately.
The emergence of the large language models—LLMs—and the desire on the part of LLM developers to scrape the open web to capture as much text, data and images as possible raise some of the same issues. The scale of scraping is, by their own admission, unprecedented, and their hunger for data at any cost in an arms race for AI dominance is publicly acknowledged, setting up a tension between the companies that want the data and data subjects and creative rights holders. A data controller who publishes personal data as part of a news story, for example, may do so on the basis of an exemption under data protection law for journalism, only for that data to be scraped and commingled with other data scraped from the open web to train an LLM.
This raises issues of copyright infringement. More importantly—whether for individuals, creative communities or businesses that depend on the value of what they produce—these scraping activities happen invisibly. Anonymous bots acting on behalf of AI developers, or conducting a scrape as a potential supplier to AI developers, are scraping websites without notifying data controllers or data subjects. In doing so, they are also silent on whether processes are in place to minimise risks or balance competing interests, as required by current data law.
Amendment 103 would address those risks by requiring documentation and transparency. Proposed new paragraph (e) would require an AI developer to document how the data controller will enforce purpose limitation. This is essential, given that invisible data processing enabled through web scraping can pick up material that is published for a legitimate purpose, such as journalism, but the combination of such information with other data accessed through invisible data processing could change the purpose and application of that data in ways to which the individual may wish to object using their existing data rights. Proposed new paragraph (f) would require a data processor seeking to use legitimate interest as the basis for web scraping and invisible processing to build LLMs to document evidence of how they have ensured that individual information rights were enabled at the point of collection and after processing.
Together, those proposed new paragraphs would mean that anyone who scrapes web data must be able to show that data subjects have meaningful control and can access their information rights ahead of processing. These requirements would be mandatory unless the scraper has incorporated an easily accessible machine-readable protocol operating on an opt-in basis, which is the subject of Amendment 104.
Amendment 104 would require web scrapers to establish an easily accessible machine-readable protocol that works on an opt-in basis rather than the current opt-out. Undoubtedly, the words “easily accessible”, “machine readable” and “web protocol” would all benefit from guidance from the ICO but, for the avoidance of doubt, the intention of the amendment is that a web scraper would proactively notify individuals and website owners that scraping of their data will take place, including stating the identity of the data processor and the purpose for which that data is to be scraped. In addition, the data processor would provide information on how data subjects and data controllers can exercise their information rights to opt out of their data being scraped before any such scraping takes place, with an option to object after the event if data is taken without permission.
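By way of illustration only—this is a hypothetical sketch of how such an opt-in protocol might operate, not a design prescribed by the amendment or by the ICO; the file location, field names and identifiers are all invented for the example—a scraper could be required to look for an affirmative, machine-readable grant of permission, naming itself and its stated purpose, before collecting anything, and to treat the absence of such a grant as a refusal:

# A purely hypothetical sketch, not part of the amendment or any ICO guidance.
# It assumes an invented, machine-readable opt-in file published by a website
# at "/.well-known/scraping-consent.json"; every name here is illustrative.
import json
from urllib.request import urlopen
from urllib.error import URLError

CONSENT_PATH = "/.well-known/scraping-consent.json"  # hypothetical location

def may_scrape(site: str, scraper_identity: str, purpose: str) -> bool:
    # Opt-in, not opt-out: absence of a declaration, or any error reading it,
    # is treated as "no consent given".
    try:
        with urlopen(f"https://{site}{CONSENT_PATH}", timeout=10) as resp:
            policy = json.load(resp)
    except (URLError, ValueError):
        return False
    return any(
        grant.get("scraper") == scraper_identity
        and purpose in grant.get("purposes", [])
        for grant in policy.get("grants", [])
    )

# The scraper must identify itself and state its purpose before collecting.
if may_scrape("example.org", "ExampleAICrawler/1.0", "llm-training"):
    print("Opt-in found; scraping may proceed for the declared purpose")
else:
    print("No opt-in found; do not scrape")

The design point the amendment presses is captured in the default: permission must be found affirmatively, with identity and purpose declared, before any collection takes place.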
We are in a situation in which not only is IP being taken at scale, potentially impoverishing our very valuable creative industries, journalism and academic work that is then regurgitated inaccurately, but which is making a mockery of individual data rights. In its recent consultation into the lawful basis for web scraping, the ICO determined that use of web-scraped data
“can be feasible if generative AI developers take their legal obligations seriously and can evidence and demonstrate this in practice”.
These amendments would operationalise that demonstration. As it stands, there is routine failure, particularly regarding new models. For example, the ICO’s preliminary enforcement notice against Snap found that the risk assessment for its AI tool was inadequate.
Noble Lords will appreciate the significance of the connection that the ICO draws between innovative technology and children’s personal data, given the heightened data rights and protections that children are afforded under the age-appropriate design code. While I welcome the ICO’s action, holders of intellectual property have been left to fend for themselves, since government talks have failed, and individual data subjects are left exposed. Whether it is the scraping of social media or of work and school websites, such cases will not be pursued by the ICO, because regulatory action in such small increments is disproportionate, yet this lack of compliance is happening at scale.
My Lords, I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Jones, for their support. I will read the Minister’s speech, because this is a somewhat technical matter. I am not entirely sure that I agree with what he said, but I am also not sure that I could disagree with it adequately in the moment.
I will make two general points, however. First, I hear the Minister loud and clear on the question of the Government’s announcement on AI and IP but, at the beginning of my speech, I referenced Napster and how we ended up where we are with personal data. The big guys won the battle over copyright, so we will see the likes of the New York Times, EMI and so on winning this battle, but small creatives and individuals will not be protected. I hope that, when that announcement comes, it covers the personal data issue as well.
Secondly, I say to the Minister that, if the regime is working now in the way he outlined from the ICO’s perspective, I do not think anybody thinks it is working very well. Either the ICO needs to do something, or we need to do something in this Bill. If not, we are letting all our data be taken for free, without permission, to build the new world.
I know that the noble Viscount is interested in this area. It is one in which we could be creative. I suggest that we try to solve the conundrum about whether the ICO is not doing its work or we are not doing ours. I beg leave to withdraw my amendment.
(10 months, 1 week ago)
Grand Committee
My Lords, I rise to speak to my Amendment 11, to Amendments 14, 16, 17 and 18, and to whether Clauses 5 and 7 should stand part of the Bill. I will attempt to be as brief as I can, but Clause 5 involves rather a large number of issues.
Processing personal data is currently lawful only if it is carried out on at least one lawful basis, one of which is that the processing is necessary for legitimate interests pursued by the controller or a third party, except where those interests are overridden by the interests or fundamental rights of the data subject. As such, if a data controller relies on their legitimate interest as a legal basis for processing data, they must conduct a balancing test between their interests and those of the data subject.
Clause 5 amends the UK GDPR’s legitimate interest provisions by introducing the concept of recognised legitimate interest, which allows data to be processed without a legitimate interest balancing test. This provides businesses and other organisations with a broader scope of justification for data processing. Clause 5 would amend Article 6 of the UK GDPR to equip the Secretary of State with a power to determine these new recognised legitimate interests. Under the proposed amendment, the Secretary of State must have regard to,
“among other things … the interests and fundamental rights and freedoms of data subjects”.
The usual legitimate interest test is much stronger: rather than merely a topic to have regard to, a legitimate interest basis cannot lawfully apply if the data subject’s interests override those of the data controller.
Annexe 1, as inserted by the Bill, now provides a list of exemptions but is overly broad and vague. It includes national security, public security and defence, and emergencies and crime as legitimate interests for data processing without an assessment. The Conservative MP Marcus Fysh said on Third Reading:
“Before companies share data or use data, they should have to think about what the balance is between a legitimate interest and the data rights, privacy rights and all the other rights that people may have in relation to their data. We do not want to give them a loophole or a way out of having to think about that.” —[Official Report, Commons, 29/11/23; col. 896.]
I entirely agree with that.
The amendment in Clause 5 also provides examples of processing that may be considered legitimate interests under the existing legitimate interest purpose, under Article 6(1)(f), rather than under the new recognised legitimate interest purpose. These include direct marketing, intra-group transmission of personal data for internal administrative purposes, and processing necessary to ensure the security of a network.
The Bill also creates a much more litigious data environment. Currently, an organisation’s assessment of its lawful basis for processing data can be challenged through correspondence or an ICO complaint, whereas, under the proposed system, an individual may be forced to challenge a statutory instrument in the courts in order to contest the basis on which their data is processed.
As I will explain later, our preference is that the clause not stand part, but I accept that there are some areas that need clarification and Amendment 11 is designed to do this. The UK GDPR sets out conditions in which processing of data is lawful. The Bill inserts in Article 6(1) a provision specifying that processing shall be lawful for the purposes of a recognised legitimate interest, as I referred to earlier, an example of which may be for the purposes of direct marketing.
Many companies obtain data from the open electoral register. The register is maintained by local authorities, which have the right to sell this data to businesses. Amendment 11 would insert new paragraphs (aa) and (ab) into Article 6(1), which provide that data processing shall be lawful where individuals have consented for their data
“to enter the public domain via a public body”,
or where processing is carried out by public bodies pursuant to their duties and rights, which may include making such data available to the public. Individuals are free to opt out of the open electoral register if they so wish, and it would be disproportionate—indeed, irritating to consumers—to notify those who have already consented to their data being processed every time that processing takes place.
On Amendment 14, as mentioned, the Bill would give the Secretary of State the power to determine recognised legitimate interests through secondary legislation, which is subject to minimal levels of parliamentary scrutiny. Although the affirmative procedure is required, this does not entail much scrutiny or much of a debate. The last time MPs did not approve a statutory instrument under the affirmative procedure was in 1978. In practice, interests could be added to this list at any time and for any reason, facilitating the flow and use of personal data for limitless potential purposes. Businesses could be obligated to share the public’s personal data with government or law enforcement agencies beyond what they are currently required to do, all based on the Secretary of State’s inclination at the time.
We are concerned that this Henry VIII power is unjustified and undermines the very purpose of data protection legislation, which is to protect the privacy of individuals in a democratic data environment, as it vests undue power over personal data rights in the Executive. This amendment is designed to prevent the Secretary of State from having the ability to pre-authorise data processing outside the usual legally defined route. It is important to avoid a two-tier data protection framework in which the Secretary of State can decide that certain processing is effectively above the law.
On Amendment 17, some of the most common settings in which data protection law is broken involve the sharing of the HIV status of a person living with HIV—in their personal life, in employment, in healthcare services and by the police. The sharing of an individual’s HIV status can lead to further discrimination being experienced by people living with HIV and can increase their risk of harassment or even violence. The National AIDS Trust is concerned that the Bill as drafted does not go far enough to prevent individuals’ HIV status from being shared with others without their consent. They and we believe that the Bill must clarify what an “administrative purpose” is for organisations processing employees’ personal data. Amendment 17 would add wording to clarify that, in paragraph 9(b) of Article 6,
“intra-group transmission of personal data”
in the workplace, within an organisation or in a group of organisations should be permitted only for individuals who need to access an employee’s personal data as part of their work.
As far as Amendment 18 is concerned, as it stands, Clause 5 gives an advantage to large undertakings with numerous companies that can transmit data intra-group purely because they are affiliated to one central body. However, this contradicts the repeated position of both the ICO and the CMA that first party versus third party is not a meaningful distinction for assessing privacy risk: what matters is what data is processed, not the corporate ownership of the systems doing the processing. The amendment reflects the organisational measures that undertakings should have in place as safeguards: groups of undertakings transmitting data should be required, by contract, to have those organisational measures in place before they can take advantage of this transmission of data.
Then we come to the question of Clause 5 standing part of the Bill. This clause is unnecessary and creates risks. It is unnecessary because the legitimate interest balancing test is, in fact, flexible and practical; it already allows processing for emergencies, safeguarding and so on. It is risky because creating lists of specified legitimate interests inevitably narrows the concept and may make controllers less certain about whether a legitimate interest that is not a recognised legitimate interest can be characterised as such. In the age of AI, where change is exponential, we need principles-based and outcome-based legislation that is flexible and can be supplemented with guidance from an independent regulator, rather than a system that requires the Government to legislate more and faster in order to catch up.
There is also a risk that the drafting of this provision does not dispense with the need to conduct a legitimate interest balancing test, because all the recognised legitimate interests contain a test of necessity. Established case law interprets the concept of necessity under data protection law as requiring a human rights balancing test to be carried out. This rather points to the smoke-and-mirrors effect of this drafting, which does nothing to improve legal certainty for organisations or protections for individuals.
I now come to Clause 7 standing part. This clause creates a presumption that processing will always be in the public interest or substantial public interest if done in reliance on a condition listed in proposed new Schedule A1 to the Data Protection Act 2018. The schedule will list international treaties that have been ratified by the UK. At present, the Bill lists only the UK-US data-sharing agreement as constituting relevant international law. Clause 7 seeks to remove the requirement for a controller to consider whether the legal basis on which they rely is in the public interest or substantial public interest, has appropriate safeguards and respects data subjects’ fundamental rights and freedoms. But the conditions in proposed new Schedule A1 in respect of the UK-US agreement also state that the processing must be necessary, as assessed by the controller, to respond to a request made under the agreement.
It is likely that a court would interpret “necessity” in the light of the ECHR. The court may therefore consider that the inclusion of a necessity test means that a controller would have to consider whether the UK-US agreement, or any other treaty added to the schedule, is proportionate to a legitimate aim pursued. Not only is it unreasonable to expect a controller to do such an assessment; it is also highly unusual. International treaties are drafted on a state-to-state basis and not in a way that necessarily corresponds clearly with domestic law. Further, domestic courts would normally consider the rights under the domestic law implementing a treaty, rather than having to interpret an international instrument without reference to a domestic implementing scheme. Being required to do so may make it more difficult for courts to enforce data subjects’ rights.
The Government have not really explained why it is necessary to amend the law in this way rather than simply implementing the UK-US agreement domestically. That would be the normal approach; it would remove the need to add this new legal basis and enable controllers to use the existing framework to identify a legal basis to process data in domestic law. Instead, this amendment makes it more difficult to understand how the law operates, which could in turn deter data sharing in important situations. Perhaps the Minister could explain why Clause 7 is there.
I beg to move.
My Lords, I rise to speak to Amendments 13 and 15. Before I do, let me say that I strongly support the comments of the noble Lord, Lord Clement-Jones, about HIV and the related vulnerability, and his assertion—almost—that Clause 5 is a solution in search of a problem. “Legitimate interest” is a flexible concept and I am somewhat bewildered as to why the Government are seeking to create change where none is needed. In this context, it follows that, were the noble Lord successful in his argument that Clause 5 should not stand part, Amendments 13 and 15 would be unnecessary.
On the first day in Committee, we debated a smaller group of amendments that sought to establish the principle that nothing in the Bill should lessen the privacy protections of children. In his response, the Minister said:
“if over the course of our deliberations the Committee identifies areas of the Bill where that is not the case, we will absolutely be open to listening on that, but let me state this clearly: the intent is to at least maintain, if not enhance, the safety and privacy of children and their data”.—[Official Report, 20/3/24; col. GC 75.]
I am glad the Minister is open to listening and that the Government’s intention is to protect children but, as discussed previously, widening the definition of “research” in Clause 3 and watering down purpose limitation protections in Clause 6 negatively impact children’s data rights. Again, in Clause 5, lowering the protections for all data subjects has consequences for children.
Indeed. Needless to say, we take the recommendations of the DPRRC very seriously, as they deserve. However, because this is an exhaustive list, and because the technologies and practices around data are likely to evolve very rapidly in ways we are unable currently to predict, it is important to retain as a safety measure the ability to update that list. That is the position the Government are coming from. We will obviously continue to consider the DPRRC’s recommendations, but that has to come with a certain amount of adaptiveness as we go. Any addition to the list would of course be subject to parliamentary debate, via the affirmative resolution procedure, as well as the safeguards listed in the provision itself.
Clause 50 ensures that the ICO and any other interested persons must be consulted before regulations are made.
Amendments 15, 16, 17 and 18 would amend the part of Clause 5 that is concerned with the types of activities that might be carried out under the current legitimate interest lawful ground under Article 6(1)(f). Amendment 15 would prevent direct marketing organisations relying on the legitimate interest lawful ground under Article 6(1)(f) if the personal data being processed related to children. However, the age and general vulnerability of data subjects are already important factors for direct marketing organisations when considering whether the processing is justified. The ICO already provides specific guidance for controllers carrying out this balancing test in relation to children’s data. The fact that a data subject is a child, and the age of the child in question, will still be relevant factors to take into account in this process. For these reasons, the Government consider this amendment unnecessary.
My Lords, am I to take it from that that none of the changes currently in the Bill will expose children on a routine basis to direct marketing?
As is the case today and will be going forward, direct marketing organisations will be required to perform the balancing test; and as in the ICO guidance today and, no doubt, going forward—
I am sorry if I am a little confused—I may well be—but the balancing test that is no longer going to be there allows a certain level of processing, which was the subject of the first amendment. The suggestion now is that children will be protected by a balancing test. I would love to know where that balancing test exists.
The balancing test remains there for legitimate interests, under Article 6(1)(f).
Amendment 16 seeks to prevent organisations that undertake third-party marketing relying on the legitimate interest lawful ground under Article 6(1)(f) of the UK GDPR. As I have set out, organisations can rely on that ground for processing personal data without consent when they are satisfied that they have a legitimate interest to do so and that their commercial interests are not outweighed by the rights and interests of data subjects.
Clause 5(4) inserts in Article 6 new paragraph (9), which provides some illustrative examples of activities that may constitute legitimate interests, including direct marketing activities, but it does not mean that they will necessarily be able to process personal data for that purpose. Organisations will need to assess on a case-by-case basis where the balance of interest lies. If the impact on the individual’s privacy is too great, they will not be able to rely on the legitimate interest lawful ground. I should emphasise that this is not a new concept created by this Bill. Indeed, the provisions inserted by Clause 5(4) are drawn directly from the recitals to the UK GDPR, as incorporated from the EU GDPR.
I recognise that direct marketing can be a sensitive—indeed, disagreeable—issue for some, but direct marketing information can be very important for businesses as well as individuals and can be dealt with in a way that respects people’s privacy. The provisions in this Bill do not change the fact that direct marketing activities must be compliant with the data protection and privacy legislation and continue to respect the data subject’s absolute right to opt out of receiving direct marketing communications.
Amendment 17 would make sure that the processing of employee data for “internal administrative purposes” is subject to heightened safeguards, particularly when it relates to health. I understand that this amendment relates to representations made by the National AIDS Trust concerning the level of protection afforded to employees’ health data. We agree that the protection of people’s HIV status is vital and that it is right that it is subject to extra protection, as is the case for all health data and special category data. We have committed to further engagement and to working with the National AIDS Trust to explore solutions in order to prevent data breaches of people’s HIV status, which we feel is best achieved through non-legislative means given the continued high data protection standards afforded by our existing legislation. As such, I hope that the noble Lord, Lord Clement-Jones, will agree not to press this amendment.
Amendment 18 seeks to allow businesses more confidently to rely on the existing legitimate interest lawful ground for the transmission of personal data within a group of businesses affiliated by contract for internal administrative purposes. In Clause 5, the list of activities in proposed new paragraphs (9) and (10) is intended to be illustrative of the types of activities that may be legitimate interests for the purposes of Article 6(1)(f). It is focused on processing activities that are currently listed in the recitals to the EU GDPR, but these are simply examples. Many other processing activities may be legitimate interests for the purposes of Article 6(1)(f) of the UK GDPR. It is possible that the transmission of personal data for internal administrative purposes within a group affiliated by contract may constitute a legitimate interest, as may many other commercial activities. It would be for the controller to determine this on a case-by-case basis after carrying out a balancing test to assess the impact on the individual.
Finally, I turn to the clause stand part debate that seeks to remove Clause 7 from the Bill. I am grateful to the noble Lord, Lord Clement-Jones, for this amendment because it allows me to explain why this clause is important to the success of the UK-US data access agreement. As noble Lords will know, that agreement helps the law enforcement agencies in both countries tackle crime. Under the UK GDPR, data controllers can process personal data without consent on public interest grounds if the basis for the processing is set out in domestic law. Clause 7 makes it clear that the processing of personal data can also be carried out on public interest grounds if the basis for the processing is set out in a relevant international treaty such as the UK-US data access agreement.
The agreement permits telecommunications operators in the UK to share data about serious crimes with law enforcement agencies in the US, and vice versa. The DAA has been operational since October 2022, and disclosures made by UK organisations under it are already lawful under the UK GDPR. Recent ICO guidance confirms this, but the Government want to remove any doubt in the minds of UK data controllers that disclosures under the DAA are permitted by the UK GDPR. Clause 7 makes it absolutely clear to telecoms operators in the UK that disclosures under the DAA can be made in reliance on the UK GDPR’s public task processing ground; the clause therefore contributes to the continued, effective functioning of the agreement and to keeping the public in both the UK and the US safe.
For these reasons, I hope that the noble Lord, Lord Clement-Jones, will agree to withdraw his amendment.
My Lords, this whole area of democratic engagement is one that the Minister will need to explain in some detail. This is an Alice in Wonderland schedule: “These words mean what I want them to mean”. If, for instance, you are engaging with the children of a voter—at 14, they are children—is that democratic engagement? You could drive a coach and horses through Schedule 1. The Minister used the word “necessary”, but he must give us rather more than that. It was not very reassuring.
The Minister mentioned a presumption that the ICO will update its guidance. Is there a timeframe for that? Will the guidance be updated before this comes into effect? How does the age of 14 relate to the AADC, which sets the age of adulthood at 18?
Before the Minister replies, we may as well do the full round. I agree with him, in that I very much believe in votes at 16 and possibly younger. I have been on many a climate demonstration with young people of 14 and under, so they can be involved, but the issue here is bigger than age. The main issue is not age but whether anybody should be subjected to a potential barrage of material in which they have not in any way expressed an interest. I am keen to make sure that this debate is not diverted to the age question and that we do not lose the bigger issue. I wanted to say that I sort of agree with the Minister on one element.
A fair number of points were made there. I will look at ages under 16 and see what further steps, in addition to being necessary and proportionate, we can think about to provide some reassurance. Guidance would need to be in effect before any of this is acted on by any of the political parties. I and my fellow Ministers will continue to work with the ICO—
I am sorry to press the Minister, but does the Bill state that guidance will be in place before this comes into effect?
I am not sure whether it is written in the Bill. I will check, but the Bill would not function without the existence of the guidance.
Indeed. I will make absolutely sure that we provide a full answer. By the way, I sincerely thank the noble Lord for taking the time to go through what is perhaps not the most rewarding of reads but is useful none the less.
On the question of the ICO being responsible to Parliament, in the then Online Safety Bill and the digital markets Bill we consistently asked for regulators to be directly responsible to Parliament. If that is something the Government believe they are, we would like to see an expression of it.
I would be happy to provide such an expression. I will be astonished if that is not the subject of a later group of amendments. I have not yet prepared for that group, I am afraid, but yes, that is the intention.
My Lords, it is a pleasure to follow the noble Lord, Lord Bassam, who has already set out very clearly what the group is about. I will chiefly confine myself to speaking to my Amendment 38A, which seeks to put in the Bill a clear idea of what having a human in the loop actually means. We need a human in the loop to ensure that a human has interpreted and assessed the decision and, perhaps most crucially, is able to intervene in it and in any information on which it is based.
Noble Lords will be aware of many situations that have already arisen in which artificial intelligence is used—although I would say that what we currently describe as artificial intelligence is, in real terms, not truly that at all. What we have is very large-scale use of big data and, as the noble Lord, Lord Bassam, said, big data can be a very useful and powerful tool for many positive purposes. However, we know that the quality of decision-making often depends on the quality of the data going in. A human is able to see whether something looks astray or wrong; there is a kind of intelligence that humans apply to this which machines simply do not have the capacity for.
I pay tribute to Justice, the law reform and human rights organisation, which produced an excellent briefing on the issues around Clause 14. It asserts that the clause, as currently written, inadequately protects individuals from automated harm.
The noble Lord, Lord Bassam, referred to the Horizon case in the UK; that is the obvious example but, while we may think of some of the most vulnerable people in the UK, the Robodebt case in Australia is another case where crunching big data, and then crunching down on individuals, had truly awful outcomes. We know that there is a real risk of unfairness and discrimination in the use of these kinds of tools. I note that the UK has signed the Bletchley declaration, which says that
“AI should be designed, developed, deployed, and used, in a manner that is … human-centric, trustworthy and responsible”.
I focus particularly on “human-centric”: human beings can sympathise with and understand other human beings in a way that big data simply does not.
I draw a parallel with something covered by a special Select Committee of your Lordships’ House last year: lethal autonomous weapon systems, or so-called killer robots. This is an obvious example of where there is a very strong argument for having a human in the loop, as the terminology goes. From what I last heard about this, I am afraid that the UK Government are not fully committed to a human in the loop in the case of killer robots, but I hope that we get to that point.
When we talk about how humans’ data is used and managed, we are also talking about situations that are—almost equally—life and death: whether people get a benefit, whether they are fairly treated and whether they do not suddenly disappear off the system. Only this morning, I was reading a case study of a woman aged over 80, highlighting how she had been through multiple government departments, but could not get her national insurance number. Without a national insurance number, she could not get the pension to which she was entitled. If there is no human in the loop to cut through those kinds of situations, there is a real risk that people will find themselves just going around and around machines—a circumstance with which we are personally all too familiar, I am sure. My amendment is an attempt to put a real explanation in the Bill for having that human in the loop.
My Lords, the number of amendments proposed to Clause 14 reflects the Committee’s very real concern about the impact of automated decision-making on the privacy, safety and prospects of UK data subjects. I have specific amendments in groups 7 and 8, so I will speak to the impact of Clause 14 on children later. I will again be making arguments about the vulnerability of these systems in relation to the Government’s proposals on the DWP.
Without repeating the arguments made, I associate myself with most of the proposals and the intention behind them—the need to safeguard the prospects of a fair outcome when algorithms hold sway over a person’s future. It seems entirely logical that, if the definition of solely automated decision-making requires “no meaningful human involvement”, we should be clear, as Amendment 40 proposes, about what is considered “meaningful”, so that the system cannot be gamed by inserting human involvement that is an ineffective safeguard and therefore not meaningful.
I have sympathy with many of these amendments—Amendments 38A, 39, 47, 62, 64 and 109—and ultimately believe, as was suggested by the noble Lord, Lord Bassam, that it is a matter of trust. I refer briefly to the parliamentary briefing from the BMA, which boldly says:
“Clause 14 risks eroding trust in AI”.
That would be a very sad outcome.
My Lords, we have heard some powerful concerns on this group already. This clause is in one of the most significant parts of the Bill for the future. The Government’s AI policy is of long standing. They started it many years ago, then had a National AI Strategy in 2021, followed by a road map, a White Paper and a consultation response to the White Paper. Yet this part of the Bill, which is overtly about artificial intelligence and automated decision-making, does not seem to be woven into their thinking at all.
My Lords, the amendments in this group highlight that Clause 14 lacks the necessary checks and balances to uphold equality legislation, individual rights and freedoms, data protection rights, access to services, fairness in the exercise of public functions and workers’ rights. I add my voice to that of the noble Lord, Lord Clement-Jones, in his attempt to make Clause 14 not stand part, which he will speak to in the next group.
I note, as the noble Lord, Lord Bassam, has, that all the current frameworks have fundamental rights at their heart, whether it is the White House blueprint, the UN Secretary-General’s advisory body on AI, with which I am currently involved, or the EU’s AI Act. I am concerned that the UK does not want to work within this consensus.
With that in mind, I particularly note the importance of Amendment 41. As the noble Lord said, we are all supposed to adhere to the Equality Act 2010. I support Amendments 48 and 49, which are virtually interchangeable in seeking to ensure that the designation of decisions as “solely” based on automated decision-making cannot be gamed by adding a trivial human element to avoid it.
Again, I suggest that the Government cannot have it both ways—with nothing diminished but everything liberated and changed—so I find myself in agreement with Amendment 52A and Amendment 59A, which is in the next group, from the noble Lord, Lord Holmes, who is not in his place. These seek clarity from the Information Commissioner.
I turn to my Amendment 46. My sole concern is to minimise the impact of Clause 14 on children’s safety, privacy and life chances. The amendment provides that a significant decision about a data subject must not be based solely on automated processing if
“the data subject is a child or may be a child unless the provider is satisfied that the decision is in, and compatible with, the best interests of a child”,
taking into account the full gamut of their rights and development stage. Children have enhanced rights under the UNCRC, to which the UK is a signatory. Due to their evolving capacities as they make the journey from infancy to adulthood, they need special protections. If their rights are diminished in the digital world, their rights are diminished full stop. Algorithms determine almost every aspect of a child’s digital experience, from the videos they watch to their social network and from the sums they are asked to do in their maths homework to the team they are assigned when gaming. We have seen young boys wrongly profiled as criminal and girls wrongly associated with gangs.
In a later group, I will speak to a proposal for a code of practice on children and AI, which would codify standards and expectations for the use of AI in all aspects of children’s lives, but for now, I hope the Minister will see that, without these amendments to automated decision-making, children’s data protection will be clearly weakened. I hope he will agree to act to make true his earlier assertion that nothing in the Bill will undermine child protection. The Minister is the Minister for AI. He knows the impact this will have. I understand that, right now, he will probably stick to the brief, but I ask him to go away, consider this from the perspective of children and parents, and ask, “Is it okay for children’s life chances to be automated in this fashion?”
My Lords, I will speak to my Amendment 48. By some quirk of fate, I failed to sign up to the amendments that the noble Lord, Lord Bassam, so cogently introduced. I would have signed up if I had realised that I had not, so to speak.
It is a pleasure to follow the noble Baroness, Lady Kidron. She has a track record of being extremely persuasive, so I hope the Minister pays heed in what happens between Committee and Report. I very much hope that there will be some room for manoeuvre and that there is not just permanent push-back, with the Minister saying that everything is about clarifying and us saying that everything is about dilution. There comes a point when we have to find some accommodation on some of these areas.
Amendments 48 and 49 are very similar—I was going to say, “Great minds think alike”, but I am not sure that my brain feels like much of a great mind at the moment. “Partly” or “predominantly” rather than “solely”, if you look at it the other way round, is really the crux of what I think many of us are concerned about. It is easy to avoid the terms of Article 22 just by slipping in some sort of token human involvement. Defining “meaningful” is so difficult in these circumstances. I am concerned that we are opening the door to something that could be avoided. Even then, the terms of the new clause—we will have a clause stand part debate on Wednesday, obviously—put all the onus on the data subject, whereas that was not the case previously under Article 22. The Minister has not really explained why that change has been made.
I conclude by saying that I very much support Amendment 41. This whole suite of amendments is well drafted. The point about the Equality Act is extremely well made. The noble Lord, Lord Holmes, also has a very good amendment here. It seems to me that involving the ICO right in the middle of this will be absolutely crucial—and we are back to public trust again. If nothing else, I would like explicitly to include that under Clause 14 in relation to Article 22 by the time this Bill goes through.
Can the Minister give me an indication of the level at which that kicks in? For example, say there is a child in a classroom and a decision has been made about their ability in a particular subject. Is it automatic that the parent and the child get some sort of read-out on that? I would be curious to know where the Government feel that possibility starts.
In that example, where a child was subject to a solely ADM decision, the school would be required to inform the child of the decision and the reasons behind it. The child and their parent would have the right to seek a human review of the decision.
We may come on to this when we get to edtech, but a lot of those decisions are happening automatically right now, without any kind of review. I am curious as to why the obligation falls on the school when the party actually doing the processing may well be a technology company.