Data Protection and Digital Information Bill Debate
Baroness Bennett of Manor Castle (Green Party - Life peer)
Department for Science, Innovation & Technology
(11 months, 2 weeks ago)
Lords Chamber
My Lords, it is a pleasure to follow the noble Lord, Lord Sikka. I very much share his concerns about the Government prying into the bank accounts of benefit recipients and pensioners. This is a historic moment, for all the wrong reasons, with the Government looking to pry into the private lives of millions of people, with no evidence that it is in any way necessary. The biggest problem with benefits, of course, is the large amount of money that is left unclaimed or unpaid, due to errors made by the Department for Work and Pensions.
I will also pick up the noble Lord’s point about economic crime. I note that this happens to be the week that, in a Frankfurt court, the former global head of tax at Freshfields Bruckhaus Deringer acknowledged in his testimony that he had
“glossed over the fact that my legal advice was used for illegal means”.
This was a man who, until 2019, was earning €1.9 million a year.
I have a direct question for the Minister. The Government have talked a great deal about the DWP and their plans in that area. What does the Bill do to tackle economic crime, given that the head of UK Finance described the UK as
“the fraud capital of the world”
and that we have an enormous problem with enablers, down the road in the City of London, who we know are getting around sanctions from the UK Government and others, swishing so much dirty money through London that it is now known as the “London Laundromat”? What does the Bill do on these issues?
I will tick off some points of agreement and concern from previous speeches. The Minister spoke of
“the highest standards of data protection”.
From what I recollect of the Minister’s speech, there was a surprising lack of the Government’s favourite word, “world-leading”. What does it mean if these data protections are not world-leading?
The Minister also said the Bill was “codesigned all the way”. A number of noble Lords pointed to the 260 amendments on Report at the other place. That really does not look like a codesigning process. The benefit of working across many Bills is that this Bill reminds me—and not in a good way—of the Procurement Bill, where your Lordships’ House saw a similar deluge of government amendments and had to try to disentangle the mess. I fear that we are in the same position with this Bill.
I pick up the speech of the noble Baroness, Lady Kidron—spectacularly excellent, as always—and her points about edtech and the situation with technology and education systems, and the utter impossibility of teachers, nursery nurses or people in similar positions dredging through the fine detail of every app they might want to use to ensure that their charges are protected. That is obviously not a viable situation. There have to be strong, protective general standards, particularly for apps aimed at children. The Government have to be able to guarantee that those nursery nurses and teachers can just pick up something—“It’s approved, it’s okay”—and use it.
I will also pick up the points that the noble Baroness, Lady Kidron, made about the importance of data being available to be used for the public good. She referred to research, but I would like—and I invite NGOs that are interested—to think about community uses. I was recently with the National Association of Local Councils, of which I declare that I am a vice-president, in Shropshire, where we saw parish and town councils doing amazing work to institute climate action. I am talking about small villages where data protection is not really an issue, as everyone knows everything about everybody. But we might think of a suburb of Liverpool or a market town, where people do not have the same personal knowledge of each other but where a council or community group could access data for good reasons. How can we make it possible to use these tools for positive purposes?
Briefly picking up on the points made by the noble Lord, Lord Allan—another of our experts—I echo his stress on the importance of EU equivalency. We have dumped our small businesses, in particular, in the economic mire again and again through the whole process of Brexit. There is a reason why #brexitreality trends regularly. We have also dumped many of our citizens and residents in that situation. We really must not do it again in the technology field.
I have a couple of what I believe to be original points. I want to address specifically Clauses 28 and 30, and I acknowledge here a briefing from Rights and Security International. It notes that these clauses enable the Government to grant an opt-out to police forces from having to comply with many of the data protection requirements when they are working with the intelligence services. For example, they could grant police immunity for handling personal data unlawfully and reduce people’s right of access to their personal data held by the authorities.
In the Commons, the Minister said these provisions would be “helpful” and “efficient”. I put it to your Lordships’ House that, to have any justification for interfering with rights such as these, the Government should at the very least claim that the measures are “proportionate” and “necessary”. That is an area that I suspect my noble friend Lady Jones of Moulsecoomb will pick up in Committee. There are also issues raised by the Ada Lovelace Institute and by other noble Lords about the oversight of biometric technologies, including live facial recognition systems, emotion detection and the foundation models that underlie apps such as ChatGPT. These already limited legal safeguards are being further undermined by the Bill, at a point when there is general acknowledgement in the community that we should be heading in the opposite direction. I think we all acknowledge that this is a fast-moving area, but the Government are already very clearly behind.
There are two more areas that I particularly want to pick up. One is elections. There has only just started to be a focus on this. The Bill would allow the Government to tear up long-standing campaign rules with new exemptions. At present, we have safeguards against direct marketing. These are being removed and,
“for the purposes of democratic engagement”,
anyone from 14 years and above can be targeted. I feel like warning the Government: my experience with young people is that the more they see of the Government, the less they like them, so they might want to think about what messages they send them. Seriously, I note that the Information Commissioner’s Office said during the public consultation on the Bill—and we can really hear the bureaucratic speak here—
“This is an area in which there are significant potential risks to people if any future policy is not implemented very carefully”.
The discussion of the Bill has reflected how this could put us in a situation where our elections are even more like those in the United States of America, which, given the place of big money in their politics, is of course no recommendation at all. I note that we really need to link this with the Government’s recent decision to massively increase election spending limits. Put those two things together and I suggest that is a real threat to what limited democracy we already have left in this country.
There is a further area which I am not going to go into in great detail, given the hour and the day, but which I will probably come back to in Committee. There is an extensive briefing, which I am sure many have seen, from Understanding Patient Data. A really important point is that the Bill comes up with a different definition of identifiable data. In the health sector, it is very common to use pseudonymous information from which key bits are removed, but it is still quite possible to go backwards and identify an individual from their data because they have an extremely rare disease and they live in this area of the country, or something like that.
This new Bill has, instead, more of a subjective test; the definition seems to rely on the judgment of the data controller and what they know. If the Minister has not looked at the briefing from Understanding Patient Data, I really urge him to, because there are concerns here and there is already very grave concern in our community about the use of medical data, the possible loss of anonymity and the reuse of data for commercial research. We are, again, coming to an Americanisation of our health system.
I conclude by saying that we have an enormous amount of work to do here in your Lordships’ House; I am trying not to let my head sink quietly on to the Bench in front of me, but we are going to have a break first, of course. I say to all noble Lords and—echoing the comments earlier—the many members of staff who support us by working so hard and often so late: thank you very much and Merry Christmas all.
Data Protection and Digital Information Bill Debate
Baroness Bennett of Manor Castle (Green Party - Life peer)
Department for Science, Innovation & Technology
(8 months, 1 week ago)
Grand Committee
My Lords, Amendment 19 is consequential on my more substantive Clauses 114 and 115 stand part notices, which are also in this group. I am grateful to the noble Lord, Lord Clement-Jones, for his support.
These amendments all relate to the 150 or so pages of late amendments tabled in the Commons on Report and therefore not given adequate scrutiny before now. No real explanation has been given for why the Government felt it necessary to table the amendments in this way, and this group of amendments comes under the heading of so-called “democratic engagement”. Clause 113 extends a soft opt-in for direct mail marketing for furthering charitable or political objectives, while Clause 114 goes further and allows the Secretary of State to change the direct marketing rules through secondary legislation for the purpose of democratic engagement. This would allow the Government, in the run-up to an election, to switch off the direct mailing rules that apply to political parties.
Like many others, we are highly suspicious of the Government’s motives in introducing these amendments in the run-up to this election. Although we do not have a problem with a softer opt-in for direct mailing for charities, the application of Clause 114 to political parties gives politicians carte blanche to mine voters’ data given in good faith for completely different purposes. It would allow voters to be bombarded with calls, texts and personalised social media without their explicit consent.
When you consider these proposals in the context of other recent moves by the Government to make it harder for some people to vote and to vastly increase the amount of money that can be spent on campaigning in the run-up to an election, you have to wonder what the Government are up to, because these measures have certainly not been requested by Labour. In fact, these measures were not supported by the majority of respondents to the Government’s initial consultation, who wanted the existing rules upheld.
The Advertising Association has told us that it is concerned that switching off the rules could result in an increase in poor practice, such as political lobbying under the guise of research. This is apparently a practice known as “plugging”. It referred us to a report from the previous Information Commissioner on how political parties manage data protection, which provided key recommendations for how political parties could improve. These included providing clearer information about how data will be used and being more transparent about how voters are profiled and targeted via social media platforms. This is the direction our democratic engagement should be going in, with stronger and more honest rules that treat the electorate with respect, not watering down the rules that already exist.
When these proposals were challenged in the Commons on Report, the Minister, John Whittingdale, said:
“We have no immediate plans to use the regulation powers”.—[Official Report, Commons, 29/11/23; col. 912.]
If that is the case, why do the Government not take the proposals off the table, go back to the drawing board by conducting a proper consultation and test whether there is any appetite for these changes? They should also involve the Information Commissioner at an early stage, as he has already gone on record to say that this is
“an area in which there are significant potential risks to people if any future policy is not implemented very carefully”.
Finally, if there are to be any changes, they should be subject to full parliamentary scrutiny and approval.
We believe that Clauses 114 and 115 are taking us in fundamentally the wrong direction, against the interests of the electorate. I look forward to the Minister’s response, but I give notice now that, unless the Government adopt a very different strategy on this issue, we will return to this on Report. I beg to move.
My Lords, I follow the noble Baroness, Lady Jones of Whitchurch, with pleasure, as I agree with everything that she just said. I apologise for having failed to notice this in time to attach my name; I certainly would have done, if I had had the chance.
As the noble Baroness said, we are in an area of great concern for the level of democracy that we already have in our country. Downgrading it further is the last thing that we should be looking at doing. Last week, I was in the Chamber looking at the statutory instrument that saw a massive increase in the spending limits for the London mayoral and assembly elections and other mayoral elections—six weeks before they are held. This is a chance to spend an enormous amount of money; in reality, it is the chance for one party that has the money from donations from interesting and dubious sources, such as the £10 million, to bombard voters in deeply dubious and concerning ways.
We see a great deal of concern about issues such as deepfakes, what might happen in the next general election, malicious actors and foreign actors potentially interfering in our elections. We have to make sure, however, that the main actors conduct elections fairly on the ground. As the noble Baroness, Lady Jones, just set out, this potentially drives a coach and horses through that. As she said, these clauses did not get proper scrutiny in the Commons—as much as that ever happens. As I understand it, there is the potential for us to remove them entirely later, but I should like to ask the Minister some direct questions, to understand what the Government’s intentions are and how they understand the meaning of the clauses.
Perhaps no one would have any problems with these clauses if they were for campaigns to encourage people to register to vote, given that we do not have automatic voter registration, as so many other countries do. Would that be covered by these clauses? If someone were conducting a “get out the vote” campaign in a non-partisan way, simply saying, “Please go out and vote. The election is on this day. You will need to bring along your voter ID”, would it be covered by these clauses? What about an NGO campaigning to stop a proposed new nuclear power station, or a group campaigning for stronger regulations on pesticides or for the Government to take stronger action against ultra-processed food? How do those kinds of politics fit with Clauses 114 and 115? As they are currently written, I am not sure that it is clear what is covered.
There is cause for deep concern, because no justification has been made for these two clauses. I look forward to hearing the Minister’s responses.
My Lords, this weekend, as I was preparing for the amendments to which I have put my name, I made the huge mistake of looking at the other amendments being discussed. As a result, I had a look at this group. I probably should declare an interest as the wife of a Conservative MP; therefore, our household is directly affected by this amendment and these clause stand part notices. I wholeheartedly agree with everything said by the noble Baronesses, Lady Jones and Lady Bennett of Manor Castle.
I have two additional points to make, because I am horrified by these clauses. First, did I miss something, in that we are now defining an adult as being 14-plus? At what point did that happen? I thought that you had the right to vote at 18, so I do not understand why electoral direct marketing should be free to bombard our 14 year-olds. That was my first additional point.
Secondly, I come back to what I said on the first day of Committee: this is all about trust. I really worry that Clauses 114 and 115 risk undermining two important areas where trust really matters. The first is our electoral system and the second is the data that we give our elected representatives, when we go to them not as party representatives but as our representatives elected to help us.
The Minister mentioned a presumption that the ICO will update its guidance. Is there a timeframe for that? Will the guidance be updated before this comes into effect? How does the age of 14 relate to the AADC, which sets the age of adulthood at 18?
Before the Minister replies, we may as well do the full round. I agree with him, in that I very much believe in votes at 16 and possibly younger. I have been on many a climate demonstration with young people of 14 and under, so they can be involved, but the issue here is bigger than age. The main issue is not age but whether anybody should be subjected to a potential barrage of material in which they have not in any way expressed an interest. I am keen to make sure that this debate is not diverted to the age question and that we do not lose the bigger issue. I wanted to say that I sort of agree with the Minister on one element.
I agree with the noble Baroness, but with one rider. We will keep coming back to the need for children to have a higher level of data protection than adults, and this is but one of many examples we will debate. However, I agree with her underlying point. The reason why I support removing both these clauses is the hubris of believing that you will engage the electorate by bombarding them with things they did not ask to receive.
My Lords, this is the first group of amendments covering issues relating to automated decision-making, one of the most interesting areas of data use but also one of the most contested and, for the public at large, one of the most controversial and difficult to navigate. The development of AI and data systems that easily enable automatable decisions could offer huge efficiencies for consumers of public services. Equally, the use of such systems can, if used and regulated in the wrong way, have a devastating impact on people’s lives. If we have learned one thing from the Horizon scandal it is simply that, in the wrong hands and with the wrong system in place, the misuse of data can destroy lives and livelihoods.
Our country has a massive social security system, which includes everything from pension payments to disability income support and, of course, the universal credit system, which covers people entitled to in-work and out-of-work benefits. Over 22 million people receive DWP benefits of one sort or another. If automated decisions make errors in this field the potential to damage lives is enormous, as I am sure the Minister will appreciate.
I turn to the four amendments in the group in the name of my noble friend Lady Jones. Amendments 36 and 37 seek to amend new Article 22A of the UK GDPR and make it clear that protection is provided for profiling operations that lead to decisions. This is important, not least because the clause further reduces the scope for the human review of automated decision-making. Profiling is used as part of this process, and these amendments seek to protect individual data subjects from its effect. We take the view that it is essential that human interaction is involved in making subject access decisions.
Amendment 40 also makes it clear that, in the context of the new Article 22A, for human involvement to be considered meaningful, the review of the decision must be completed by a competent person. One of the positive changes made by the Bill is the introduction of the concept of “meaningful human involvement” in a decision. Meaningful human review is a key component for achieving an appropriate level of oversight over automated decision-making, for protecting individuals from unfair treatment and for offering an avenue for redress. The aim of the amendment is to bring more clarity around what “meaningful human involvement” should consist of. It would require that a review needs to be performed by a person with the necessary competence, training and understanding of the data, and, of course, the authority to alter the decision.
Our Amendment 109 is not so much about building protections as introducing something new and adding to the strength of what is already there. Users have never been able to get personalised explanations of automated decisions but, given the impact that these can have, we feel that systems should be in place for people to understand why a computer has simply said yes or no.
As it stands, the Bill deletes Section 14 of the Data Protection Act 2018 in its entirety. Our amendment would undo that and then add personalisation in. The amendment would retain Section 14 of that Act, which is where most automated decision-making safeguards are currently detailed in law. It would introduce an entitlement for data subjects to receive a personalised explanation of an automated decision made about them. This is based on public attitudes research conducted by the Ada Lovelace Institute, which shows a clear demand for greater transparency over these sorts of decisions.
The amendment also draws on independent legal analysis commissioned by the Ada Lovelace Institute, which found that the generic nature of explanations provided under current law is insufficient for individuals to understand how they have been affected by automated decision-making. This was considered to be a major barrier to meaningful protection from and redress for harms caused by AI. As many noble Lords have made clear in these debates, we have put building trust at the heart of how we get the most from AI and, more particularly, ADM systems.
I turn to the amendments in the name of the noble Lord, Lord Clement-Jones. In essence, they are about—as the noble Lord will, I am sure, explain better than I possibly could—the level of engagement of individuals in decisions about data subject automated decision-making processes. The common thread through the amendments is that they raise the bar in terms of the safeguards for data subjects’ rights and freedoms. We have joined the noble Lord, Lord Clement-Jones, on Amendment 47, and might equally have added our names to the other amendments in the group as we broadly support those too.
Amendment 38A, in the name of the noble Baroness, Lady Bennett, would place an additional requirement under new Article 22A to ensure human engagement in the automated decision-making processes.
I am sure the Committee will want more than warm words from the Minister when he comes to wind up the debate. For all of us, ADM is the here and now; it shapes how we use and consume public services and defines what and who we are. Reducing our protections from its downsides is not to be done lightly and we cannot easily see how that can be justified. I want to hear from the Minister how the Government came to conclude that this was acceptable, not least because, as we will hear in later debates on the Bill, the Government are seeking powers that provide for invasive bulk access to potentially every citizen’s bank accounts. I beg to move the amendments in the name of the noble Baroness, Lady Jones.
My Lords, it is a pleasure to follow the noble Lord, Lord Bassam, who has already set out very clearly what the group is about. I will chiefly confine myself to speaking to my Amendment 38A, which seeks to put in the Bill a clear idea of what having a human in the loop actually means. We need to have a human in the loop to ensure that a human interpreted, assessed and, perhaps most crucially, was able to intervene in the decision and any information on which it is based.
Noble Lords will be aware of many situations that have already arisen in which artificial intelligence is used—although I would say that what we currently describe as artificial intelligence is not, in real terms, truly that at all. What we have is a very large use of big data and, as the noble Lord, Lord Bassam, said, big data can be a very useful and powerful tool to be used for many positive purposes. However, we know that the quality of decision-making often depends on the quality of the data going in. A human is able to see whether something looks astray or wrong; there is a kind of intelligence that humans apply to this, which machines simply do not have the capacity for.
I pay credit to Justice, the law reform and human rights organisation which produced an excellent briefing on the issues around Clause 14. It asserts that, as it is currently written, it inadequately protects individuals from automated harm.
The noble Lord, Lord Bassam, referred to the Horizon case in the UK; that is the obvious example but, while we may think of some of the most vulnerable people in the UK, the Robodebt case in Australia is another case where crunching big data, and then crunching down on individuals, had truly awful outcomes. We know that there is a real risk of unfairness and discrimination in the use of these kinds of tools. I note that the UK has signed the Bletchley declaration, which says that
“AI should be designed, developed, deployed, and used, in a manner that is … human-centric, trustworthy and responsible”.
I focus particularly on “human-centric”: human beings can sympathise with and understand other human beings in a way that big data simply does not.
I draw a parallel with something covered by a special Select Committee of your Lordships’ House last year: lethal autonomous weapon systems, or so-called killer robots. This is an obvious example of where there is a very strong argument for having a human in the loop, as the terminology goes. From what I last understood and heard about this, I am afraid that the UK Government are not fully committed to a human in the loop in the case of killer robots, but I hope that we get to that point.
When we talk about how humans’ data is used and managed, we are also talking about situations that are—almost equally—life and death: whether people get a benefit, whether they are fairly treated and whether they do not suddenly disappear off the system. Only this morning, I was reading a case study of a woman aged over 80, highlighting how she had been through multiple government departments, but could not get her national insurance number. Without a national insurance number, she could not get the pension to which she was entitled. If there is no human in the loop to cut through those kinds of situations, there is a real risk that people will find themselves just going around and around machines—a circumstance with which we are personally all too familiar, I am sure. My amendment is an attempt to put a real explanation in the Bill for having that human in the loop.
My Lords, the number of amendments proposed to Clause 14 reflects the Committee’s very real concern about the impact of automated decision-making on the privacy, safety and prospects of UK data subjects. I have specific amendments in groups 7 and 8, so I will speak to the impact of Clause 14 on children later. I will again be making arguments about the vulnerability of these systems in relation to the Government’s proposals on the DWP.
Without repeating the arguments made, I associate myself with most of the proposals and the intention behind them—the need to safeguard the prospects of a fair outcome when algorithms hold sway over a person’s future. It seems entirely logical that, if the definition of solely automated decision-making requires “no meaningful human involvement”, we should be clear, as Amendment 40 proposes, about what is considered “meaningful”, so that the system cannot be gamed by providing human involvement that is an ineffective safeguard and therefore not meaningful.
I have sympathy with many of these amendments—Amendments 38A, 39, 47, 62, 64 and 109—and ultimately believe, as was suggested by the noble Lord, Lord Bassam, that it is a matter of trust. I refer briefly to the parliamentary briefing from the BMA, which boldly says:
“Clause 14 risks eroding trust in AI”.
That would be a very sad outcome.
Data Protection and Digital Information Bill Debate
Baroness Bennett of Manor Castle (Green Party - Life peer)
Department for Science, Innovation & Technology
(8 months, 1 week ago)
Grand Committee
My Lords, I speak to Amendment 144 in my name, which is supported by the noble Baronesses, Lady Harding and Lady Jones, and the noble Lord, Lord Clement-Jones. The amendment would introduce a code of practice on children and AI. Before I speak to it, I declare an interest: I am working with academic and NGO colleagues in the UK, EU and US on such a code, and I am part of the UN Secretary-General’s AI advisory body’s expert group, which is currently working on sections on both AI and children and AI and education.
AI drives the recommender systems that determine all aspects of a child’s digital experience, including the videos they watch, their learning opportunities, people they follow and products they buy. But it no longer concerns simply the elective parts of life where, arguably, a child—or a parent on their behalf—can choose to avoid certain products and services. AI is invisibly and ubiquitously present in all areas of their lives, and its advances and impact are particularly evident in the education and health sectors—the first of which is compulsory and the second of which is necessary.
The proposed code has three parts. The first requires the ICO to create the code and sets out expectations of its scope. The second considers who and what should be consulted and considered, including experts, children and the frameworks that codify children’s existing rights. The third defines elements of the process, including risk assessment, defines language and puts the principles to which the code must adhere in the Bill.
I am going to get my defence in early. I anticipate that the Minister will say that the ICO has published guidance, that we do not want to exclude children from the benefits of AI and that we are in a time of “wait and see”. He might even ask why children need something different or why the AADC, which I mention so frequently, is not sufficient. Let me take each of those in turn.
On the sufficiency of the current guidance, the ICO’s non-binding Guidance on AI and Data Protection, which was last updated on 15 March 2023, has a single mention of a child in its 140 pages, in a case study about child benefits. The accompanying AI and data protection toolkit makes no mention of children, nor does the ICO’s advice to developers on generative AI, issued on 3 April 2023. There are hundreds of pages of guidance but it fails entirely to consider the specific needs of children, their rights, their developmental vulnerabilities or that their lives will be entirely dominated by AI systems in a way that is still unimaginable to those in this Room. Similarly, there is little mention of children in the Government’s own White Paper on AI. The only such references are limited to AI-generated child sexual abuse material; we will come to that later when we discuss Amendment 291. Even the AI summit had no main-stage event relating to children.
Of course we do not want to exclude children from the benefits of AI. A code on the use of children’s data in the development and deployment of AI technology increases their prospects of enjoying the benefits of AI while ensuring that they are protected from the pitfalls. Last week’s debate in the name of the noble Lord, Lord Holmes, showed the broad welcome of the benefits while urgently speaking to the need for certain principles and fundamental protections to be mandatory.
As for saying, “We are in a time of ‘wait and see’”, that is not good enough. In the course of this Committee, we will explore edtech that has only advertising and no learning content, children being left out of classrooms because their parents will not accept the data leaks of Google Classroom, social media being scraped to create AI-generated CSAM and how rapid advances in generative AI capabilities mark a new stage in its evolution. Some of the consequences of that include ready access to models that create illegal and abusive material at scale and chatbots that offer illegal or dangerous advice. Long before we get on to the existential threat, we have “here and now” issues. Childhood is a very short period of life. The impacts of AI are here and now in our homes, our classrooms, our universities and our hospitals. We cannot afford to wait and see.
Children are different for three reasons. First, as has been established over decades, there are ages and stages at which children are developmentally able to do certain things, such as walk, talk, understand risk and irony, and learn different social skills. This means that, equally, there are ages and stages at which they cannot do that. The long-established consensus is that family, social groups and society more broadly—including government—step in to support that journey.
Secondly, children have less voice and less choice about how and where they spend their time, so the places and spaces that they inhabit have to be fit for childhood.
Thirdly, we have a responsibility towards children that extends even beyond our responsibilities to each other; this means that it is not okay for us to legitimise profit at their expense, whether it is allowing an unregulated edtech market that exploits their data and teaches them nothing or the untrammelled use of their pictures to create child sexual abuse material.
Finally, what about the AADC? I hope that, in the course of our deliberations, we will put that on a more secure footing. The AADC addresses recommender systems in standard 12. However, the code published in August 2020 does not address generative AI which, as we have repeatedly heard, is a game-changer. Moreover, the AADC is currently restricted to information society services, which leaves a gaping hole. This amendment would address this gap.
There is an argument that the proposed code could be combined with the AADC as an update to its provisions. However, unless and until we sort out the status of the AADC in relation to the Bill, an AI kids code would be better formed as a stand-alone code. A UK code of practice on children and AI would ensure that data processors consider the fundamental rights and freedoms of children, including their safety, as they develop their products and perhaps even give innovators the appetite to innovate with children in mind.
As I pointed out at the beginning, there are many people globally working on this agenda. I hope that as we are the birthplace of the AADC and the Online Safety Act, the Government will adopt this suggestion and again be a forerunner in child privacy and safety. If, however, the Minister once again says that protections for children are not necessary, let me assure him that they will be put in place by others, and we will be a rule taker not a rule maker.
My Lords, I rise with the advantage over the noble Lord, Lord Clement-Jones, in that I will speak to only one amendment in this group; I therefore have the right page in front of me and can note that I will speak to Amendment 252, tabled by the noble Lord, Lord Clement-Jones, and signed by me and the noble Lords, Lord Watson of Wyre Forest and Lord Maude of Horsham.
I apologise that I was not with the Committee earlier today, but I was chairing a meeting about the microbiome, which was curiously related to this Committee. One issue that came up in that meeting was data and data management and the great uncertainties that remain. For example, if a part of your microbiome is sampled and the data is put into a database, who owns that data about your microbiome? In fact, there is no legal framework at the moment to cover this. There is a legal framework about your genome, but not your microbiome. That is a useful illustration of how fast this whole area is moving and how fast technology, science and society are changing. I will actually say that I do not blame the Government for the fact of this gaping hole as it is an international hole. It is a demonstration of how we need to race to catch up as legislators and regulators to deal with the problem.
This relates to Amendment 252 in the sense that perhaps this is an issue that has arisen over time, kind of accidentally. However, I want to credit a number of campaigners, among them James O’Malley, who was the man who drew my attention to this issue, as well as Peter Wells, Anna Powell-Smith and Hadley Beeman. They are people who have seen a really simple and basic problem in the way that regulation is working and are reaching out, including, I am sure, to many noble Lords in this Committee. This is a great demonstration of how campaigning has at least gone part of the way to working. I very much hope that, if not today, then some time soon, we can see this working.
What we are talking about here, as the noble Lord, Lord Clement-Jones, said, is the postal address file. It is held as a piece of private property by Royal Mail. It is important to stress that this is not people’s private information or who lives at what address; it is about where the address is. As the noble Lord, Lord Clement-Jones, set out, all kinds of companies have to pay Royal Mail to have access to this basic information about society, basic information that is assembled by society, for society.
The noble Lord mentioned Amazon having to pay for the file. I must admit that I feel absolutely no sympathy there. I am no fan of the great parasite. It is an interesting contrast to think of Amazon paying, but also to think of an innovative new start-up company, which wants to be able to access and reach people to deliver things to their homes. For this company, the cost of acquiring this file could be prohibitive. It could stop it getting started and competing against Amazon.
Yes, I am happy to commit to that. As I said, we look forward to talking with the noble Baroness and others who take an interest in this important area.
Clause 33 already includes a measure that would allow the Secretary of State to request the ICO to publish a code on any matter that she sees fit, so this is an issue that we could return to in the future, if the evidence supports it, but, as I said, we consider the amendments unnecessary at this time.
Finally, Amendment 252 would place a legislative obligation on the Secretary of State regularly to publish address data maintained by local authorities under open terms—that is, accessible by anyone for any purpose and for free. High-quality, authoritative address data for the UK is currently used by more than 50,000 public and private sector organisations, which demonstrates that current licensing arrangements are not prohibitive. This data is already accessible for a reasonable fee from local authorities and Royal Mail, with prices starting at 1.68p per address or £95 for national coverage.
Some 50,000 organisations access that information, but do the Government have any data on it? I am not asking for it now, but maybe the Minister could go away and have a look at this. We have heard that other countries have opened up this data. Are they seeing an increase? That is just a number; it does not tell us how many people are denied access to the data.
We have some numbers that I will come to, but I am very happy to share deeper analysis of that with all noble Lords.
There is also free access to this data for developers to innovate in the market. The Government also make this data available for free at the point of use to more than 6,000 public sector organisations, as well as postcode, unique identifier and location data available under open terms. The Government explored opening address data in 2016. At that time, it became clear that the Government would have to pay to make this data available openly or to recreate it. That was previously attempted, and the resulting dataset had, I am afraid, critical quality issues. As such, it was determined at that time that the changes would result in significant additional cost to taxpayers and represent low value for money, given the current widespread accessibility of the data. For the reasons I have set out, I hope that the noble Lords will withdraw their amendments.
Data Protection and Digital Information Bill Debate
Baroness Bennett of Manor Castle (Green Party - Life peer)
Department for Science, Innovation & Technology
(7 months, 3 weeks ago)
Grand Committee
My Lords, this is a very small and modest amendment, adding a fifth element to a list. Clause 85 is very long, so I will try to keep to its key elements. The clause
“confers powers on the Secretary of State and the Treasury to make provision in connection with access to customer data and business data”.
It is particularly focused on information about
“the supply or provision of goods, services and digital content”
by a business. The four elements are these. The first is where it is “supplied or provided”; the second is “prices or other terms”; the third is “how they are used”; and the fourth is “performance or quality”. That fourth element does not cover the specific issue that my modest Amendment 195A proposes to add: the energy and carbon intensity of goods, services or digital content.
This might be seen as an attempt at future-proofing and including something which is a fast-growing area of great consumer concern—it should be of government concern too in the light of the Climate Change Act and the Government’s responsibilities. It would add a modest piece of possibility. I stress that, as the explanatory statement says, this can be required; it does not demand that it has to be required, but it provides the possibility that it can be.
There is a parallel here. When you go into a shop to think about buying white goods because you need to replace a fridge or washing machine, you expect, as a matter of course, to see an energy performance certificate that will tell you how much electricity it will use or, in the case of gas cookers, how much energy. We now expect that as standard, but of course, that is not focused on what is in the appliance but on what it will use.
The other obvious example is energy performance certificates in relation to housing. Again, that is something that could probably be considerably improved, but there has been some step towards thinking about issues around energy use rather than what is put in. In that context of building, we are seeing a great deal of focus—and, increasingly, a great deal of planning focus—on the issue of embodied carbon in buildings. This is taking that further, in terms of goods, services and digital provision.
Perhaps the obvious reason why a future Government might want to do this is that, if we think of the many areas of this so-called green rating in environmental standards, we have seen a profusion of different standards, labels and models. That has caused considerable confusion and uncertainty for consumers. If a Government were to say that this was the kind of step that would be used, it would give a standard to apply across the digital fields that would be clearly understood and not open to gaming by bad actors, by just creating their own standard, and so on.
Take, for example, the Mintel sustainability barometer—it is a global study but is reflective, I think, of what is happening in the UK. Consumers are increasingly demanding this information; they really want to know the environmental impact, including the impact of the production of whatever they are purchasing. This is information that consumers really want.
The other thing that I would point to in terms of this future-proofing approach is the OECD’s Inclusive Forum on Carbon Mitigation Approaches. That is rather a mouthful. In February, it put out a study entitled—another mouthful—Towards more accurate, timely, and granular product-level carbon intensity metrics: A Scoping Note. That makes it clear that we are talking here about something that is for the future; something that is being developed, but developed fast. If we think about the Government’s responsibilities within the Climate Change Act and the public desire, this modest addition, providing the legislative framework for future action, is a small positive step. I beg to move.
My Lords, I shall speak to Amendment 218, which is in my name and those of the right reverend Prelate the Bishop of Oxford and the noble Baroness, Lady Parminter. I thank them for their support.
I apologise to the Minister, because I think this amendment is typical of the increasing way in which we will see environmental and particularly climate change issues popping up in Bills that belong not to Defra, DESNZ or DLUHC but to other departments. Because so many economic and other activities have an impact on these issues, that will be a pattern for Bills. He is playing on unfamiliar turf on this one, I am sure, so I sympathise with him.
“This amendment would require Ministers and public authorities, such as regulators”
when they make significant announcements about policy change, to disclose any analysis they have done of the
“impact of announcements … on UK climate change mitigation targets, adaptation to climate impacts and nature targets”.
The sorts of announcements that this amendment refers to include the introduction of primary legislation, obviously; changes to the timing, level and scope of government targets; large public sector procurement contracts; big infrastructure spending commitments; and any other policies that have the potential to have significant impact on climate and nature targets and climate change adaptation.
I firmly believe, and I have the support of the clerks, that this accords with the provision in the Long Title of the Bill
“to make provision about the disclosure of information to improve public service delivery”.
The information disclosed has to be accurate, timely and machine-readable. The Secretary of State would give guidance on the format of that disclosure following wide consultation with those involved, especially across all departments, because it will be an issue that involves all departments.
So why is the amendment needed? At the moment, the Government are required to publish a whole load of reports on environmental impacts but many of them are periodic, or possibly only annual and high level. For example, the Government are required to publish periodic high-level delivery plans on net zero under Sections 13 and 14 of the Climate Change Act. However, these leave unquantified many emissions savings and they are not revised at all when policies change.
The Government recently decided to delay the date of a ban on new fossil fuel cars and vans; to delay the proposed ban on further installation of oil, LPG and coal heating systems; and to delay the rollout of the clean heat market mechanism. The Government failed to report any greenhouse gas impacts from these measures, which were pretty substantial announcements. Indeed, the Secretary of State for DESNZ argued that it would not be appropriate, or a requirement, to update and publish a revised version of the carbon budget delivery plan every time that there was a change in policy. That is not what this amendment argues for; it simply reflects that one would expect the Government, when making such significant announcements, to have looked at what the impact on climate change issues would be.
The amendment would simply require the Government to publish any analysis that they have done on impact assessments or to publish the fact that they have not done any such analysis—one can draw one’s own conclusions from the fact that they have not done that. The Environmental Audit Committee in the other place, around the time of the announcements of which I gave examples, went so far as to challenge the Prime Minister to provide clarity on how the Government intended to fill the emission reduction gap caused by the proposed rollback of existing policies and did not get a satisfactory answer.
There are similar current arrangements for reports on adaptation and resilience to climate change. Sections 56 and 58 of the Climate Change Act require, again, periodic reporting at a high level on adaptation to climate change. That reporting has not been updated when policies have changed. As far as the introduction of new legislation is concerned, Section 20 of the Environment Act requires a statement on environmental law by government when there is environmental content in any new Bill. However, we already know from bitter experience that the Government interpret “environmental content” rather tightly.
All but one of the 28 Bills considered by Parliament in this current Session stated that they did not contain environmental law at all, whereas we can see that several of them have a clear environmental impact. For example, the Economic Activity of Public Bodies (Overseas Matters) Bill—I should be talking now about an amendment on it across the way, as indeed, should the noble Baroness, Lady Bennett—could prevent public bodies from taking important environmental matters into account in their decision-making. However, at the time of that Bill being published, it was certified by Ministers as not containing any environmental law.
Currently, the Government publish impact assessments for new legislation, including environmental impact assessments where the proposals are expected to have an environmental impact. Again, this is interpreted very tightly by the Government. Of the 28 government Bills that we have considered in this Session, 24 reported negligible impact, zero impact or being not applicable in the greenhouse gas box of the appraisal form—or the whole box was left blank. No account was available of the evidence on which such ratings of not having any impact were based, because we did not then get any environmental impact assessment. To give one example: the Offshore Petroleum Licensing Bill simply reported that impacts were not quantified, which is pretty staggering, bearing in mind the clear environmental implications of that Bill. One would think that licensing additional petroleum extraction from the North Sea has some environmental ramifications.
We have talked about climate change impacts and adaptation impacts, and we have talked about legislation. With regard to public procurement, the Government and contracting authorities are not required to publish the greenhouse gas emissions associated with individual procurement contracts. We argued that one on the Procurement Bill and failed to get any movement. There is a procurement policy note guiding government departments to seek emission reduction plans from the firms that they are contracting with, but this is a non-statutory note—it is advice only—and it covers only the contracting companies’ own operations and not the impact emissions of the products or services being contracted for.
My Lords, I thank the Minister for his answer. This has been a fairly short but fruitful debate. We can perhaps commend the Minister for his resilience, although it feels like he was pounded back on the ropes a few times along the way.
I will briefly run through the amendments. I listened carefully to the Minister, although I will have to read it back in Hansard. I think he was trying to say that my Amendment 195A, which adds energy and carbon intensity to this list, is already covered. However, I really cannot see how that can be claimed to be the case. The one that appears to be closest is sub-paragraph (iv), which refers to “performance or quality”, but surely that does not include energy and carbon intensity. I will consider whether to come back to this issue.
The noble Baroness, Lady Young of Old Scone, presented a wonderfully clear explanation of why Amendment 218 is needed. I particularly welcome the comments from the noble Lord, Lord Bassam, expressing strong Labour support for this. Even if the Government do not see the light and include it in the Bill, I hope that the noble Lord’s support can be taken as a commitment that a future Labour Government intend to follow that practice in all their approaches.
I hope that the noble Baroness does not get too carried away on that one.
I am sure that we will revisit this at some point in future. Perhaps the noble Lord will like the fact that I am saying that it is certain that we will revisit it from a different place.
These are all really serious amendments. This is a long Committee stage but, in the whole issue of data, having regard to data adequacy is absolutely crucial, as the degree of intervention on the Minister indicated. The Green Party’s position is that we want to be rejoin-ready: we want to remain as close as possible to EU standards so that we can rejoin the EU as soon as possible.
Even without taking that approach, this is a crucial issue as so many businesses are reliant on this adequacy ruling. I was taken by a comment from the Minister, who said that the UK is committed to data adequacy. The issue here is not what the UK is saying but convincing the EU, which is not in our hands or under our control, as numerous noble Lords said.
I have no doubt that we will return to data adequacy and I hope that we will return to the innovative and creative intervention from the noble Baroness, Lady Young of Old Scone. In the meantime, I beg leave to withdraw Amendment 195A.
Data Protection and Digital Information Bill Debate
Baroness Bennett of Manor Castle (Green Party - Life peer)
Department for Work and Pensions
(7 months, 2 weeks ago)
Grand Committee
Because of the way the amendments are grouped, I have the opportunity to repeat my questions. The first one is relatively straightforward. Does the Minister accept that introducing these provisions—obviously we are talking about Amendment 234 on pensions—will discourage people from claiming pension credit? Despite all the efforts of the Government to encourage people to claim pension credit, clearly this will discourage them. Have the Government made any effort to estimate what impact this will have? Obviously, it is a very difficult task, but have they thought about it, and does the Minister accept that it will have a deterrent effect?
My second question relates to the issue I have already raised. The state pension or state pension equivalent is paid by the state, by a pension fund or by a personal pension provider. Does the Minister think it odd that there is a difference in treatment? Everyone is receiving their pension from the state, but with a person who receives their pension from a private pension scheme or personal pension provider there is not the same right to look at their bank accounts in relation to those benefits. Now I am not advocating that as a solution. The question is: does this not indicate the illogicality and extent of the Government’s powers over some people’s incomes that they do not have over other types of income? To me, particularly when it comes to the payment of a pension—a benefit paid as of right—this discontinuity points to the extent of the Government’s overreach.
My Lords, I must begin by joining the general applause for the characteristic tour de force from the noble Baroness, Lady Sherlock. I was having a flashback, because it was the noble Baroness, in debate on what is now the Pension Schemes Act 2021, who very kindly taught me, a long time ago, how to cope with Committee stage—and we are very used to that. I rise briefly to address this group, but I start by saying in relation to the last group that I entirely agree with the proposition that Clause 128 should not stand part: the spying clause should not be part of the Bill.
I have a couple of points to make on the amendments in this group, one of which was raised by the noble Lord, Lord Clement-Jones, on the last group and is about protecting the Government from themselves. The amendments put down by the noble Baroness, Lady Sherlock, are probing. However, if we were to restrict the Government’s use of these powers, they might end up at a vaguely manageable scale. It is worth raising that point when we look at these groups.
My Lords, I ask the Minister for clarification. The noble Baroness, Lady Sherlock, asked about the number of individuals; I guess it may be 24 million or 25 million. However, from what the Minister has said, the number of bank accounts subject to surveillance would be far greater than that. For example, I receive a state pension and am also a trustee of a small not-for-profit organisation; from what the Minister said, I would be caught, as would that organisation. Landlords and many others could possibly be added. It seems that the number of bank accounts would be far greater than the number of individuals. When he provides the data, can the Minister estimate how many bank accounts and transactions there might be?
I will add to that the issue of overseas bank accounts. I cannot see how the British Government can apply this measure to them. Will this not push people to go to overseas bank accounts? Or will the Government try to pursue them through challenger banks—including multiple accounts from one person who may have one original, normal current account here?
How many of these “signals” already exist in the current backlog in the business-as-usual version? What kind of investment will it take when you supercharge these powers and get many more tens of thousands of signals?