Data (Use and Access) Bill [HL] Debate
Lord Lucas (Conservative - Excepted Hereditary)
Grand Committee

My Lords, I was in such a hurry to apologise just now for missing Second Reading that I forgot to declare my interests and remind the Committee of my technology and, with regard to this group, charitable interests as set out in the register.
I shall speak to Amendments 95, 96, 98, 101, 102 and 104 in my name and those of the noble Lords, Lord Clement-Jones and Lord Stevenson of Balmacara, and my noble friend Lord Black of Brentwood, and Amendments 103 and 106 in my name and those of the noble Lords, Lord Clement-Jones and Lord Stevenson. I also support Amendment 162 in the name of the noble Lord, Lord Clement-Jones. I will speak only on the marketing amendments in my name and leave the noble Lord, Lord Clement-Jones, to do, I am sure, great justice to the charitable soft opt-in.
These amendments are nothing like as philosophical and emotive as the last amendment on children and AI. They aim to address a practical issue that we debated in the late spring on the Data Protection and Digital Information Bill. I will not rehearse the arguments that we made, not least because the Minister was the co-signatory of those amendments, so I know she is well versed in them.
Instead, I shall update the Committee on what has happened since then and draw noble Lords’ attention to a couple of the issues that are very real and present now. It is strange that all Governments seem reluctant to restrict the new technology companies’ use of our data but extremely keen to get into the micro detail of restricting older forms of our using data that we have all got quite used to.
That is very much the case for the open electoral register. Some 63% of people opt out of being marketed to, because they have recorded that preference on the electoral register. This is a well known and well understood use of personal data. Yet, because of the tribunal ruling, it is increasingly the case that companies cannot use the open electoral register to target the 37% of people who have said that they are quite happy to receive marketing, unless the company lets every single one of those users know that it is about to market to them. The danger is that we create a new cookie problem—a physical cookie problem—where, if you want to use a data source that has been commonplace for 40 years, you have to send some marketing to tell people that you are about to use it. That of course means that you will not do so, which reduces the data available to a lot of small and medium-sized businesses to market their products and hands it straight to the very big tech companies, which are really happy to scrape our data all over the place.
This is a strange one, where I find myself arguing that something that is not broken should not need to be fixed. I appreciate that the Minister will probably tell us that the wording in these amendments is not appropriate. As I said earlier in the year—in April, in the previous incarnation—I very much hope that, if the wording is incorrect, we can have a discussion between Committee and Report and agree on some wording that achieves what seems just practical common sense.
The tribunal ruling that created this problem recognised that it was causing one. It stated that it accepted that the loophole it created would give one company, Experian, a sizeable competitive advantage. It is a slightly perverse one: Experian has to let only 5 million people know that it might be about to use the open electoral register, while its competitors have to let 22 million people know. That just does not pass the common-sense test of practical use of data. Given the prior support that the Minister has shown on this issue, I very much hope that we can resolve it between Committee and Report. I beg to move.
My Lords, I have a couple of amendments in this group, Amendments 158 and 161. Amendment 158 is largely self-evident; it tries to make sure that, where there is a legal requirement to communicate, that communication is not obstructed by the Bill. I would say much the same of Amendment 161: again, it is obvious that there ought to be easy communication where a person’s pension is concerned, and the Bill should not obstruct it. I am not saying that these are the only ways to achieve these things, but they should be achieved.
I declare an interest in Amendment 160, in that I control the website of the Good Schools Guide, which carries advertising. The function of advertising on the web is to enable people to see things for free; it is why the web has not closed down into a subscription-only service. If people put advertisements on the web, they want to know that they are effective and have been seen, and to have some information about who has seen them. I moved a similar amendment to the previous Government’s Bill and encountered some difficulty. If the Government are of the same mind—that this requires us to be careful—I would very much welcome the opportunity of a meeting between now and Report, and I imagine others would too, to try to understand how best to make sure that advertising can flourish on the internet.
I am very happy to talk to the noble Baroness about this issue. She asked what the Government’s view is; we are listening very carefully to the Information Commissioner and the advice that he is putting together on this issue.
My Lords, I am very grateful for the answers the noble Baroness gave to my amendments. I will study carefully what she said in Hansard, and if I have anything further to ask, I will write to her.
My Lords, in response—and very briefly, given the technical nature of all these amendments—I think that we should just note that there are a number of different issues in this group, all of which I think noble Lords in this debate will want to follow up. I thank the many noble Lords who have contributed both this time round and in the previous iterations, and ask that we follow up on each of the different issues, probably separately rather than in one group, as we will get ourselves quite tangled in the web of data if we are not careful. With that, I beg leave to withdraw the amendment.
My Lords, my Amendment 115 would similarly act in that way, by making automated decision-making processes explain themselves to the people affected by them. This would be a much better way of controlling the quality of what is going on with automated decision-making than restricting that sort of information to professionals—people who are in any case overworked and have a lot of other things to do. There is no one more interested in the decision of an automated process than the person about whom it is being made. If we are to trust these systems, then their ability—far beyond any human’s—to take the time to explain why they took the decision they did, which, if the machine is any good, it knows and can easily set out, is surely the way to generate trust: you can see exactly what decision has been made and why, and you can respond to it.
This would, beyond anything else, produce a much better system for our young people when they apply for their first job. My daughter’s friends in that position are getting into the hundreds of unexplained rejections. This is not a good way to treat young people. It does not help them to improve and understand what is going on. I completely understand why firms do not explain; they have so many applications that they just do not have the time or the personnel to sit down and write a response—but that does not apply to an automated decision-making machine. It could produce a much better situation when it comes to hiring.
As I said, my principal concern, to echo that of the noble Viscount, is that it would give us sight of the decisions that have been taken and why. If it becomes evident that they are taken well and for good reasons, we shall learn to trust them. If it becomes evident that they really are not fair or understandable, we shall be in a position to demand changes.
My Lords, it is a pleasure to take part in the debate on this group. I support the spirit of all the amendments debated thus far.
Speaking of spirits, and it being the season, I have more than a degree of sympathy for the Minister. With so many references to her previous work, this Christmas is turning into a bit of the Ghost of Amendments Past for her. That is good, because all the amendments she put down in the past were of excellent quality: well thought through, carefully considered and even-handed.
As has been mentioned many times, we have had three versions of a data Bill in a little over three years. One wonders whether all the elements of the current draft have kept up with what has happened in the outside world over those three years, not least when it comes to artificial intelligence. This goes to the heart of the amendments in this group on automated decision-making.
When the first of these data Bills emerged, ADM was present—but relatively discreetly present—in our society and our economy. Now it would be fair to say that it proliferates across many areas of our economy and our society, often in situations where people find themselves at the sharpest end of the economy and the sharpest end of these automated decisions, often without even knowing that ADM was present. More than that, even on the discovery that ADM was in the mix, depending on which sector of the economy or society they find that decision being made in, they may find themselves with no or precious little redress—employment and recruitment, to name but one sector.
It being the season, it is high time when it comes to ADM that we start to talk turkey. In all the comments thus far, we are talking not just about ADM but about the principles that should underpin all elements of artificial intelligence—that is, they should be human led. These technologies should be in our human hands, with our human values feeding into human oversight: human in the loop and indeed, where appropriate, human over the loop.
That goes to elements in my two amendments in this group, Amendments 123A and 123B. Amendment 123A simply posits, through a number of paragraphs, that if someone is subject to an automated decision then they have the right to a personalised explanation of that decision. That explanation should be accessible: in plain language of their choice, free of any cost, and in no sense technically or technologically convoluted or opaque. That would be relatively straightforward to achieve, but the positive impact for all those citizens would certainly be more than material.
Amendment 123B goes to the heart of those humans charged with the delivery of these personalised explanations. It is not enough to simply say that there are individuals within an organisation responsible for the provision of personalised explanations for automated decisions; it is critical that those individuals have the training, the capabilities and, perhaps most importantly, the authority within that organisation to make a meaningful impact regarding those personalised explanations. If not, this measure may have a small voice but would have absolutely no teeth when it comes to the citizen.
In short, ADM is proliferating so we need to ensure that we have a symmetrical situation for citizens, for consumers, and for anyone who finds themselves in any domain or sector of our economy and society. We must assert the principles: human-led, human in the loop, “Our decisions, our data”, and “We determine, we decide, we choose”. That is how I believe we can have an effective, positive, enabling and empowering AI future. I look forward to the Minister’s comments.
My understanding is that it would be. Every individual who was affected would receive their own notification rather than it just being on a website, for example.
Let me just make sure I have not missed anyone out. On Amendment 123B on addressing bias in automated decision-making, compliance with the data protection principles, including accuracy, transparency and fairness, will ensure that organisations take the necessary measures to address the risk of bias.
On Amendment 123C from the noble Lord, Lord Clement-Jones, I reassure him that the Government strongly agree that employment rights should be fit for a modern economy. The plan to make work pay will achieve this by addressing the challenges introduced by new trends and technologies. I agree very much with my noble friend Lord Knight that, although we have to get this right, there are opportunities for a different form of work, and we should not see this only as a potentially negative impact on people’s lives. However, we want to get the balance right with regard to the impact on individuals, to make sure that we realise the benefits rather than the possible negative effects.
Employment rights law is more suitable than data protection law in isolation for regulating the specific use of data and technology in the workplace, as data protection law sets out general rules and principles for processing that apply in all contexts. Noble Lords can rest assured that we take the impact on employment and work very seriously, and as part of our plan to make work pay and the Employment Rights Bill, we will return to these issues.
On Amendments 119, 120, 121 and 122, tabled by the noble Lord, Lord Clement-Jones, the noble Viscount, Lord Colville, and my noble friend Lord Knight, the Government share the noble Lords’ belief in the importance of public sector algorithmic transparency, and, as the noble Lord, Lord Clement-Jones, reminded us, we had a very good debate on this last week. The algorithmic transparency recording standard is already mandatory for government departments and arm’s-length bodies. This is a cross-government policy mandate underpinned by digital spend controls, which means that when budget is requested for a relevant tool, the team in question must commit to publishing an ATRS record before receiving the funds.
As I said on Friday, we are implementing this policy accordingly, and I hope to publish further records imminently. I very much hope that when noble Lords see what I hope will be a significant number of new records on this, they will be reassured that the nature of the mandation and the obligation on public sector departments is working.
Policy routes also enable us to provide detailed guidance to the public sector on how to carry out its responsibilities and monitor compliance. Examples include the data ethics framework, the generative AI framework, and the guidelines for AI procurement. Additionally, the data protection framework already achieves some of the intended outcomes of these amendments. It requires organisations, including public authorities, to demonstrate how they have identified and mitigated risks when processing personal data. The ICO provides guidance on how organisations can audit their privacy management and ensure a high level of data protection compliance.
I know I have given a great deal of detail there. If I have not covered all the points that the noble Lords have raised, I will write. In the meantime, given the above assurances, I hope that the noble Lord will withdraw his amendment.
My Lords, I would be very grateful if the Minister wrote to me about Amendment 115. I have done my best before and after to study Clause 80 to understand how it provides the safeguards she describes, and have failed. If she or her officials could take the example of a job application and the responses expected from it, and take me through the clauses to understand what sort of response would be expected and how that is set out in the legislation, I would be most grateful.
My Lords, I have Amendment 201 in this group. At the moment, Action Fraud does not record attempted fraud; it has to have been successful for the website to agree to record it. I think that results in the Government taking decisions based on distorted and incomplete data. Collecting full data must be the right thing to do.
My Lords, I had expected the noble Baroness, Lady Owen of Alderley Edge, to be in the Room at this point. She is not, so I wish to draw the Committee’s attention to her Amendment 210. On Friday, many of us were in the Chamber when she made a fantastic case for her Private Member’s Bill. It obviously dealt with a much broader set of issues but, as we have just heard, the overwhelming feeling of the House was to support her. I think we would all like to see the Government wrap it up, put a bow on it and give it to us all for Christmas. But, given that that was not the indication we got, I believe that the noble Baroness’s intention here is to deal with the fact that the police are giving phones and devices back to perpetrators with the images remaining on them. That is an extraordinary revictimisation of people who have been through enough. So, whether or not this is the exact wording or way to do it, I urge the Government to look on this carefully and positively to find a way of allowing the police the legal right to delete data in those circumstances.
I have Amendment 135A in this group. The Bill provides a new set of duties for the Information Commissioner but no strategic framework of the kind the DPDI Bill provided. The Information Commissioner is a whole-economy regulator, and to my mind the Government’s strategic priorities should bear on it. This amendment would provide an enabling power, such as the Competition and Markets Authority, which is in an equivalent economic position, already has.
My Lords, I have huge sympathy for, and experience of, many of the issues raised by the noble Lord, Lord Clement-Jones, but, given the hour, I will speak only to Amendment 145 in my name and those of the noble Baroness, Lady Harding, my noble friend Lord Russell and the noble Lord, Lord Stevenson. Given that I am so critical, I want to say how pleased I am to see the ICO reporting requirements included in the Bill.
Amendment 145 is very narrow. It would require the ICO to report specifically and separately on children. It is fair to say that one of the many frustrations for those of us who spend our time advocating for children’s privacy and safety is trying to extrapolate child-specific data from generalised reporting. Often the data is simply not reported, because generalised reporting usefully hides some of the inadequacies in the level of protection afforded to children. For example, none of the community guidelines enforcement reports published for Instagram, YouTube, TikTok or Snapchat provides a breakdown of the violation rate by age group, even though that would provide valuable information for academics, Governments, legislators, NGOs and, of course, regulators. It was a point of contention between many civil society organisations and Ofcom that there was said to be no evidence that children of different ages react in different ways—which, for anyone who has had children, is clearly not the case.
Similarly, for many years we struggled to understand Ofcom’s reporting because older children were included in a group that went up to 24, and it took over 10 years for that to change. It seems to me—I hope the Government agree—that since children are entitled to specific data privacy benefits, it follows that the application and enforcement of those benefits should be reported separately. I hope that the Government can give a quick yes on this small but important amendment.