All 2 Debates between Lord Holmes of Richmond and Lord Knight of Weymouth

Mon 16th Dec 2024
Thu 11th May 2023

Data (Use and Access) Bill [HL]

Debate between Lord Holmes of Richmond and Lord Knight of Weymouth
Lord Holmes of Richmond Portrait Lord Holmes of Richmond (Con)

My Lords, it is a pleasure to take part in the debate on this group. I support the spirit of all the amendments debated thus far.

Speaking of spirits, and it being the season, I have more than a degree of sympathy for the Minister. With so many references to her previous work, this Christmas is turning into a bit of the Ghost of Amendments Past for her. That is no bad thing, because all the amendments she put down in the past were of excellent quality: well thought through, considered and even-handed.

As has been mentioned many times, we have had three versions of a data Bill over a little more than three years. One wonders whether all the elements of this current draft have kept up with what has happened in the outside world over those three years, not least when it comes to artificial intelligence. This goes to the heart of the amendments in this group on automated decision-making.

When the first of these data Bills emerged, ADM was present—but relatively discreetly present—in our society and our economy. Now it would be fair to say that it proliferates across many areas of our economy and our society, often in situations where people find themselves at the sharpest end of the economy and the sharpest end of these automated decisions, often without even knowing that ADM was present. More than that, even on the discovery that ADM was in the mix, depending on which sector of the economy or society they find that decision being made in, they may find themselves with no or precious little redress—employment and recruitment, to name but one sector.

It being the season, it is high time when it comes to ADM that we start to talk turkey. In all the comments thus far, we are talking not just about ADM but about the principles that should underpin all elements of artificial intelligence—that is, they should be human led. These technologies should be in our human hands, with our human values feeding into human oversight: human in the loop and indeed, where appropriate, human over the loop.

That goes to elements in my two amendments in this group, Amendments 123A and 123B. Amendment 123A simply posits, through a number of paragraphs, that if someone is subject to an automated decision then they have the right to a personalised explanation of that decision. That explanation should be accessible: it should be in plain language of their choice, have no cost attached and not be in any sense technically or technologically convoluted or opaque. That would be relatively straightforward to achieve, but the positive impact for all those citizens would certainly be more than material.

Amendment 123B goes to the heart of those humans charged with the delivery of these personalised explanations. It is not enough to simply say that there are individuals within an organisation responsible for the provision of personalised explanations for automated decisions; it is critical that those individuals have the training, the capabilities and, perhaps most importantly, the authority within that organisation to make a meaningful impact regarding those personalised explanations. If not, this measure may have a small voice but would have absolutely no teeth when it comes to the citizen.

In short, ADM is proliferating so we need to ensure that we have a symmetrical situation for citizens, for consumers, and for anyone who finds themselves in any domain or sector of our economy and society. We must assert the principles: human-led, human in the loop, “Our decisions, our data”, and “We determine, we decide, we choose”. That is how I believe we can have an effective, positive, enabling and empowering AI future. I look forward to the Minister’s comments.

Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)

My Lords, I shall speak to the series of amendments on automated decision-making to which I have added my name, most of which are in the name of the noble Lord, Lord Clement-Jones. As he said, we had a rehearsal for this debate last Friday when we debated his Private Member’s Bill, so I will not delay the Committee by saying much about the generalities of ADMs in the public sector.

Suffice it to say that human involvement in overseeing AIs must be meaningful—for example, without those humans themselves being managed by algorithms. We must ensure that ADMs comply by design with the Equality Act and safeguard data subjects’ other rights and freedoms. As discussed in earlier groups, we must pay particular attention to children’s rights with regard to ADMs, and we must reinforce the obligation on public bodies to use the algorithmic transparency recording standards. I also counsel my noble friend the Minister that, as we have heard, there are many voices from civil society advising me and others that the new Article 22 of the GDPR takes us backwards in terms of protection.

That said, I want to focus on Amendment 123C, relating to ADMs in the workplace, to which I was too late to add my name but would have done. This amendment follows a series of probing amendments tabled by me to the former DPDI Bill. In this, I am informed by my work as the co-chair of the All-Party Parliamentary Group on the Future of Work, assisted by the Institute for the Future of Work. These amendments were also mirrored during the passage of the Procurement Act and competition Act to signal the importance of the workplace, and in particular good work, as a cross-cutting objective and lens for policy orientation.

Online Safety Bill

Debate between Lord Holmes of Richmond and Lord Knight of Weymouth
Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - - - Excerpts

I move this amendment in my name as part of a group of amendments on media literacy. I am grateful to Full Fact, among others, for assistance on these issues, and to Lord Puttnam. He has retired from this House, of course, but it was my pleasure to serve on the committee that he chaired on democracy and digital technology. He remains in touch and is watching from his glorious retirement in the Republic of Ireland—and he is pressing that we should address issues around media literacy in particular.

The Committee has been discussing the triple shield. We are all aware of the magic of threes—the holy trinity. Three is certainly a magic number, but we also heard about the three-legged stool. There is more stability in four, and I put it to your Lordships that, having thought about “illegal” as the first leg, “terms of service” as the second and “user empowerment tools” as the third, we should now have, as a fourth leg underpinning a better and safer environment for the online world, “better media literacy”, so that users have confidence and competence online as a result.

To use the user empowerment tools effectively, we need to be able to understand the business models of the platforms, and how we are paying for their services with our data and our attention; how platforms use our data; our data rights as individuals; and the threat of scams, catfishing, phishing and fraud, which we will discuss shortly. Then there is the national cyber threat. I was really struck, when we were on that committee that Lord Puttnam chaired, by hearing how nations such as Finland and the Baltic states regard media literacy as a national mission to protect them particularly from the threat of cyberwarfare from Russia.

We have heard about misinformation and disinformation. There are emerging technologies that we all need to be more literate about. I remember, some six or seven years ago, my wife was in a supermarket queue with her then four-year-old daughter, who turned to her and asked what an algorithm was. Could any of us then have confidently replied and given a good answer? I know that some would have been happy to do so, but we equally need to be able to answer what machine learning is, what large language models are, or what neural networks are in order to understand the emerging world of artificial intelligence.

Ofcom already has a duty under the Communications Act 2003. Incidentally, Lord Puttnam chaired the Joint Committee on that Act. It is worth asking ourselves: how is it going for Ofcom in the exercise of that duty? We can recall, I am sure, the comments last Tuesday in this Committee of the noble Baroness, Lady Buscombe, who said:

“I took the Communications Act 2003 through for Her Majesty’s Opposition, and we were doing our absolute best to future-proof the legislation. There was no mention of the internet in that piece of legislation”.—[Official Report, 9/5/23; col. 1709.]


There is no doubt in my mind that, as a result of all the changes that have taken place in the last 20 years, the duty in that Act needs updating, and that is what we are seeking to do.

It is also possible to look at the outcomes. What is the state of media literacy in the nation at the moment? I was lucky enough this weekend to share a platform at a conference with a young woman, Monica. She lives in Greenwich, goes to Alleyn’s School, is articulate and is studying computer science at A-level. When asked about the content of the computer science curriculum, which is often prayed in aid in terms of the digital and media literacy of our young people, she reminded the audience that she still has to learn about floppy disks because the curriculum struggles to keep up to date. She is not learning about artificial intelligence in school because of that very problem. The only way in which she could do so, and she did, was through an extended project qualification last year.

We then see Ofcom’s own reporting on levels of media literacy in adults. Among 16 to 24 year-olds, which would cover Monica, according to the most recent report, published earlier this year or at the end of last, only two-thirds are confident and able to recognise scam ads, compared with 76% of the population in England. Young people are also less confident than the majority in recognising search-engine advertising: only 42% are confident in differentiating between organic and advertising content in search results. Of course, young people are better at questioning the truthfulness of “factual” information online. For adults generally, the report showed that only 45% of us are confident and able to recognise search-engine advertising, and a quarter of us struggle to identify scam emails and to judge factual truthfulness online. You are less media literate, and therefore more vulnerable, if you come from the poorer parts of the population. If you are older, you are more vulnerable still to scam emails, although above average at questioning online truth and spotting ads in search engines. Finally, in 2022, Ofcom also found that 61% of social media users who say they are confident in judging whether online content is true or false actually lack the skills to do so. A lot of us are kidding ourselves about how safe we are and how much we know about the online world.

So, much more is to be done. Hence, Amendment 52A probes what the duty on platforms should be to improve media literacy and thereby establish the reliability and accuracy of journalistic content. Amendment 91 in my name requires social media and search services to put in place measures to improve media literacy and thereby explain things like the business model that currently is too often skated over by the media literacy content provided by platforms to schools and others. The noble Lord, Lord Holmes, has Amendment 91A, which is similar in intent, and I look forward to hearing his comments on that.

Amendment 98 in my name would require a code of practice from Ofcom in support of these duties and Amendment 186 would ensure that Ofcom has sufficient funds for its media literacy duties. Amendment 188 would update the Communications Act to reflect the online world that we are addressing in this Bill. I look forward to the comments from the noble Baroness, Lady Prashar, in respect of her Amendment 236, which, she may argue, does a more comprehensive job than my amendment.

Finally, my Amendment 189 in this group states that Ofsted would have to collaborate with Ofcom in pursuance of its duties, so that Ofcom could have further influence into the quality of provision in schools. Even this afternoon, I was exchanging messages with an educator in Cornwall called Giles Hill, who said to me that it is truly dreadful for schools having to mop up problems caused by this unregulated mess.

This may not be the perfect package in respect of media literacy and the need to get this right and prop up the three-legged stool, but there is no doubt from Second Reading and other comments through the Bill’s passage that this is an area where the Bill needs to be amended to raise the priority and the impact of media literacy among both service providers and the regulator. I beg to move.

Lord Holmes of Richmond Portrait Lord Holmes of Richmond (Con)

My Lords, it is a pleasure to take part in today’s proceedings. As it is my first contribution on this Bill, I declare my technology and financial services interests, as set out in the register. I also apologise for not being able to take part in the Second Reading deliberations.

It is a particular pleasure to follow my friend, the noble Lord, Lord Knight; I congratulate him on all the work that he has done in this area. Like other Members, I also say how delighted I was to be part of Lord Puttnam’s Democracy and Digital Technologies Committee. It is great to know that he is watching—hopefully on a wide movie screen from Skibbereen—because the contribution that he has made to this area over decades is without parallel. To that end, I ask my noble friend the Minister whether he has had a chance to remind himself of the recommendations in our 2020 report. Although it is coming up to three years old, so much of what is in that report is as pertinent today as it was on the date of publication.

I am in the happy position to support all the amendments in this group; they all have similar intent. I have been following the debate up to this point and have been in the Chamber for a number of previous sessions. Critically important issues have been raised in every group of amendments but, in so many ways, this group is perhaps particularly critical, because this is one of the groups that enables individuals, particularly young people, to have the tools that they—and we—need in their hands to enable them to grip this stuff, in all its positive and, indeed, all its less-positive elements.

My Amendment 91A covers much of the same ground as Amendment 91 from the noble Lord, Lord Knight. It is critical that, when we talk about media literacy, we go into some detail around the subsets of data literacy, data privacy, digital literacy and, as I will come on to in a moment, financial literacy. We need to ensure that every person has an understanding of how this online world works, how it is currently constructed and how there is no inevitability about that whatever. People need to understand how the algorithms are set up. As was mentioned on a previous group, it is not necessarily that much of a problem if somebody is spouting bile in the corner; it is not ideal, but it is not necessarily a huge problem. The problem in this world is the programmability, the focus, the targeting and the weaponising of algorithms to amplify such content for monetary return. Nothing is inevitable; it is all utterly determined by the models currently in play.

It is critical for young people, and all people, to understand how data is used and deployed. In that media literacy, perhaps the greatest understanding of all is that it is not “the data” but “our data”. It is for us, through media literacy, to determine how our data is deployed, for what purpose, to what intent and in what circumstances, rather than, all too often, it being sold on, and so on.