Lords Chamber

My Lords, I spoke at Second Reading about the relationship between online safety and protecting people’s mental health, a theme that runs throughout the Bill. I have not followed the progress in Committee as diligently as I would have wished, but this group of amendments has caught the eye of the Mental Health Foundation, which has expressed support. It identified Amendment 188, but I think it is the general principle that it supports. The Mental Health Foundation understands the importance of education, because it asked young people what they thought should be done. It sponsored a crucial inquiry through its organisation YoungMinds, which produced a report earlier this year, Putting a Stop to the Endless Scroll.
One of the three major recommendations that emerged from that report, drawn from the views of young people themselves, was the need for better education. It found that young people were frustrated at being presented with outdated information about keeping their details safe. They felt that they needed something far more advanced and more relevant to the online world as it is now, on how to avoid the risks from such things as image-editing apps. They needed information on the more sophisticated risks that they face—essentially what they described as design risks, where a website is designed to draw you in and keep you hooked through its algorithms.
The Bill as a whole is designed to protect children and young people from harm, but it must also, as previous speakers have made clear, provide young people themselves with tools so that they can exercise their own judgment to protect themselves and avoid being set on that well-worn path from engaging with a website to ending up with mental health problems. Eating is the classic example: you click on a website about a recipe and, step by step, you are drawn into material that damages your health through its effect on your diet.
I very much welcome this group of amendments, what it is trying to achieve and the role that it will have by educating young people to protect themselves, recognising the nature of the internet as it is now, so that they do not run the risks of affecting their mental health.
My Lords, this has probably been the most constructive and inspiring debate that we have had on the Bill. In particular, I thank the noble Lord, Lord Knight, for introducing this debate. His passion for this kind of media literacy education absolutely shines through. I thank him for kicking off in such an interesting and constructive way. I am sorry that my noble friend Lord Storey is not here to contribute as well, with his educational background. He likewise has a passion for media literacy education and would otherwise have wanted to contribute to the debate today.
I am delighted that I have found some common ground with the noble Baroness, Lady Fox. The idea of sending my noble friend Lord Allan on tour has great attractions. I am not sure that he would find it quite so attractive. I am looking forward to him coming back before sending him off around the country. I agree that he has made a very constructive contribution. I agree with much of what the noble Baroness said, and the noble Baroness, Lady Prashar, had the same instinct: this is a way of better preserving freedom of speech. If we can have those critical thinking skills so that people can protect themselves from misinformation, disinformation and some of the harms online, we can have greater confidence that people are able to protect themselves against these harms at whatever age they may be.
I was very pleased to hear the references to Lord Puttnam, because I think that the Democracy and Digital Technologies Committee report was ground-breaking in the way it described the need for digital media literacy. This is about equipping not just young people but everybody with the critical thinking skills needed to differentiate fact from fiction—particularly, as we have talked through in Committee, on the way that digital platforms operate through their systems, algorithms and data.
The noble Lord, Lord Holmes, talked about the breadth and depth needed for media and digital literacy education; he had it absolutely right about people being appropriately savvy, and the noble Baroness, Lady Bennett, echoed what he said in that respect.
I think we have some excellent amendments here. If we can distil them into a single amendment in time for Report, or a discussion with the Minister, I think we will find ourselves going forward constructively. There are many aspects of this. For instance, the DCMS Select Committee recommended that digital literacy become the fourth pillar of education, which seems to me a pretty important aspect alongside reading, writing and maths. That is the kind of age that we are in. I have quoted Parent Zone before. It acknowledges the usefulness of user empowerment tools and so on, but again it stressed the need for media literacy. What kind of media literacy? The noble Baroness, Lady Kidron, was extremely interesting when she said that what is important is not just user behaviour but making the right choices—that sort of critical thinking. The noble Lord, Lord Russell, provided an analogy with preventive health that was very important.
Our Joint Committee used a rather different phrase. It talked about a “whole of government” approach. When we look at all the different aspects, we see that it is something not just for Ofcom—I entirely agree with that—but that should involve a much broader range of stakeholders in government. We know that, out there, there are organisations such as the Good Things Foundation and CILIP, the library association, and I am sorry that the noble Baroness, Lady Lane-Fox, is not in her place to remind us about Doteveryone, an organisation that many of us admire a great deal for the work it carries out.
I think the “appropriately savvy” expression very much applies to the fraud prevention aspect, and it will be interesting when we come to the next group to talk about that as well. The Government have pointed to the DCMS online media strategy, but the noble Lord, Lord Holmes, is absolutely right to ask what its outcomes have been and what resources are being devoted towards it. The Government often point us to that strategy, here in Committee and at Oral Questions, whenever we ask how the media literacy strategy is going, so we need to kick the tyres on it, as well as on the kind of priority and resources being devoted to media literacy.
As ever, I shall refer to the Government’s response to the Joint Committee, which I found rather extraordinary. The Government responded to the committee’s recommendation about minimum standards; there is an amendment today about minimum standards. They said:
“Ofcom has recently published a new approach to online media literacy … Clause 103 of the draft Bill”—
the noble Baroness, Lady Prashar, referred to the fact that in the draft Bill there was originally a new duty on Ofcom—
“did not grant Ofcom any additional powers. As such, it is … unnecessary regulation. It has therefore been removed”.
It did add to Ofcom’s duties. Will the Minister say whether he thinks all the amendments here today would constitute unnecessary regulation? As he can see, there is considerable appetite around the Committee for the kind of media literacy duty across the board that we have talked about today. By responding to that question, he might make up for some of the disappointment that many of us feel at the Government having removed that clause.
Grand Committee

My Lords, I will speak to Amendment 46, which comes from a slightly different angle. In our report AI in the UK: Ready, Willing and Able?, our AI Lords Select Committee, which I chair, expressed its strong belief in the value of procurement by the public sector of AI applications. However, as a recent research post put it:
“Public sector bodies in several countries are using algorithms, AI, and similar methods in their administrative functions that have sometimes led to bad outcomes that could have been avoided.”
The solution is:
“In most parliamentary democracies, a variety of laws and standards for public administration combine to set enough rules to guide their proper use in the public sector.”
The challenge is to work out what is lawful, safe and effective to use.
The Government clearly understand this, yet one of the baffling and disappointing aspects of the Bill is the lack of connection to the many government guidelines applying to the procurement and use of tech, such as artificial intelligence and the use and sharing of data by those contracting with government. It is unbelievable, but it is almost as if the Government wanted to be able to issue guidance on the ethical aspects of AI and data without at the same time being accountable if those guidelines are breached and without any duty to ensure compliance.
There is no shortage of guidance available. In June 2020, the UK Government published guidelines for artificial intelligence procurement, which were developed by the UK Government’s Office for Artificial Intelligence in collaboration with the World Economic Forum, the Government Digital Service, the Government Commercial Function and the Crown Commercial Service. The UK was trumpeted as the first Government to pilot these procurement guidelines. Their purpose is to provide central government departments and other public sector bodies with a set of guiding principles for purchasing AI technology. They also cover guidance on tackling challenges that may occur during the procurement process. In connection with this project, the Office for AI also co-created the AI procurement toolkit, which provides a guide for the public sector globally to rethink the procurement of AI.
As the Government said on launch,
“Public procurement can be an enabler for the adoption of AI and could be used to improve public service delivery. Government’s purchasing power can drive this innovation and spur growth in AI technologies development in the UK.
As AI is an emerging technology, it can be more difficult to establish the best route to market for your requirements, to engage effectively with innovative suppliers or to develop the right AI-specific criteria and terms and conditions that allow effective and ethical deployment of AI technologies.”
The guidelines set out a number of AI-specific considerations within the procurement process:
“Include your procurement within a strategy for AI adoption … Conduct a data assessment before starting your procurement process … Develop a plan for governance and information assurance … Avoid Black Box algorithms and vendor lock in”,
to name just a few. The considerations in the guidelines and the toolkit are extremely useful and reassuring, although not as comprehensive or risk-based as some of us would like. But where does any duty to adhere to the principles reflecting them appear in the Bill?
There are many other sets of guidance applicable to the deployment of data and AI in the public sector, including the Technology Code of Practice, the Data Ethics Framework, the guide to using artificial intelligence in the public sector, the data open standards and the algorithmic transparency standard. There is the Ethics, Transparency and Accountability Framework, and this year we have the Digital, Data and Technology Playbook, which is the government guidance on sourcing and contracting for digital, data and technology projects and programmes. There are others in the health and defence sectors. It seems that all these are meant to be informed by the OECD’s and the G20’s ethical principles, but where is the duty to adhere to them?
It is instructive to read the recent government response to Technology Rules?, the excellent report from the Justice and Home Affairs Committee, chaired by my noble friend Lady Hamwee. That response, despite some fine-sounding phrases about responsible, ethical, legitimate, necessary, proportionate and safe AI, displays a marked reluctance to be subject to specific regulation in this area. Procurement and contract guidelines are practical instruments to ensure that public sector authorities deploy AI-enabled systems that comply with fundamental rights and democratic values, but without any legal duty backing up the various guidelines, how will they amount to more than fine aspirations? It is quite clear that the missing link in the chain is the lack of a legal duty to adhere to these guidelines.
My amendment is formulated in general terms to allow for guidance to change from time to time, but the intention is clear: to make sure that the Government turn aspiration into action and to prompt them to adopt a legal duty and a compliance mechanism, whether centrally via the CDDO, or otherwise.
My Lords, I am speaking to my Amendments 128 and 130, although the issues raised there have already been addressed by earlier speakers. I fully support the amendments spoken to by the Front Bench and Amendment 57 tabled by the Liberal Democrats.