Lord Knight of Weymouth debates involving the Department for Digital, Culture, Media & Sport during the 2019 Parliament

Mon 10th Jul 2023: Online Safety Bill, Lords Chamber (Report stage: Part 2)
Mon 10th Jul 2023: Online Safety Bill, Lords Chamber (Report stage: Part 1)
Thu 22nd Jun 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 2)
Thu 22nd Jun 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 1)
Thu 25th May 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 2)
Thu 25th May 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 1)

Online Safety Bill

Lord Clement-Jones (LD)

My Lords, I thank the Minister for his introduction today and for his letter, which set out the reasons for the very welcome amendments he has tabled. First, I must congratulate the noble Baroness, Lady Stowell, on her persistence in pushing amendments of this kind to Clause 45, which will considerably increase the transparency of the Secretary of State’s directions if they are to take place. They are extremely welcome as amendments to Clause 45.

Of course, there is always a “but”—by the way, I am delighted that the Minister took the advice of the House and clearly spent his summer reading through the Bill in great detail, or we would not have seen these amendments, I am sure—but I am just sorry that he did not take the opportunity also to address Clause 176 in terms of the threshold for powers to direct Ofcom in special circumstances, and of course the rather burdensome powers in relation to the Secretary of State’s guidance on Ofcom’s exercise of its functions under the Bill as a whole. No doubt we will see how that works out in practice and whether these powers are going to be used on a frequent basis.

My noble friend Lord Allan—and I must congratulate both him and the noble Lord, Lord Knight, on addressing this very important issue—has set out five assurances that he is seeking from the Minister. I very much hope that the Minister can give those today, if possible.

Congratulations are also due to the noble Baroness, Lady Kennedy, for finding a real loophole in the offence, which has now been amended. We are all delighted to see that the point has been well taken.

Finally, on the point raised by the noble Lord, Lord Rooker, clearly it is up to the Minister to respond to the points made by the committee. All of us would have preferred to see a comprehensive scheme in the primary legislation, but we are where we are. We wanted to see action on apps; they are to some extent circumscribed within the terms of the Bill. The terms of the Bill—as we have discussed—particularly with the taking out of “legal but harmful”, do not give a huge amount of leeway, so this is not perhaps as skeletal a provision as one might otherwise have thought. Those are my reflections on what the committee has said.

Lord Knight of Weymouth (Lab)

My Lords, I do not know how everyone has spent their summer, but this feels a bit like we have been working on a mammoth jigsaw puzzle and we are now putting in the final pieces. At times, through the course of this Bill, it has felt like doing a puzzle in the metaverse, where we have been trying to control an unreliable avatar that is actually assembling the jigsaw—but that would be an unfair description of the Minister. He has done really well in reflecting on what we have said, influencing his ministerial colleagues in a masterclass of managing upwards, and coming up with reasonable resolutions to previously intractable issues.

We are trusting that some of the outcome of that work will be attended to in the Commons, as the noble Baroness, Lady Morgan, has said, particularly the issues that she raised on risk, that the noble Baroness, Lady Kidron, raised on children’s safety by design, and that my noble friend Lady Merron raised on animal cruelty. We are delighted at where we think these issues have got to.

For today, I am pleased that the concerns of the noble Baroness, Lady Stowell, on Secretary of State powers, which we supported, have been addressed. I also associate myself with her comments on parliamentary scrutiny of the work of the regulator. Equally, we are delighted that the Minister has answered the concerns of my noble friend Lady Kennedy and that he has secured the legislative consent orders which he informed us of at the outset today. We would be grateful if the Minister could write to us answering the points of my noble friend Lord Rooker, which were well made by him and by the Delegated Powers Committee.

I am especially pleased to see that the issues which we raised at Report on remote access have been addressed. I feel smug, as I had to press quite hard for the Minister to leave the door open to come back at this stage on this. I am delighted that he is now walking through the door. Like the noble Lord, Lord Allan, I have just a few things that I would like clarification on—the proportional use of the powers, Ofcom taking into account user privacy, especially regarding live user data, and that the duration of the powers be time-limited.

Finally, I thank parliamentarians on all sides for an exemplary team effort. With so much seemingly falling apart around us, it is encouraging that, when we have common purpose, we can achieve a lot, as we have with this Bill.

Lord Parkinson of Whitley Bay (Con)

My Lords, let me first address the points made by the noble Lord, Lord Rooker. I am afraid that, like my noble friend Lady Stowell of Beeston, I was not aware of the report of your Lordships’ committee. Unlike her, I should have been. I have checked with my private office and we have not received a letter from the committee, but I will ask them to contact the clerk to the committee immediately and will respond to this today. I am very sorry that this was not brought to my attention, particularly since the members of the committee met during the Recess to look at this issue. I have corresponded with my noble friend Lord McLoughlin, who chairs the committee, on each of its previous reports. Where we have disagreed, we have done so explicitly and set out our reasons. We have agreed with most of its previous recommendations. I am very sorry that I was not aware of this report and have not had the opportunity to provide answers for your Lordships’ House ahead of the debate.

Online Safety Bill

Lord Clement-Jones (LD)

My Lords, I welcome the Minister’s Amendment 238A, which I think was in response to the DPRRC report. The sentiment around the House is absolutely clear about the noble Baroness’s Amendment 245. Indeed, she made the case conclusively for the risk basis of categorisation. She highlighted Zoe’s experience, and I struggle to understand why the Secretary of State is resisting the argument. She knocked down the ninepins of legal uncertainty and showed how the issue was broader than children and illegal content, by reference to Clause 12. The noble Baroness, Lady Finlay, added to the knocking down of those ninepins.

Smaller social media platforms will, on the current basis of the Bill, fall outside category 1. The Royal College of Psychiatrists made it pretty clear that the smaller platforms might be less well moderated and more permissive of dangerous content. It is particularly concerned about the sharing of information about methods of suicide or dangerous eating disorder content. Those are very good examples that it has put forward.

I return to the scrutiny committee again. It said that

“a more nuanced approach, based not just on size and high-level functionality, but factors such as risk, reach, user base, safety performance, and business model”

should be adopted. It seems that many small, high-harm services will be excluded unless we go forward on the basis set out by the noble Baroness, Lady Morgan: the kind of breadcrumbing we have talked about during the passage of the Bill. On the other hand, sites such as Wikipedia, as mentioned by my noble friend, will be swept into the net despite being low risk.

I have read the letter from the Secretary of State which the noble Baroness, Lady Morgan, kindly circulated. I cannot see any argument in it why Amendment 245 should not proceed. If the noble Baroness decides to test the opinion of the House, on these Benches we will support her.

Lord Knight of Weymouth (Lab)

My Lords, I have good news and bad news for the Minister. The good news is that we have no problem with his amendments. The bad news, for him, is that we strongly support Amendment 245 from the noble Baroness, Lady Morgan of Cotes, which, as others have said, we think is a no-brainer.

The beauty of the simple amendment has been demonstrated; it just changes the single word “and” to “or”. It is of course right to give Ofcom leeway—or flexibility, as the noble Baroness, Lady Finlay, described it—in the categorisation and to bring providers into the safety regime. What the noble Baroness, Lady Morgan, said about the smaller platforms, the breadcrumbing relating to the Jake Davison case and the functionality around bombarding Zoe Lyalle with those emails told the story that we needed to hear.

As it stands, the Bill requires Ofcom to always be mindful of size. We need to be more nuanced. From listening to the noble Lord, Lord Allan of Hallam—with his, as ever, more detailed analysis of how things work in practice—my concern is that in the end, if it is all about size, Ofcom will end up having to have a much larger number in scope on the categorisation of size in order to cover all the platforms that it is worried about. If we could give flexibility around size or functionality, that would make the job considerably easier.

We on this side think that categorisation should happen with a proportionate, risk-based approach. We think the flexibility should be there, and the Minister is reasonable—come on, what’s not to like?

Lord Parkinson of Whitley Bay (Con)

My Lords, I shall explain why the simple change of one word is not as simple as it may at first seem. My noble friend’s Amendment 245 seeks to amend the rule that a service must meet both a number-of-users threshold and a functionality threshold to be designated as category 1 or 2B. It would instead allow the Secretary of State by regulation to require a service to have to meet only one or other of the two requirements. That would mean that smaller user-to-user services could be so categorised by meeting only a functionality threshold.

In practical terms, that would open up the possibility of a future Secretary of State setting only a threshold condition about the number of users, or alternatively about functionality, in isolation. That would create the risk that services with a high number of users but limited functionality would be caught in scope of category 1. That could be of particular concern to large websites that operate with limited functionality for public interest reasons, and I am sure my noble friend Lord Moylan can think of one that fits that bill. On the other hand, it could capture a vast array of low-risk smaller services merely because they have a specific functionality—for instance, local community fora that have livestreaming capabilities. So we share the concerns of the noble Lord, Lord Allan, but come at it from a different perspective from him.

My noble friend Lady Morgan mentioned the speed of designation. The Bill’s approach to the pace of designation for the category 1 watchlist and register is flexible—deliberately so, to allow Ofcom to act as quickly as is proportionate to each emerging service. Ofcom will have a duty proactively to identify, monitor and evaluate emerging services, which will afford it early visibility when a service is approaching the category 1 threshold. It will therefore be ready to act accordingly to add services to the register should the need arise.

The approach set out in my noble friend’s Amendment 245 would not allow the Secretary of State to designate individual services as category 1 if they met one of the threshold conditions. Services can be designated as category 1 only if they meet all the relevant threshold conditions set out in the regulations made by the Secretary of State. That is the case regardless of whether the regulations set out one condition or a combination of several conditions.

The noble Baroness, Lady Finlay, suggested that the amendment would assist Ofcom in its work. Ofcom itself has raised concerns that amendments such as this—to introduce greater flexibility—could increase the risk of legal challenges to categorisation. My noble friend Lady Morgan was part of the army of lawyers before she came to Parliament, and I am conscious that the noble Lord, Lord Clement-Jones, is one as well. I hope they will heed the words of the regulator; this is not a risk that noble Lords should take lightly.

I will say more clearly that small companies can pose significant harm to users—I have said it before and I am happy to say it again—which is why there is no exemption for small companies. The very sad examples that my noble friend Lady Morgan gave in her speech related to illegal activity. All services, regardless of size, will be required to take action against illegal content, and to protect children if they are likely to be accessed by children. This is a proportionate regime that seeks to protect small but excellent platforms from overbearing regulation. However, I want to be clear that a small platform that is a font of illegal content cannot use its size as an excuse for not dealing with it.

Category 1 services are those services that have a major influence over our public discourse online. Again, I want to be clear that designation as a category 1 service is not based only on size. The thresholds for category 1 services will be based on the functionalities of a service as well as the size of the user base. The thresholds can also incorporate other characteristics that the Secretary of State deems relevant, which could include factors such as a service’s business model or its governance. Crucially, Ofcom has been clear that it will prioritise engagement with high-risk or high-impact services, irrespective of their categorisation, to understand their existing safety systems and how they plan to improve them.

--- Later in debate ---
Moved by
270A: After Clause 144, insert the following new Clause—
“Establishment of the Advocacy Body for Children
(1) There is to be a body corporate (“the Advocacy Body for Children”) to represent the interests of child users of regulated services.(2) A “child user”—(a) means any person aged 17 years or under who uses or is likely to use regulated internet services, and(b) includes both any existing child user and any future child user.(3) The functions of the Advocacy Body for Children must include, in relation to regulated services—(a) representing the interests of child users;(b) the protection and promotion of those interests;(c) monitoring implications of this Act’s implementation for those interests;(d) consideration of children’s rights under the United Nations Convention on the Rights of the Child, including (but not limited to) their participation rights;(e) any other matter connected with those interests.(4) The “interests of child users” means the interests of children in relation to the discharge by any regulated company of its duties under this Act, including—(a) safety duties about illegal content, in particular CSEA content,(b) safety duties protecting children,(c) children’s access assessment duties, and(d) other enforceable requirements relating to children. 
(5) The Advocacy Body for Children must—(a) have due regard to the interests of child users that display one or more protected characteristics within the meaning of the Equality Act 2010,(b) assess emerging threats to child users of regulated services and bring information regarding those threats to OFCOM, and(c) publish an annual report related to the interests of child users.(6) The Advocacy Body for Children may undertake research on its own account.(7) The Advocacy Body for Children is to be defined as a statutory consultee for OFCOM’s regulatory decisions which impact upon the interests of children.(8) To establish the Advocacy Body for Children, OFCOM must—(a) appoint an organisation or organisations known to represent all children in the United Kingdom to be designated with the functions under this section, or(b) create an organisation to carry out the designated functions.(9) The governance functions of the Advocacy Body for Children must—(a) with the exception of the approval of its budget, remain independent of OFCOM, and(b) include representation of child users by young people under the age of 25 years.(10) The budget of the Advocacy Body for Children will be subject to annual approval by the board of OFCOM.(11) The Secretary of State must give directions to OFCOM as to how it should recover the costs relating to the expenses of the Advocacy Body for Children, or the Secretary of State in relation to the establishment of the Advocacy Body, through the provisions to require a provider of a regulated service to pay a fee (as set out in section 75).”Member’s explanatory statement
This new Clause would require Ofcom to establish a new advocacy body for child users of regulated internet services to represent, protect and promote their interests.
Lord Knight of Weymouth (Lab)

My Lords, I am grateful to the noble Baroness, Lady Newlove, and the noble Lord, Lord Clement-Jones, for adding their names to Amendment 270A, and to the NSPCC for its assistance in tabling this amendment and helping me to think about it.

The Online Safety Bill has the ambition, as we have heard many times, of making the UK the safest place for a child to be online. Yet, as drafted, it could pass into legislation without a system to ensure that children’s voices themselves can be heard. This is a huge gap. Children are experts in their own lives, with a first-hand understanding of the risks that they face online. It is by speaking to, and hearing from, children directly that we can best understand the harms they face online—what needs to change and how the regulation is working in practice.

User advocates are commonplace in most regulated environments and are proven to be effective. Leading children’s charities such as 5Rights, Barnardo’s and YoungMinds, as well as organisations set up by bereaved parents campaigning for child safety online, such as the Molly Rose Foundation and the Breck Foundation, have joined the NSPCC in calling for the introduction of this advocacy body for children, as set out in the amendment.

I do not wish to detain anyone. The Minister’s response when this was raised in Committee was, in essence, that this should go to the Children’s Commissioner for England. I am grateful to her for tracking me down in a Pret A Manger in Russell Square on Monday and having a chat. She reasonably pointed out that much of the amendment reads a bit like her job description, but she could also see that it is desirable to have an organisation such as the NSPCC set up a UK-wide helpline. There are children’s commissioners for Scotland, Wales and Northern Ireland who are supportive of a national advocacy body for children. She was suggesting—if the Minister agrees that this seems like a good solution—that they could commission a national helpline that works across the United Kingdom, which would then advise a group that she could convene, including the children’s commissioners from the other nations of the United Kingdom. If that seems a good solution to the Minister, I do not need to press the amendment, we are all happy, and we can get on with the next group. I beg to move.

Lord Allan of Hallam (LD)

My Lords, I just want to make some brief comments in support of the principle of what the noble Lord, Lord Knight, is aiming at in this amendment.

The Bill is going to have a profound impact on children in the United Kingdom. We hope that the most profound impact will be that it will significantly advance their interests in terms of safety online. But it will also potentially have a significant impact on what they can access online and the functionality of different services. They are going to experience new forms of age assurance, about which they may have very strong views. For example, the use of their biometric data to estimate their age will be there to protect them, but they may still have strong views about that.

I have said many times that there may be some measures in the Bill that will encourage services to become 18-plus only. That is not adult in the sense of adult content. Ordinary user-to-user social media services may look at the obligations and say, “Frankly, we would much rather restrict ourselves to users from the UK who identify as being 18-plus, rather than have to take on board all the associated liabilities in respect of children”—not because they are irresponsible, but precisely because they are responsible, and they can see that there is a lot of work to do in order to be legally and safely available to those under 18. For all those reasons, it is really important that the child advocacy body looks at things such as the United Nations Convention on the Rights of the Child and the rights of children to access information, and that it is able to take a view on them.

The reason I think that is important—as will any politician who has been out and spoken in schools—is that very often children are surprising in terms of what they see as their priorities. We make assumptions about their priorities, which can often be entirely wrong. There has been some really good work done on this. There was a project called EU Kids Online, back in the days of the EU, which used to look at children right across the European Union and ask them what their experience of being online was like and what was important to them. There are groups such as Childnet International, which for years has been convening groups of children and taking them to places such as the Internet Governance Forum. That always generates a lot of information that we here would not have thought of, about what children feel is really important to them about their online experience.

For all those reasons, it really would be helpful to institutionalise this in the new regime as some kind of body that looks in the round at children’s interests—their interests to stay safe, but also their interests to be able to access a wide variety of online services and to use the internet as they want to use it. I hope that that strengthens the case the noble Lord, Lord Knight, has made for such a body to exist in some kind of coalition-like format.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

My Lords, first, I reassure noble Lords that the Government are fully committed to making sure that the interests of children are both represented and protected. We believe, however, that this is already achieved through the provisions in the Bill.

Rather than creating a single advocacy body to research harms to children and advocate on their behalf, as the noble Lord’s amendment suggests, the Bill achieves the same effect through a combination of Ofcom’s research functions, the consultation requirements and the super-complaints provisions. Ofcom will be fully resourced with the capacity and technological ability to assess and understand emerging harms and will be required to research children’s experiences online on an ongoing basis.

For the first time, there will be a statutory body in place charged with protecting children from harm online. As well as its enforcement functions, Ofcom’s research will ensure that the framework remains up to date and that Ofcom itself has the latest, in-depth information to aid its decision-making. This will ensure that new harms are not just identified in retrospect when children are already affected by them and complaints are made; instead, the regulator will be looking out for new issues and working proactively to understand concerns as they develop.

Children’s perspectives will play a central role in the development of the framework, as Ofcom will build on its strong track record of qualitative research to ensure that children are directly engaged. For example, Ofcom’s ongoing programme, Children’s Media Lives, involves engaging closely with children and tracking their views and experiences year on year.

Alongside its own research functions, super-complaints will ensure that eligible bodies can make complaints on systemic issues, keeping the regulator up to date with issues as they emerge. This means that if Ofcom does not identify a systemic issue affecting children for any reason, it can be raised and then dealt with appropriately. Ofcom will be required to respond to the super-complaint, ensuring that its subsequent decisions are understood and can be scrutinised. Complaints by users will also play a vital role in Ofcom’s horizon scanning and information gathering, providing a key means by which new issues can be raised.

The extensive requirements for Ofcom to consult on codes of practice and guidance will further ensure that it consistently engages with groups focused on the interests of children as the codes and guidance are developed and revised. Children’s interests are embedded in the implementation and delivery of this framework.

The Children’s Commissioner will play a key and ongoing role. She will be consulted on codes of practice and any further changes to those codes. The Government are confident that she will use her statutory duties and powers effectively to understand children’s experiences of the digital world. Her primary function as Children’s Commissioner for England is to promote and protect the rights of children in England, and to promote and protect the rights of children across the United Kingdom where those rights are or may be affected by reserved matters. As the codes of practice and the wider Bill relate to a reserved area of law—namely, internet services—the Children’s Commissioner for England will be able to represent the interests of children from England, Scotland, Wales and Northern Ireland when she is consulted on the preparation of codes of practice. That will ensure that children’s voices are represented right across the UK. The Children’s Commissioner for England and her office also regularly speak to the other commissioners about ongoing work on devolved and reserved matters. Whether she does that in branches of Pret A Manger, I do not know, but she certainly works with her counterparts across the UK.

I am very happy to take back the idea that the noble Lord has raised and discuss it with the commissioner. There are many means by which she can carry out her duties, so I am very happy to take that forward. I cannot necessarily commit to putting it in legislation, but I shall certainly commit to discussing it with her. On the proposals in the noble Lord’s amendment, we are concerned that a separate child user advocacy body would duplicate the functions that she already has, so I hope with that commitment he will be happy to withdraw.

Lord Knight of Weymouth (Lab)

My Lords, I am grateful to those who have spoken in this quick debate and for the support from the noble Lord, Lord Allan of Hallam, and the noble Baroness, Lady Fox, about children’s voices being heard. I think that we are getting to the point when there will not be a quango or indeed a minefield, so that makes us all happy. The Minister almost derailed me, because so much of his speaking note was about the interests of children, and I am more interested in the voice of children being heard directly rather than people acting on their behalf and representing their interests; but his final comments, about being happy to take the idea forward, mean that I am very happy to withdraw my amendment.

Amendment 270A withdrawn.
--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

This is the philosophical question on which we still disagree. Features and functionality can be harmful but, to manifest that harm, there must be some content which they are functionally, or through their feature, presenting to the user. We therefore keep talking about content, even when we are talking about features and functionality. A feature on its own which has no content is not what the noble Baroness, Lady Kidron, my noble friend Lady Harding and others are envisaging, but to follow the logic of the point they are making, it requires some content for the feature or functionality to cause its harm.

Lord Knight of Weymouth (Lab)

But the content may not be harmful.

Lord Parkinson of Whitley Bay (Con)

Yes, even if the content is not harmful. We keep saying “content” because it is the way the content is disseminated, as the Bill sets out, but the features and functionalities can increase the risks of harm as well. We have addressed this through looking at the cumulative effects and in other ways.

Online Safety Bill

Baroness Neville-Jones (Con)

My Lords, I just want to reinforce what my noble friend Lord Bethell said about the amendments to which I have also put my name: Amendments 237ZA, 266AA and 272E. I was originally of the view that it was enough to give Ofcom the powers to enforce its own rulings. I have been persuaded that, pace my noble friend Lord Grade, the powers that have been given to Ofcom represent such a huge expansion that the likelihood of the regulator doing anything other than those things which it is obliged to do is rather remote. So I come to the conclusion that an obligation is the right way to put these things. I also agree with what has been said about the need to ensure that subsequent action is taken, in relation to a regulated service if it does not follow what Ofcom has set out.

I will also say a word about researchers. They are a resource that already exists. Indeed, there has been quite a lot of pushing, not least by me, on using this resource, not only to update the powers of the Computer Misuse Act but also to enlarge our understanding of, and ability to have information about, the operation of online services. So this is a welcome move on the part of the Government, that they see the value of researchers in this context.

My noble friend Lord Moylan made a good point that the terms under which this function is exercised have to have regard to privacy as well as to transparency of operations. This is probably one of the reasons why we have not seen movement on this issue in the Computer Misuse Act and its updating, because it is intrinsically quite a difficult issue. But I believe that it has to be tackled, and I hope very much that the Government will not delay in bringing forward the necessary legislation that will ensure both that researchers are protected in the exercise of this function, which has been one of the issues, and that they are enabled to do something worth while. So I believe the Minister when he says that the Government may need to bring forward extra legislation on this; it is almost certainly the case. I hope very much that there will not be a great gap, so that we do not see this part of the proposals not coming into effect.

Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - -

My Lords, we have had an important debate on a range of amendments to the Bill. There are some very important and good ones, to which I would say: “Better late than never”. I probably would not say that to Amendment 247A; I would maybe say “better never”, but we will come on to that. It is interesting that some of this has come to light following the debate on and scrutiny of the Digital Markets, Competition and Consumers Bill in another place. That might reinforce the need for post-legislative review of how this Bill, the competition Bill and the data Bill are working together in practice. Maybe we will need another Joint Committee, which will please the noble Lord, Lord Clement-Jones, no end.

There are many government amendments. The terms of service and takedown policy ones have been signed by my noble friend Lord Stevenson, and we support them. There are amendments on requiring information on algorithms in transparency reports; requiring search services to put into transparency reports how policies on illegal content and content that is harmful to children were arrived at; information about search algorithms; and physical access in an audit to view the operation of algorithms and other systems. Like the noble Baroness, Lady Kidron, I very much welcome, in this section anyway, that focus on systems, algorithms and process rather than solely on content.

However, Amendment 247A is problematic in respect of the trigger words, as the noble Lord, Lord Allan, referred to, of remote access and of requiring a demonstration, gathering real-time data. That raises a number of, as he said, non-trivial questions. I shall relay what some service providers have been saying to me. The Bill already provides Ofcom with equivalent powers under Schedule 12—such as rights of entry and inspection and extensive auditing powers—that could require providers to operate any equipment or algorithms to produce information for Ofcom and/or allow Ofcom to observe the functioning of the regulated service. Crucially, safeguards are built into the provisions in Schedule 12 to ensure that Ofcom exercises them only in circumstances where the service provider is thought to be in breach of its duties and/or under a warrant, which has to have judicial approval, yet there appear to be no equivalent safeguards in relation to this power. I wonder whether, as it has come relatively late, that is an oversight that the Minister might want to address at Third Reading.

The policy intent, as I understand it, is to give Ofcom remote access to algorithms to ensure that service providers located out of the jurisdiction are not out of scope of Ofcom’s powers. Could that have been achieved by small drafting amendments to Schedule 12? In that case, the whole set of safeguards that we are concerned about would be in place because, so to speak, they would be in the right place. As drafted, the amendment appears to be an extension of Ofcom’s information-gathering powers that can be exercised as a first step against a service provider or access facility without any evidence that the service is in breach of its obligations or that any kind of enforcement action is necessary, which would be disproportionate and oppressive.

Given the weight of industry concern about the proportionality of these powers and their late addition, I urge the Minister to look at the addition of further safeguards around the use of these powers in the Bill and further clarification on the scope of the amendment as a power of escalation, including that it should be exercised as a measure of last resort, and only in circumstances where a service provider has not complied with its duty under the Bill or where the service provider has refused to comply with a prior information notice.

Amendment 247B is welcome because it gives the Minister the opportunity to tell us now that he wants to reflect on all this before Third Reading, work with us and, if necessary, come back with a tightening of the language and a resolution of these issues. I know his motivation is not to cause a problem late on in the Bill but he has a problem, and if he could reflect on it and come back at Third Reading then that would be helpful.

I welcome the amendments tabled by the noble Lord, Lord Bethell, on researcher access. This is another area where he has gone to great efforts to engage across the House with concerned parties, and we are grateful to him for doing so. Independent research is vital for us to understand how this new regime that we are creating is working. As he says, it is a UK strength, and we should play to that strength and not let it slip away inadvertently. We will not get the regime right first time, and we should not trust the platforms to tell us. We need access to independent researchers, and the amendments strike a good balance.

We look forward to the Minister deploying his listening ear, particularly to what the noble Baroness, Lady Harding, had to say on backstop powers. When he said in his opening speech that he would reflect, is he keeping open the option of reflecting and coming back at Third Reading, or is he reflecting only on the possibility of coming back in other legislation?

The noble Baroness, Lady Fraser, raised an important issue for the UK regulator, ensuring that it is listening to potential differences in public opinion in the four nations of our union and, similarly, analysing transparency reports. As she says, this is not about reserved matters but about respecting the individual nations and listening to their different voices. It may well be written into the work of Ofcom by design but we cannot assume that. We look forward to the Minister’s response, including on the questions from my noble friend on the consent process for the devolved Administrations to add offences to the regime.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

A provider would have a right to bring a legal challenge against Ofcom if it considered that a particular exercise of the remote access power was unlawful. Given the serious nature of the issues under consideration, I am sure that would be looked at swiftly, but I will write to the noble Lord on the anticipated timelines while such a judicial review was pending.

Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - -

And on Schedule 12?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I will write on Schedule 12 as well.

--- Later in debate ---
Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - -

I am grateful to the Minister for giving way so quickly. I think the House is asking him to indicate now that he will go away and look at this issue, perhaps with some of us, and that, if necessary, he would be willing to look at coming back with something at Third Reading. From my understanding of the Companion, I think he needs to say words to that effect to allow him to do so, if that is what he subsequently wants to do at Third Reading.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I am very happy to discuss this further with noble Lords, but I will reserve the right, pending that discussion, to decide whether we need to return to this at Third Reading.

Amendments 270 and 272, tabled by my noble friend Lady Fraser of Craigmaddie, to whom I am very grateful for her careful scrutiny of the devolved aspects of the Bill, seek to require Ofcom to include separate analyses of users’ online experiences in England, Wales, Scotland and Northern Ireland in the research about users’ experiences of regulated services and in Ofcom’s transparency reports. While I am sympathetic to her intention—we have corresponded on it, for which I am grateful—it is important that Ofcom has and retains the discretion to prioritise information requests that will best shed light on the experience of users across the UK.

My noble friend and other noble Lords should be reassured that Ofcom has a strong track record of using this discretion to produce data which are representative of people across the whole United Kingdom. Ofcom is committed to reflecting the online experiences of users across the UK and intends, wherever possible, to publish data at a national level. When conducting research, Ofcom seeks to gather views from a representative sample of the United Kingdom and seeks to set quotas that ensure an analysable sample within each of the home nations.

It is also worth noting the provisions in the Communications Act 2003 that require Ofcom to operate offices in each of the nations of the UK, to maintain advisory committees for each, and to ensure their representation on its various boards and panels—and, indeed, on the point raised by the noble Baroness, Lady Kidron, to capture the experiences of children and users of all ages. While we must give Ofcom the discretion it needs to ensure that the framework is flexible and remains future-proofed, I hope that I have reassured my noble friend that her point will indeed be captured, reported on and be able to be scrutinised, not just in this House but across the UK.

--- Later in debate ---
Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - -

Before the Minister sits down—he has been extremely generous in taking interventions—I want to put on record my understanding of his slightly ambiguous response to Amendment 247A, so that he can correct it if I have got it wrong. My understanding is that he has agreed to go away and reflect on the amendment and that he will have discussions with us about it. Only if he then believes that it is helpful to bring forward an amendment at Third Reading will he do so.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

Yes, but I do not want to raise the hopes of the noble Lord or others, with whom I look forward to discussing this matter. I must manage their expectations about whether we will bring anything forward. With that, I beg to move.

--- Later in debate ---
Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - -

My Lords, the Government have moved on this issue, and I very much welcome that. I am grateful to the Minister for listening and for the fact that we now have Section 11 of the Communications Act being brought into the digital age through the Government’s Amendments 274B and 274C. The public can now expect to be informed and educated about content-related harms, reliability and accuracy; technology companies will have to play their part; and Ofcom will have to regularly report on progress, and will commission and partner with others to fulfil those duties. That is great progress.

The importance of this was underscored at a meeting of the United Nations Human Rights Council just two weeks ago. Nada Al-Nashif, the UN Deputy High Commissioner for Human Rights, said in an opening statement that media and digital literacy empowered individuals and

“should be considered an integral part of education efforts”.

Tawfik Jelassi, the assistant director-general of UNESCO, in a statement attached to that meeting, said that

“media and information literacy was essential for individuals to exercise their right to freedom of opinion and expression”—

I put that in to please the noble Baroness, Lady Fox—and

“enabled access to diverse information, cultivated critical thinking, facilitated active engagement in public discourse, combatted misinformation, and safeguarded privacy and security, while respecting the rights of others”.

If only the noble Lord, Lord Moylan, was in his place to hear me use the word “privacy”. He continued:

“Together, the international community could ensure that media and information literacy became an integral part of everyone’s lives, empowering all to think critically, promote digital well-being, and foster a more inclusive and responsible global digital community”.

I thought those were great words, summarising why we needed to do this.

I am grateful to Members on all sides of the House for the work that they have done on media literacy. Part of my reason for repeating those remarks is that this is so much more about empowerment than about loading safety on to individuals, as the noble Baroness, Lady Kidron, rightly said in her comments.

Nevertheless, we want the Minister to reflect on a couple of tweaks. Amendment 269C in my name is around an advisory committee being set up within six months and in its first report assessing the need for a code on misinformation. I have a concern that, as the regime that we are putting in place with this Bill comes into force and causes some of the harmful content that people find engaging to be suppressed, the algorithms will go to something else that is engaging, and that something else is likely to be misinformation and disinformation. I have a fear that that will become a growing problem that the regulator will need to be able to address, which is why it should be looking at this early.

Incidentally, that is why the regulator should also look at provenance, as in Amendment 269AA from the noble Lord, Lord Clement-Jones. It was tempting in listening to him to see whether there was an AI tool that could trawl across all the comments that he has made during the deliberations on this Bill to see whether he has quoted the whole of the joint report—but that is a distraction.

My Amendment 269D goes to the need for media literacy on systems, processes and business models, not just on content. Time and again, we have emphasised the need for this Bill to be as much about systems as content. There are contexts where individual, relatively benign pieces of content can magnify if part of a torrent that then creates harm. The Mental Health Foundation has written to many of us to make this point. In the same way that the noble Baroness, Lady Bull, asked about ensuring that those with disability have their own authentic voice heard as these media literacy responsibilities are played out, so the Mental Health Foundation wanted the same kind of involvement from young people; I agree with both. Please can we have some reassurance that this will be very much part of the literacy duties on Ofcom and the obligations it places on service providers?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

My Lords, I am grateful to noble Lords for their comments, and for the recognition from the noble Lord, Lord Knight, of the changes that we have made. I am particularly grateful to him for having raised media literacy throughout our scrutiny of this Bill.

His Amendments 269C and 269D seek to set a date by which the establishment of the advisory committee on misinformation and disinformation must take place and to set requirements for its first report. Ofcom recognises the valuable role that the committee will play in providing advice in relation to its duties on misinformation and disinformation, and has assured us that it will aim to establish the committee as soon as is reasonably possible, in recognition of the threats posed by misinformation and disinformation online.

Given the valuable role of the advisory committee, Ofcom has stressed how crucial it will be to have appropriate time to appoint the best possible committee. Seeking to prescribe a timeframe for its implementation risks impeding Ofcom’s ability to run the thorough and transparent recruitment process that I am sure all noble Lords want and to appoint the most appropriate and expert members. It would also not be appropriate for the Bill to be overly prescriptive on the role of the committee, including with regard to its first report, in order for it to maintain the requisite independence and flexibility to give us the advice that we want.

Amendment 269AA from the noble Lord, Lord Clement-Jones, seeks to add advice on content provenance to the duties of the advisory committee. The new media literacy amendments, which update Ofcom’s media literacy duties, already include a requirement for Ofcom to take steps to help users establish the reliability, accuracy and authenticity of content found on regulated services. Ofcom will have duties and mechanisms to be able to advise platforms on how they can help users to understand whether content is authentic; for example, by promoting tools that assist them to establish the provenance of content, where appropriate. The new media literacy duties will require Ofcom to take tangible steps to prioritise the public’s awareness of and resilience to misinformation and disinformation online. That may include enabling users to establish the reliability, accuracy and authenticity of content, but the new duties will not remove content online; I am happy to reassure the noble Baroness, Lady Fox, on that.

The advisory committee is already required under Clause 141(4)(c) to advise Ofcom on its exercise of its media literacy functions, including its new duties relating to content authenticity. The Bill does not stipulate what tools service providers should use to fulfil their duties, but Ofcom will have the ability to recommend in its codes of practice that companies use tools such as provenance technologies to identify manipulated media which constitute illegal content or content that is harmful to children, where appropriate. Ofcom is also required to take steps to encourage the development and use of technologies that provide users with further context about content that they encounter online. That could include technologies that support users to establish content provenance. I am happy to reassure the noble Lord, Lord Clement-Jones, that the advisory committee will already be required to advise on the issues that he has raised in his amendment.

On media literacy more broadly, Ofcom retains its overall statutory duty to promote media literacy, which remains broad and non-prescriptive. The new duties in this Bill, however, are focused specifically on harm; that is because of the nature of the Bill, which seeks to make the UK the safest place in the world to be online and is necessarily focused on tackling harms. To ensure that Ofcom succeeds in the delivery of these new specific duties with regard to regulated services, it is necessary that the regulator has a clearly defined scope. Broadening the duties would risk overburdening Ofcom by making its priorities less clear.

The noble Baroness, Lady Bull—who has been translated to the Woolsack while we have been debating this group—raised media literacy for more vulnerable users. Under Ofcom’s existing media literacy programme, it is already delivering initiatives to support a range of users, including those who are more vulnerable online, such as people with special educational needs and people with disabilities. I am happy to reassure her that, in delivering this work, Ofcom is already working not just with expert groups including Mencap but with people with direct personal experiences of living with disabilities.

The noble Lord, Lord Clement-Jones, raised Ofsted. Effective regulatory co-ordination is essential for addressing the crosscutting opportunities and challenges posed by digital technologies and services. Ofsted will continue to engage with Ofcom through its existing mechanisms, including engagement led by its independent policy team and those held with Ofcom’s online safety policy director. In addition to that, Ofsted is considering mechanisms through which it can work more closely with Ofcom where appropriate. These include sharing insights from inspections in an anonymised form, which could entail reviews of its inspection bases and focus groups with inspectors, on areas of particular concern to Ofcom. Ofsted is committed to working with Ofcom’s policy teams to work these plans up in more detail.

--- Later in debate ---
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I add my congratulations to the noble Baroness, Lady Harding, on her tenacity, and to the Minister on his flexibility. I believe that where we have reached is pretty much the right balance. There are the questions that the noble Baroness, Lady Harding, and others have asked of the Minister, and I hope he will answer those, but this is a game-changer, quite frankly. Rightly, the noble Baroness has paid tribute to the companies which have put their head above the parapet. That was not that easy for them to do when you consider that those are the platforms they have to depend on for their services to reach the public.

Unlike with the research report, there are reserved powers that the Secretary of State can use if the report is positive, which I hope it will be. I believe this could be a turning point. The digital markets and consumers Bill is coming down the track this autumn and that is going to give greater powers to make sure that the app stores can be tackled—after all, there are only two of them and they are an oligopoly. They are the essence of big tech, and they need to function in a much more competitive way.

The noble Baroness talked about timing, and it needs to be digital timing, not analogue. Four years does seem a heck of a long time. I hope the Minister will address that.

Then there is the really important aspect of harmful content. In the last group, the Minister reassured us about systems and processes and the illegality threshold. Throughout, he has tried to reassure us that this is all about systems and processes and not so much about content. However, every time we look, we see that content is there almost by default, unless the subject is raised. We do not yet have a Bill that is actually fit for purpose in that sense. I hope the Minister will use his summer break wisely and read through the Bill to make sure that it meets its purpose, and then come back at Third Reading with a whole bunch of amendments that add functionalities. How about that for a suggestion? It is said in the spirit of good will and summer friendship.

The noble Baroness raised a point about transparency when it comes to Ofcom publishing its review. I hope the Minister can give that assurance as well.

The noble Baroness, Lady Kidron, asked about the definition of app store. That is the gatekeeper function, and we need to be sure that that is what we are talking about.

I end by congratulating once again the noble Baroness and the Minister on where we have got to so far.

Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - -

My Lords, I will start with the final point of the noble Lord, Lord Clement-Jones. I remind him that, beyond the world of the smartphone, there is a small company called Microsoft that also has a store for software—it is not just Google and Apple.

Principally, I say well done to the noble Baroness, Lady Harding, in deploying all of her “winsome” qualities to corral those of us who have been behind her on this and then persuade the Minister of the merits of her arguments. She also managed to persuade the noble Lord, Lord Allan of Misery Guts, that this was a good idea. The sequence of research, report and regulate is a good one, and as the noble Lord, Lord Clement-Jones, reminded us it is being deployed elsewhere in the Bill. I agree with the noble Baroness about the timing: I much prefer two years to four years. I hope that at least Ofcom would have the power to accelerate this if it wanted to do so.

I was reminded of the importance of this in an article I read in the Guardian last week, headed:

“More than 850 people referred to clinic for video game addicts”.

This was in reference to the NHS-funded clinic, the National Centre for Gaming Disorders. A third of gamers receiving treatment there were spending money on loot boxes in games such as “Fortnite”, “FIFA”, “Minecraft”, “Call of Duty” and “Roblox”—all games routinely accessed by children. Over a quarter of those being treated by the centre were children.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, I am interested to hear what the Minister says, but could he also explain to the House the difference in status of this sort of material in Part 5 versus Part 3? I believe that the Government brought in a lot of amendments that sorted it out and that many of us hoped were for the entire Bill, although we discovered, somewhat to our surprise, that they were only in Part 5. I would be interested if the Minister could expand on that.

Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - -

My Lords, I am grateful to the noble Lord, Lord Clement-Jones, for raising this; it is important. Clause 49(3)(a)(i) mentions content

“generated directly on the service by a user”,

which, to me, implies that it would include the actions of another user in the metaverse. Sub-paragraph (ii) mentions content

“uploaded to or shared on the service by a user”,

which covers bots or other quasi-autonomous virtual characters in the metaverse. As we heard, a question remains about whether any characters or objects provided by the service itself are covered.

A scenario—in my imagination anyway—would be walking into an empty virtual bar at the start of a metaverse service. This would be unlikely to be engaging: the attractions of indulging in a lonely, morose drink at that virtual bar are limited. The provider may therefore reasonably configure the algorithm to generate characters and objects that are engaging until enough users then populate the service to make it interesting.

Of course, there is the much more straightforward question of gaming platforms. On Monday, I mentioned “Grand Theft Auto”, a game with an advisory age of 17—they are still children at that age—yet it is routinely accessed by younger children. Shockingly, an article that I read claimed that it can evolve into a pornographic experience, where the player becomes the character from a first-person angle and receives services from virtual sex workers, as part of the game design. So my question to the Minister is: does the Bill protect the user from these virtual characters interacting with users in virtual worlds?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I will begin with that. The metaverse is in scope of the Bill, which, as noble Lords know, has been designed to be technology neutral and future-proofed to ensure that it keeps pace with emerging technologies—we have indeed come a long way since the noble Lord, Lord Clement-Jones, the noble Lords opposite and many others sat on the pre-legislative scrutiny committee for the Bill. Even as we debate, we envisage future technologies that may come. But the metaverse is in scope.

The Bill will apply to companies that enable users to share content online or to interact with each other, as well as search services. That includes a broad range of services, such as websites, applications, social media services, video games and virtual reality spaces, including the metaverse.

Any service that enables users to interact, as the metaverse does, will need to conduct a child access test and, if it is likely to be accessed by children, will need to comply with the child safety duties. Content is broadly defined in the Bill as,

“anything communicated by means of an internet service”.

Where this is uploaded, shared or directly generated on a service by a user and able to be encountered by other users, it will be classed as user-generated content. In the metaverse, this could therefore include things like objects or avatars created by users. It would also include interactions between users in the metaverse such as chat—both text and audio—as well as images, uploaded or created by a user.

--- Later in debate ---
Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - -

My Lords, from this side we certainly welcome these government amendments. I felt it was probably churlish to ask why it had taken until this late stage to comply with international standards, but that point was made very well by the noble Lord, Lord Allan of Hallam, and I look forward to the Minister’s response.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I am grateful to noble Lords for their support for these amendments and for their commitment, as expected, to ensuring that we have the strongest protections in the Bill for children.

The noble Lord, Lord Allan of Hallam, asked: why only now? It became apparent during the regular engagement that, as he would expect, the Government have with the National Crime Agency on issues such as this, that these amendments would be necessary, so we are happy to bring them forward. They are vital amendments to enable law enforcement partners to prosecute offenders and keep children safe.

Reports received by the National Crime Agency are for intelligence only and so cannot be relied on as evidence. As a result, in some cases law enforcement agencies may be required to request that companies provide data in an evidential format. The submitted report will contain a limited amount of information from which law enforcement agencies will have to decide what action to take. Reporting companies may hold wider data that relate to the individuals featured in the report, which could allow law enforcement agencies to understand the full circumstances of the event or attribute identities to the users of the accounts.

The data retention period will provide law enforcement agencies with the necessary time to decide whether it is appropriate to request data in order to continue their investigations. I hope that explains the context of why we are doing this now and why these amendments are important ones to add to the Bill. I am very grateful for noble Lords’ support for them.

Online Safety Bill

Lord Knight of Weymouth Excerpts
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I will be extremely brief. We have come a very long way since the Joint Committee made its recommendations to the Government, largely, I think, as a result of the noble Baroness, Lady Kidron. I keep mistakenly calling her “Baroness Beeban”; familiarity breeds formality, or something.

I thank the Minister and the Secretary of State for what they have done, and the bereaved families for having identified these issues. My noble friend Lord Allan rightly identified the sentiments as grief and anger at what has transpired. All we can do is try, in a small way, to redress the harm that has already been done. I was really interested in his insights into how a platform will respond and how this will help them through the process of legal orders and data protection issues with a public authority.

My main question to the Minister is in that context—the relationship with the Information Commissioner’s Office—because there are issues here. There is, if you like, an overlap of jurisdiction with the ICO, because the potential or actual disclosure of personal data is involved, and therefore there will necessarily have to be co-operation between the ICO and Ofcom to ensure the most effective regulatory response. I do not know whether that has emerged on the Minister’s radar, but it certainly has emerged on the ICO’s radar. Indeed, in the ideal world, there probably should be some sort of consultation requirement on Ofcom to co-operate with the Information Commissioner in these circumstances. Anything that the Minister can say on that would be very helpful.

Again, this is all about reassurance. We must make sure that we have absolutely nailed down all the data protection issues involved in the very creative way the Government have responded to the requests of the bereaved families so notably championed by the noble Baroness, Lady Kidron.

Lord Knight of Weymouth (Lab)

My Lords, first, I associate myself with the excellent way in which the noble Baroness, Lady Harding, paid tribute to the work of the noble Baroness, Lady Kidron, on behalf of Bereaved Families for Online Safety, and with the comments she made about the Minister and the Secretary of State in getting us to this point, which were echoed by others.

I have attached my name, on behalf of the Opposition, to these amendments on the basis that, if they are good enough for the noble Baroness, Lady Kidron, they ought to be good enough for me. We should now get on with implementing them. I also hope to learn that the Minister has been liaising with the noble Baroness, Lady Newlove, to ensure that the amendments relating to coroners’ services, and the equivalent procurator fiscal service in Scotland, will satisfy her sense of what will work for victims. I am interested, also, in the answer to the question raised by the noble Baroness, Lady Kidron, regarding a requirement for senior managers to attend inquests. I liked what she had to say about the training for coroners being seen as media literacy and therefore fundable from the levy.

All that remains is for me to ask three quick questions to get the Minister’s position clear regarding the interpretation of the new Chapter 3A, “Deceased Child Users”. First, the chapter is clear that terms of service must clearly and easily set out the policy for dealing with the parents of a deceased child, and must provide a dedicated helpline and a complaints procedure. In subsection (2), does a helpline or similar—the “similar” being particularly important—mean that the provider must offer an accessible, responsive and interactive service? Does that need to be staffed by a human? I think it would be helpful for the Minister to confirm that it is his intention that it should be, so that parents are not fobbed off with a solely automated bot-type service.

Online Safety Bill

I support these amendments. I hope the Minister can give us some indication that we are all heading in the same direction as he is or that he is heading in the same direction as us. That would be enormously helpful.
Lord Knight of Weymouth (Lab)

My Lords, as we have heard, this is a small group of amendments concerned with preventing size and lack of capacity being used as a reasonable excuse for allowing children to be unsafe. Part of the problem is the complexity of the Bill and the way it has been put together.

For example, Clause 11, around user-to-user services, is the pertinent clause and it is headed “Safety duties protecting children”. Clause 11(2) is preceded in italics with the wording “All services” so anyone reading it would think that what follows applies to all user-to-user services regardless of size. Clause 11(3) imposes a duty on providers

“to operate a service using proportionate systems and processes”

to protect children from harm. That implies that there will be judgment around what different providers can be expected to do to protect children; for example, by not having to use a particular unaffordable technical solution on age assurance if they can show the right outcome by doing things differently. That starts to fudge things a little.

The noble Lord, Lord Bethell, who introduced this debate so well with Amendment 39, supported by my noble friend Lady Ritchie, wants to be really sure that the size of the provider can never be used to argue that preventing all children from accessing porn is disproportionate and that a few children slipping through the net might just be okay.

The clarity of Clause 11 unravels even further at the end of the clause, where in subsection (12)(b) it reads that

“the size and capacity of the provider of a service”

is relevant

“in determining what is proportionate”.

At that point the clause falls apart quite thoroughly; no one reading it can be clear about what is supposed to happen.

Amendment 43 seeks to take that paragraph out, as we have heard from the noble Lord, Lord Russell, and would do the same for search in Amendment 87. I have added my name to these amendments because I fear that the ambiguity in the wording of this clause will give small and niche platforms an easy get out from ensuring that children are safe by design.

I use the phrase “by design” deliberately. We need to make a choice with this Bill even at this late stage. Is the starting point in the Bill children’s safety by design? Or is the starting point one where we do not want to overly disrupt the way providers operate their business first—which is to an extent how the speech from the noble Lord, Lord Allan, may have been heard—and then overlay children’s safety on top of that?

Yesterday, I was reading about how children access inappropriate and pornographic content, not just on Twitter, Instagram, Snapchat, TikTok and Pinterest but on Spotify and “Grand Theft Auto”—the latter being a game with an age advisory of “over 17” but which is routinely played by teenaged children. Wherever we tolerate children being online, there are dangers which must be tackled. Listening to the noble Baroness, Lady Harding, took me to where a big chunk of my day job in education goes: children’s safeguarding. I regularly have to take training in safeguarding because of the governance responsibilities that I have. Individual childminders looking after one or two children have an assessment and an inspection of their safeguarding. In the real world we do not tolerate a lack of safety for children in this context. We should not tolerate it in the online world either.

The speech from the noble Lord, Lord Russell, reminded me of the breadcrumbing from big platforms into niche platforms that is part of the incel insight that he referenced. Content that is harmful to children can also be what some children are looking for, which keeps them engaged. Small, emergent services aggressively seeking growth could set algorithms accordingly. They must not be allowed to believe that harmful but engaging content is okay until they get to the size they need to be to afford the age-assurance technology which we might envisage in the Bill. I hope that the Minister shares our concerns and can help us with this problem.

Lord Parkinson of Whitley Bay (Con)

My Lords, short debates can be helpful and useful. I am grateful to noble Lords who have spoken on this group.

I will start with Amendment 39, tabled by my noble friend Lord Bethell. Under the new duty at Clause 11(3)(a), providers which allow pornography or other forms of primary priority content under their terms of service will need to use highly effective age verification or age estimation to prevent children encountering it where they identify such content on their service, regardless of their size or capacity. While the size and capacity of providers is included as part of a consideration of proportionality, this does not mean that smaller providers or those with less capacity can evade the strengthened new duty to protect children from online pornography. In response to the questions raised by the noble Baronesses, Lady Ritchie of Downpatrick and Lady Kidron, and others, no matter how much pornographic content is on a service, where providers do not prohibit this content they would still need to meet the strengthened duty to use age verification or age estimation.

Proportionality remains relevant for the purposes of providers in scope of the new duty at Clause 11(3)(a) only in terms of the age-verification or age-estimation measures that they choose to use. A smaller provider with less capacity may choose to go for a less costly but still highly effective measure. For instance, a smaller provider with less capacity might seek a third-party solution, whereas a larger provider with greater capacity might develop their own solution. Any measures that providers use will need to meet the new high bar of being “highly effective”. If a provider does not comply with the new duties and fails to use measures which are highly effective at correctly determining whether or not a particular user is a child, Ofcom can take tough enforcement action.

The other amendments in this group seek to remove references to the size and capacity of providers in provisions relating to proportionality. The principle of proportionate, risk-based regulation is fundamental to the Bill’s regulatory framework, and we consider that the Bill as drafted already strikes the correct balance. The Bill ultimately will regulate a large number of services, ranging from some of the biggest companies in the world to smaller, voluntary organisations, as we discussed in our earlier debate on exemptions for public interest services.

The provisions regarding size and capacity recognise that what it is proportionate to require of companies of various sizes and business models will be different. Removing this provision would risk setting a lowest common denominator standard which does not create incentives for larger technology companies to do more to protect their users than smaller organisations. For example, it would not be proportionate for a large multinational company which employs thousands of content moderators and which invests in significant safety technologies to argue that it is required to take only the same steps to comply as a smaller provider which might have only a handful of employees and a few thousand UK users.

While the size and capacity of providers is included as part of a consideration of proportionality, let me be clear that this does not mean that smaller providers or those with less capacity do not need to meet the child safety duties and other duties in the Bill, such as the illegal content safety duties. These duties set out clear requirements for providers. If providers do not meet these duties, they will face enforcement action.

I hope that is reassuring to my noble friend Lord Bethell and to the other noble Lords with amendments in this group. I urge my noble friend to withdraw his amendment.

Online Safety Bill

Moved by
230: Clause 146, page 128, line 35, leave out from “publish” to end of line 36 and insert “an interim report within the period of three months beginning with the day on which this section comes into force, and a final report within the period of two years beginning on the day on which the interim report is published.”
Member’s explanatory statement
This amendment seeks to accelerate the process relating to Ofcom’s report on researchers’ access to information. Instead of simply requiring a report within two years of Clause 146 being brought into force, this amendment would require an interim report within three months, with a final report to follow two years after that.
Lord Knight of Weymouth (Lab)

My Lords, my noble friend Lord Stevenson, who tabled this amendment, unfortunately cannot be with us today as he is off somewhere drinking sherry, I hope.

This is an important set of amendments about researchers’ access to data. As I have previously said to the Committee, we need to ensure that Ofcom has the opportunity to be as trusted as possible in doing its job, so that we can give it as much flexibility as we can, and so that it can deal with a rapidly changing environment. As I have also said on more than one occasion, in my mind, that trust is built by the independence of Ofcom from Secretary of State powers; the ongoing and post-legislative scrutiny of Parliament, which is not something that we can deal with in this Bill; and, finally, transparency—and this group of amendments goes to that very important issue.

The lead amendment in this group, Amendment 230 in my noble friend Lord Stevenson’s name, seeks to accelerate the process relating to Ofcom’s report on researchers’ access to information. Instead of simply requiring a report within two years of Clause 146 being brought into force, this amendment would require an interim report within three months with a final report to follow two years later. Although it is the lead amendment in the group, I do not think it is the most significant because, in the end, it does not do much about the fundamental problem that we want to deal with in this group, which is the need to do better than just having a report. We need to ensure that there really is access for independent researchers.

Amendments 233 and 234 are, I think, of more significance. These proposed new clauses would assist independent researchers in accessing information and data from providers of regulated services. Amendment 233 would allow Ofcom itself to appoint researchers to undertake a variety of research. Amendment 234 would require Ofcom to issue a code of practice on researchers’ access to data; again, this is important so that the practical and legal difficulties for both researchers and service providers can be overcome through negotiation and consultation by Ofcom. Amendment 233A from the noble Lord, Lord Allan, which I am sure he will speak to in a moment, is helpful in clarifying that no data protection breach would be incurred by allowing the research access.

In many ways, there is not a huge amount more to say. When Melanie Dawes, the head of Ofcom, appeared before the Joint Committee on 1 November 2021—all that time ago—she said that

“tightening up the requirement to work with external researchers would be a good thing in the Bill”.

It is therefore a disappointment that, when the Bill was finally published after the Joint Committee’s consideration of the draft, there was not something more significant and weighty than just a report. That is what we are trying to address, particularly now that we see, as an example, that Twitter is charging more than £30,000 a month for researchers’ access, which is a substantial rate for researchers trying to do their work in respect of that platform. Others are restricting or obscuring some of the information that people want to be able to see.

This is a vital set of measures if this Bill is to be effective. These amendments go a long way towards where we want to get to on this; for the reasons I have set out around ensuring that there is transparency, they are vital. We know from the work of Frances Haugen that the platforms themselves are doing this research. We need that out in the open, we need Ofcom to be able to see it through independent researchers and we need others to be able to see it so that Parliament and others can continue to hold these platforms to account. Given that the Minister is in such a positive mood, I look forward to his positive response.

The Deputy Chairman of Committees (Baroness Barker) (LD)

My Lords, I must advise the Committee that if Amendment 230 is agreed to then I cannot call Amendment 231 because of pre-emption.

--- Later in debate ---
In summary, the regulatory framework’s focus on transparency will improve the data which are publicly available to researchers, while Ofcom’s report on the issue will enable the development of the evidence base before further action is considered. At the risk of disappointing noble Lords about the more open-minded attitudes today—
Lord Knight of Weymouth (Lab)

Before the Minister succeeds in disappointing us, can he clarify something for us? Once Ofcom has published the report, it has the power to issue guidance. What requirement is there for platforms to abide by that guidance? We want there to be some teeth at the end of all this. There is a concern that a report will be issued, followed by some guidance, but that nothing much else will happen.

Lord Parkinson of Whitley Bay (Con)

It is guidance rather than direction, but it will be done openly and transparently. Users will be able to see the guidance which Ofcom has issued, to see whether companies have responded to it as they see fit and, through the rest of the framework of the Bill, be empowered to make their decisions about their experiences online. This being done openly and transparently, and informed by Ofcom’s research, will mean that everyone is better informed.

We are sympathetic to the amendment. It is complex, and this has been a useful debate—

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

With that, if there are no further questions, I invite the noble Lord to withdraw his amendment.

Lord Knight of Weymouth (Lab)

My Lords, this was a short but important debate with some interesting exchanges at the end. The noble Baroness, Lady Harding, mentioned the rapidly changing environment created by generative AI. That points to the need for independent, ecosystem-level research that is wider than we fear we might get as things stand, and certainly wider than the skilled persons we are already legislating for. The noble Lord, Lord Bethell, referred to the access that advertisers already have to insight. It seems a shame that we run the risk, as the noble Baroness, Lady Kidron, pointed out, of researchers in other jurisdictions having more privileged access than researchers in this country, leaving us dependent on those researchers and whistleblowers to give us that wider view. We could proceed with a report and guidance as set out in the Bill but add some reserved powers so that action can be taken if the report suggests that Ofcom might need and want it. The Minister may want to reflect on that, having listened to the debate. On that basis, I am happy to beg leave to withdraw the amendment.

Amendment 230 withdrawn.
--- Later in debate ---
Moved by
286ZA: After Clause 184, insert the following new Clause—
“Artificial intelligence: labelling of machine-generated content
Within the period of six months beginning with the day on which this Act is passed, the Secretary of State must publish draft legislation with provisions requiring providers of regulated services to put in place systems and processes for—
(a) identifying content on their service which is machine-generated, and
(b) informing users of the service that such content is machine-generated.”
Member’s explanatory statement
This probing amendment is to facilitate a discussion around the potential labelling of machine-generated content, which is a measure being considered in other jurisdictions.
Lord Knight of Weymouth (Lab)

My Lords, that was a bravura performance by the noble Lord, Lord Lexden. We thank him. To those listening in the Public Gallery, I should say that we debated most of those; it was not quite as on the nod as it looked.

Amendment 286ZA, in the name of my noble friend Lord Stevenson, seeks to address a critical issue in our digital landscape: the labelling of AI-generated content on social media platforms.

As we navigate the ever-evolving world of technology, it is crucial that we uphold transparency, safeguarding the principles of honesty and accountability. Social media has become an integral part of our lives, shaping public discourse, disseminating information and influencing public opinion. However, the rise of AI-powered algorithms and tools has given rise to a new challenge: an increasing amount of content generated by artificial intelligence without explicit disclosure.

We live in an age where AI is capable of creating incredibly realistic text, images and even videos that can be virtually indistinguishable from those generated by humans. While this advancement holds immense potential, it also raises concerns regarding authenticity, trust and the ethical implications of AI-generated content. The proposed amendment seeks to address this concern by advocating for a simple but powerful solution—labelling AI-generated content as such. By clearly distinguishing human-generated content from AI-generated content, we empower individuals to make informed decisions about the information they consume, promoting transparency and reducing the potential for misinformation or manipulation.

Labelling AI-generated content serves several crucial purposes. First and foremost, it allows individuals to differentiate between information created by humans and that generated by algorithms in an era where misinformation and deep fakes pose a significant threat to public trust. Such labelling becomes a vital tool to protect and promote digital literacy.

Secondly, it enables users to better understand the potential biases and limitations of AI-generated content. AI algorithms are trained on vast datasets, and without labelling, individuals might unknowingly attribute undue credibility to AI-generated information, assuming it to be wholly objective and reliable. Labelling, however, helps users to recognise the context and provides an opportunity for critical evaluation.

Furthermore, labelling AI-generated content encourages responsible behaviour from the platforms themselves. It incentivises social media companies to develop and implement AI technologies with integrity and transparency, ensuring that users are aware of the presence and influence of AI in their online experiences.

Some may argue that labelling AI-generated content is an unnecessary burden or that it could stifle innovation. However, the intention behind this amendment is not to impede progress but to foster a healthier digital ecosystem built on trust, integrity and informed decision-making. By promoting transparency, we can strike a balance that allows innovation to flourish while safeguarding the interests of individuals and society as a whole.

In conclusion, the amendment to label AI-generated content on social media platforms represents a crucial step forward in addressing the challenges of the digital age. By embracing transparency and empowering individuals, we can foster a more informed and discerning society. Let us lead by example and advocate for a digital landscape that values accountability, integrity and the rights of individuals. I urge your Lordships to support this amendment as we strive to build a future where technology works hand-in-hand with humanity for the betterment of all.

In the spirit of the amendment, I must flag that my entire speaking note was generated by AI, as the noble Lord, Lord Allan, from his expression, had clearly guessed. In using this tool, I do so not to belittle the amendment but to illustrate that these tools are already infiltrating everyday life and can supercharge misinformation. We need to do something to help internet users trust what they read.

Lord Clement-Jones (LD)

Does the noble Lord agree that the fact that we did not notice his speech was generated by AI somewhat damages his argument?

Lord Knight of Weymouth (Lab)

The fact that I labelled it as being AI-generated helped your Lordships to understand, and the transparency eases the debate. I beg to move.

Baroness Kidron (CB)

My Lords, I thank the noble Lord, Lord Knight, for laying out the amendment and recognise that there was a very thoughtful debate on the subject of machine-generated content on Amendment 125 in my name on a previous day of Committee.

I appreciate that the concept of labelling or watermarking machine-generated material is central to recent EU legislation, but I am equally aware that there is more than one school of thought on the efficacy of that approach among AI experts. On the one hand, as the noble Lord, Lord Knight, beautifully set out—with the help of his artificial friend—there are those who believe that visibly marking the division of real and altered material is a clue for the public to look more carefully at what they are seeing and that labelling it might provide an opportunity for both creators and digital companies to give greater weight to “human-created material”. For example, it could be that the new BBC Verify brand is given greater validity by the public, or that Google’s search results promote it above material labelled as machine-generated as a more authentic source. There are others who feel that the scale of machine-generated material will be so vast that this labelling will be impossible or that labelling will downgrade the value of very important machine-generated material in the public imagination, when in the very near future it is likely that most human activity will be a blend of generated material and human interaction.

I spent the first part of this week locked in a room with others at the Institute for Ethics in AI in Oxford debating some of these issues. While this is a very live discussion, one thing is clear: if we are to learn from history, we must act now before all is certain, and we should act with pragmatism and a level of humility. It may be that either or both sets of experts are correct.

Industry has clearly indicated that there is an AI arms race, and many companies are launching services that they do not understand the implications of. This is not my view but one told to me by a company leader, who said that the speed of distribution was so great that the testing was confined to whether deploying large language models crashed the platforms; there was no testing for safety.

The noble Lord, Lord Stevenson, says in his explanatory statement that this is a probing amendment. I therefore ask the Minister whether we might meet before Report and look once again at the gaps that might be covered by some combination of Amendment 125 and the amendment in front of us, to make certain that the Bill adequately reflects the concerns raised by the enforcement community and reflects the advice of those who best understand the latest iterations of the digital world.

The Communications Act 2003 made a horrible mistake in not incorporating digital within it; let us not do the same here. Adding explicit safety duties to AI and machine learning would not slow down innovation but would ensure that innovation is not short-sighted and dangerous for humanity. It is a small amendment for what may turn out to be an unimaginably important purpose.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

I am sure that the noble Lord, Lord Stevenson of Balmacara, is smiling over a sherry somewhere about the debate he has facilitated. His is a useful probing amendment and we have had a useful discussion.

The Government certainly recognise the potential challenges posed by artificial intelligence and digitally manipulated content such as deepfakes. As we have heard in previous debates, the Bill ensures that machine-generated content on user-to-user services created by automated tools or machine bots will be regulated where appropriate. Clause 49(4)(b) means that machine-generated content is regulated unless the bot or automated tool producing the content is controlled by the provider of the service.

The labelling of this content via draft legislation is not something to which I can commit today. The Government’s AI regulation White Paper sets out the principles for the responsible development of artificial intelligence in the UK. These principles, such as safety, transparency and accountability, are at the heart of our approach to ensuring the responsible development and use of AI. As set out in the White Paper, we are building an agile approach that is designed to be adaptable in response to emerging developments. We do not wish to introduce a rigid, inflexible form of legislation for what is a flexible and fast-moving technology.

The public consultation on these proposals closed yesterday so I cannot pre-empt our response to it. The Government’s response will provide an update. I am joined on the Front Bench by the Minister for Artificial Intelligence and Intellectual Property, who is happy to meet with the noble Baroness, Lady Kidron, and others before the next stage of the Bill if they wish.

Beyond labelling such content, I can say a bit to make it clear how the Bill will address the risks coming from machine-generated content. The Bill already deals with many of the most serious and illegal forms of manipulated media, including deepfakes, when they fall within scope of services’ safety duties regarding illegal content or content that is potentially harmful to children. Ofcom will recommend measures in its code of practice to tackle such content, which could include labelling where appropriate. In addition, the intimate image abuse amendments that the Government will bring forward will make it a criminal offence to send deepfake images.

In addition to ensuring that companies take action to keep users safe online, we are taking steps to empower users with the skills they need to make safer choices through our work on media literacy. Ofcom, for example, has an ambitious programme of work through which it is funding several initiatives to build people’s resilience to harm online, including initiatives designed to equip people with the skills to identify disinformation. We are keen to continue our discussions with noble Lords on media literacy and will keep an open mind on how it might be a tool for raising awareness of the threats of disinformation and inauthentic content.

With gratitude to the noble Lords, Lord Stevenson and Lord Knight, and everyone else, I hope that the noble Lord, Lord Knight, will be content to withdraw his noble friend’s amendment.

Lord Knight of Weymouth (Lab)

My Lords, I am grateful to everyone for that interesting and quick debate. It is occasionally one’s lot that somebody else tables an amendment but is unavoidably detained in Jerez, drinking sherry, and monitoring things in Hansard while I move the amendment. I am perhaps more persuaded than my noble friend might have been by the arguments that have been made.

We will return to this in other fora in response to the need to regulate AI. However, in the meantime, I enjoyed in particular the John Booth quote from the noble Baroness, Lady Bennett. In respect of this Bill and any of the potential harms around generative AI, if we have a Minister who is mindful of the need for safety by design when we have concluded this Bill then we will have dealt with the bits that we needed to deal with as far as this Bill is concerned.

Lord Allan of Hallam (LD)

Can the noble Lord confirm whether he generated those comments himself, or was he on his phone while we were speaking?

Lord Knight of Weymouth (Lab)

I do not have an invisible earpiece feeding me my lines—that was all human-generated. I beg leave to withdraw the amendment.

Amendment 286ZA withdrawn.
--- Later in debate ---
Lord Clement-Jones (LD)

My Lords, I associate these Benches with the introduction by the noble Baroness, Lady Kidron, support her amendments and, likewise, hope that they form part of the package that is trundling on its way towards us.

Lord Knight of Weymouth (Lab)

My Lords, what more can I say than that I wish to be associated with the comments made by the noble Baroness and then by the noble Lord, Lord Clement-Jones? I look forward to the Minister’s reply.

Lord Parkinson of Whitley Bay (Con)

I am very grateful to the noble Baroness for her amendment, which is a useful opportunity for us to state publicly and share with the Committee the progress we have been making in our helpful discussions on these issues in relation to these amendments. I am very grateful to her and to my noble friends Lord Bethell and Lady Harding for speaking as one on this, including, as is well illustrated, in this short debate this evening.

As the noble Baroness knows, discussions continue on the precise wording of these definitions. I share her optimism that we will be able to reach agreement on a suitable way forward, and I look forward to working with her, my noble friends and others as we do so.

The Bill already includes a definition of age assurance in Clause 207, which is

“measures designed to estimate or verify the age or age-range of users of a service”.

As we look at these issues, we want to avoid using words such as “checking”, which suggests that providers need to take a proactive approach to checking age, as that may inadvertently preclude the use of technologies which determine age through other means, such as profiling. It is also important that any definition of age assurance does not restrict the current and future use of innovative and accurate technologies. I agree that it is important that there should be robust definitions for terms which are not currently defined in the Bill, such as age verification, and recommit to the discussions we continue to have on what terms need to be defined and the best way to define them.

This has been a very helpful short debate with which to end our deliberations in Committee. I am very grateful to noble Lords for all the points that have been raised over the past 10 days, and I am very glad to be ending in this collaborative spirit. There is much for us still to do, and even more for the Office of the Parliamentary Counsel to do, before we return on Report, and I am grateful to it and to the officials working on the Bill. I urge the noble Baroness to withdraw her amendment.

Online Safety Bill

There have also been questions about the training for coroners and about approaching the US Government, which is an even larger dimension than anything I have mentioned so far. I very much look forward to hearing what the Minister has to say and hope that we will have achieved the goal that so many families want us to achieve.
Lord Knight of Weymouth (Lab)

My Lords, I am all that is left between us and hearing from the Minister with his good news, so I will constrain my comments accordingly.

The noble Baroness, Lady Kidron, began by paying tribute to the parents of Olly, Breck, Molly, Frankie and Sophie. I very much join her in doing that; to continually have to come to this place and share their trauma and experience comes at a great emotional cost. We are all very grateful to them for doing it and for continuing to inform and motivate us in trying to do the right thing. I am grateful to my noble friend Lady Healy and in particular to the noble Baroness, Lady Newlove, for amplifying that voice and talking about the lost opportunity, to an extent, of our failure to find a way of imposing a general duty of care on the platforms, as was the original intention when the noble Baroness, Lady Morgan, was the Secretary of State.

I also pay a big tribute to the noble Baroness, Lady Kidron. She has done the whole House, the country and the world a huge service in her campaigning around this and in her influence on Governments—not just this one—on these issues. We would not be here without her tireless efforts, and it is important that we acknowledge that.

We need to ensure that coroners can access the information they need to do their job, and to have proper sanctions available to them when they are frustrated in being able to do it. This issue is not without complication, and I very much welcome the Government’s engagement in trying to find a way through it. I too look forward to the good news that has been trailed; I hope that the Minister will be able to live up to his billing. Like the noble Baroness, Lady Harding, I would love to see him embrace, at the appropriate time, the “safety by design” amendments and some others that could complete this picture. I also look forward to his answers on issues such as data preservation, which the noble Lord, Lord Allan, covered among the many other things in his typically fine speech.

I very much agree that we should have a helpline and do more about that. Some years ago, when my brother-in-law sadly died in his 30s, it fell to me to try to sort out his social media accounts. I was perplexed that the only way I could do it was by fax to these technology companies in California. That was very odd, so to have proper support for bereaved families going through their own grief at that moment seems highly appropriate.

As we have discussed in the debates on the Bill, a digital footprint is an asset that is exploited by these companies. But it is an asset that should be regarded as part of one’s estate that can be bequeathed to one’s family; then some of these issues would perhaps be lessened. On that basis, and in welcoming a really strong and moving debate, I look forward to the Minister’s comments.

The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, this has been a strong and moving debate, and I am grateful to the noble Baroness, Lady Kidron, for bringing forward these amendments and for the way she began it. I also echo the thanks that the noble Baroness and others have given to the families of Breck Bednar, Sophie Parkinson, Molly Russell, Olly Stephens, Frankie Thomas and all the young people whose names she rightly held in remembrance at the beginning of this debate. There are too many others who find themselves in the same position. The noble Lord, Lord Knight, is right to pay tribute to their tirelessness in campaigning, given the emotional toll that we know it has on them. I know that they have followed the sometimes arcane processes of legislation and, as my noble friend Lady Morgan said, we all look forward to the Bill becoming an Act of Parliament so that it can make a difference to families who we wish to spare from the heartache they have had.

Every death is sorrowful, but the death of a child is especially heartbreaking. The Government take the issues of access to information relating to a deceased child very seriously. We have undertaken extensive work across government and beyond to understand the problems that parents, and coroners who are required to investigate such deaths, have faced in the past in order to bring forward appropriate solutions. I am pleased to say that, as a result of that work, and thanks to the tireless campaigning of the noble Baroness, Lady Kidron, and our discussions with those who, very sadly, have first-hand experience of these problems, we will bring forward a package of measures on Report to address the issues that parents and coroners have faced. Our amendments have been devised in close consultation with the noble Baroness and bereaved families. I hope the measures will rise to the expectations they rightly have and that they will receive their support.

The package of amendments will ensure that coroners have access to the expertise and information they need to conduct their investigations, including information held by technology companies, regardless of size, and overseas services such as Wattpad, mentioned by the noble Baroness, Lady Healy of Primrose Hill, in her contribution. This includes information about how a child interacted with specific content online as well as the role of wider systems and processes, such as algorithms, in promoting it. The amendments we bring forward will also help to ensure that the process for accessing data is more straightforward and humane. The largest companies must ensure that they are transparent with parents about their options for accessing data and respond swiftly to their requests. We must ensure that companies cannot stonewall parents who have lost a child and that those parents are treated with the humanity and compassion they deserve.

I take the point that the noble Baroness, Lady Kidron, rightly makes: small does not mean safe. All platforms will be required to comply with Ofcom’s requests for information about a deceased child’s online activity. That will be backed by Ofcom’s existing enforcement powers, so that where a company refuses to provide information without a valid excuse it may be subject to enforcement action, including sanctions on senior managers. Ofcom will also be able to produce reports for coroners following a Schedule 5 request on matters relevant to an investigation or inquest. This could include information about a company’s systems and processes, including how algorithms have promoted specific content to a child. This too applies to platforms of any size and will ensure that coroners are provided with information and expertise to assist them in understanding social media.

Where this Bill cannot solve an issue, we are exploring alternative avenues for improving outcomes as well. For example, the Chief Coroner has committed to consider issuing non-legislative guidance and training for coroners about social media, with the offer of consultation with experts.

Moved by
160A: Clause 68, page 62, line 23, leave out paragraph (d) and insert—
“(d) be made publicly available, subject to appropriate redactions, on the date specified in the notice.”
Member’s explanatory statement
This amendment would make clear that transparency reports produced under Clause 68 must be made publicly available, subject to appropriate redactions.
Lord Knight of Weymouth (Lab)

My Lords, as we have said many times, this is a complex Bill. As we reflect on the priorities for Report, we can be more relaxed about some of the specifics on how Ofcom may operate, thereby giving it more flexibility—the flexibility it needs to be agile in the online world—if we as a Parliament trust Ofcom. Building trust, I believe, is a triangulation. First, there is independence from government—as discussed in respect of Secretary of State powers. Secondly, we need proper scrutiny by Parliament. Earlier today I talked about my desire for there to be proper post-legislative scrutiny and a permanent Joint Committee to do that. The third leg of the stool is the transparency to assist that scrutiny.

Clause 68 contains the provisions which would require category 1, 2A and 2B services to produce an annual transparency report containing information described by Ofcom in a notice given to the service. Under these provisions, Ofcom would be able to require these services to report on, among other things: information about the incidence of illegal content and content that is harmful to children; how many users are assumed to have encountered this content by means of the service; the steps and processes for users to report this content; and the steps and processes which a provider uses for dealing with this content.

We welcome the introduction of transparency reporting in relation to illegal content and content that is harmful to children. We agree with the Government that effective transparency reporting plays a crucial role in building Ofcom’s understanding of online harms and empowering users to make a more informed choice about the services they use.

However, despite the inclusion of transparency reporting in the Bill representing a step in the right direction, we consider that these requirements could and should be strengthened to do the trust building we think is important. First, the Bill should make clear that, subject to appropriate redactions, companies will be required to make their transparency reports publicly available—to make them transparent—hence Amendment 160A.

Although it is not clear from the Bill whether companies will be required to make these reports publicly available, we consider that, in most instances, such a requirement would be appropriate. As noted, one of the stated purposes of transparency reporting is that it would enable service users to make more informed choices about their own and their children’s internet use—but they can only do so if the reports are published. Moreover, in so far as transparency reporting would facilitate public accountability, it could also act as a powerful incentive for service providers to do more to protect their users.

We also recognise that requiring companies to publish the incidences of CSEA content on their platforms, for instance, may have the effect of encouraging individuals seeking such material towards platforms on which there are high incidences of that content—that must be avoided. I recognise that simply having a high incidence of CSEA content on a platform does not necessarily mean that that platform is problematic; it could just mean that it is better at reporting it. So, as ever with the Bill, there is a balance to be struck.

Therefore, we consider that the Bill should make it explicit that, once provided to Ofcom, transparency reports are to be made publicly available, subject to redactions. To support this, Ofcom should be required to produce guidance on the publication of transparency reports and the redactions that companies should make before making reports publicly accessible. Ofcom should also retain the power to stop a company from publishing a particular transparency report if it considers that the risk of directing individuals to illegal materials outweighs the benefit of making a report public—hence Amendments 160B and 181A.

Amendments 165 and 229 are in my noble friend Lord Stevenson’s name. Amendment 165 would broaden the transparency requirements around user-to-user services’ terms of service, ensuring that information can be sought on the scope of these terms, not just their application. As I understand it, scope is important to understand, as it is significant in informing Ofcom’s regulatory approach. We are trying to guard against minimal terms of service where detail is needed for users and Ofcom.

The proposed clause in Amendment 229 probes how Ofcom will review the effectiveness of the transparency requirements in the Bill. It would require Ofcom to undertake a review of the effectiveness of transparency reports within three years and every five years thereafter, and it would give the Secretary of State powers to implement any recommendations made by the regulator. The Committee should note that we also include a requirement that a Select Committee, charged by the relevant House, must consider and report on the regulations, with an opportunity for Parliament to debate them. So we link the three corners of the triangle rather neatly there.

If we agree that transparency is an important part of building trust in Ofcom in doing this difficult and innovative regulatory job—it is always good to see the noble Lord, Lord Grade, in his place; I know he is looking forward to getting on with this—then this proposed clause is sensible. I beg to move.

Lord Allan of Hallam (LD)

My Lords, I am pleased that the noble Lord, Lord Knight of Weymouth, has given us an opportunity to talk about transparency reports with these amendments, which are potentially a helpful addition to the Bill. Transparency is one of the huge benefits that the legislation may bring. One of the concerns that the public have and that politicians have always had with online platforms is that they appear to be a black box—you cannot see what is going on in them.

In the entire edifice that we are constructing in the Online Safety Bill, there are huge opportunities to change that. The platforms will have to do risk assessments —there are measures in the Bill to make sure that information about these is put out—and they will have to take active steps to mitigate any risks they find. Again, we may get directions and guidance from Ofcom that will explain to the public exactly what is expected of them. The final piece of the jigsaw is the transparency reports that show the outcomes—how a platform has performed and what it has done to meet its obligations in dealing with content and behaviour on its services.

For the record, I previously worked for one of the platforms, and I would have said that I was on the pro-transparency wing of the transparency party inside the company. I believed that it was in the platform’s interest: if you do not tell people what you are doing, they will make things up about you, and what they make up will generally be worse than what you are actually doing. So there are huge advantages to the platforms from being transparent.

The noble Lord, Lord Knight, has picked up on some important points in his Amendment 160B, which talks about making sure that the transparency report is not counterproductive by giving the bad guys information that they could use to ill effect. That is a valid point; it is often debated inside the platforms. Sometimes, I argued furiously with my colleagues in the platforms about why we should disclose information. They would ask, “What about the bad guys?” Sometimes I challenged that, but other times it would have been a genuine and accurate concern. The noble Lord mentioned things such as child sexual abuse material, and we have to recognise that the bad guys are incredibly devious and creative, and if you show them anything that they can use against you to get around your systems, they will try to do that. That is a genuine and valid concern.

The sort of thing that you might put into a transparency report is, for example, whether you have banned particular organisations. I would be in favour of indicating to the public that an organisation is banned, but you can see that the potential impact of that is that all the people you are concerned about would create another organisation with a different name and then get back on to your platform. We need to be alive to those kinds of concerns.

It is also relevant to Amendment 165 and the terms of service that the more granular and detailed your terms of service are, the better they are for public information, but there are opportunities to get around them. Again, we would have that argument internally. I would say, “If we are prohibiting specific hate speech terms, tell people that, and then they won’t use them”. For me, that would be a success, as they are not using those hate speech terms anymore, but, of course, they may then find alternative hate speech terms that they can use instead. You are facing that battle all the time. That is a genuine concern that I hope we will be able to debate. I hope that Ofcom will be able to mitigate that risk by discussing with platforms what these transparency reports should look like. In a sense, we are doing a risk assessment of the transparency report process.

Amendment 229 on effectiveness is really interesting. My experience was that if you did not have a transparency report, you were under huge pressure to produce one and that once you produced one, nobody was interested. For fear of embarrassing anyone in the Committee, I would be curious to know how many noble Lords participating in this debate have read the transparency reports already produced by Meta Platforms, Google and others. If they have not read them, they should not be embarrassed, because my experience was that I would talk to regulators and politicians about something they had asked me to come in to talk about, such as hate speech or child sexual abuse material, and I learned to print off the transparency report. I would go in and say, “Well, you know what we are doing; it’s in our transparency report”. They would ask, “What transparency report?”, and I would have to show them. So, having produced a transparency report, every time we published it, we would expect there to be public interest, but little use was made of it. That is not a reason not to do them—as I said, I am very much in favour of doing them—but, on their own, they may not be effective, and Amendment 229 touches on that.

I was trying to think of a collective noun for transparency reports and, seeing as they shed light, I think it may be a “chandelier”. Where we may get the real benefit is if Ofcom can produce a chandelier of transparency reports, taking all the information it gets from the different platforms, processing it and selecting the most relevant information—the reports are often too long for people to work their way through—so that it can enable comparisons. That is really good and it is quite good for the industry that people know that platform A did this, platform B did that, and platform C did something else. They will take note of that, compare with each other and want to get into the best category. It is also critical that Ofcom puts this into user-friendly language, and Ofcom has quite a good record of producing intelligible reports. In the context of Amendment 229, a review process is good. One of the things that might come out of that, thinking ahead, would be Ofcom’s role in producing meta transparency reports, the chandelier that will shed light on what the whole sector is doing.

--- Later in debate ---
Viscount Camrose (Con)

I think I probably would agree, but I would welcome a chance to discuss it further.

Finally, Amendment 229 intends to probe how Ofcom will review the effectiveness of transparency requirements in the Bill. It would require Ofcom to produce reports reviewing the effectiveness of transparency reports and would give the Secretary of State powers to implement any recommendations made by the regulator. While I of course agree with the sentiment of this amendment, as I have outlined, the transparency reporting power is designed to ensure that Ofcom can continuously review the effectiveness of transparency reports and make adjustments as necessary. This is why the Bill requires Ofcom to set out in annual transparency notices what each provider should include in its reports and the format and manner in which it should be presented, rather than putting prescriptive or static requirements in the Bill. That means that Ofcom will be able to learn, year on year, what will be most effective.

Under Clause 145, Ofcom is required to produce its own annual transparency report, which must include a summary of conclusions drawn from providers’ transparency reports, along with the regulator’s view on industry best practice and other appropriate information—I hope and think that goes to some of the points raised by the noble Lord, Lord Allan of Hallam.

Lord Knight of Weymouth (Lab)

My Lords, just before the Minister moves on—and possibly to save me finding and reading it—can he let us know whether those annual reports by Ofcom will be laid before Parliament and whether Parliament will have a chance to debate them?

Viscount Camrose (Con)

I believe so, but I will have to confirm that in writing. I am sorry not to be able to give a rapid answer.

Clause 159 requires the Secretary of State to review in total the operation of the regulatory framework to ensure it is effective. In that review, Ofcom will be a statutory consultee. The review will specifically require an assessment of the effectiveness of the regulatory framework in ensuring that the systems and processes used by services provide transparency and accountability to users.

The Bill will create what we are all after, which is a new culture of transparency and accountability in the tech sector. For the reasons I have laid out, we are confident that the existing provisions are sufficiently broad and robust to provide that. As such, I hope the noble Lord feels sufficiently reassured to withdraw the amendment.

Lord Knight of Weymouth (Lab)

My Lords, that was a good, quick debate and an opportunity for the noble Viscount to put some things on the record, and explain some others, which is helpful. It is always good to get endorsement around what we are doing from both the noble Lord, Lord Allan, and the noble Baroness, Lady Fox. That is a great spread of opinion. I loved the sense of the challenge as to whether anyone ever reads the transparency reports whenever they are published; I imagine AI will be reading and summarising them, and making sure they are not written as gobbledygook.

On the basis of what we have heard and if we can get some reassurance that strong transparency is accompanied by strong parliamentary scrutiny, then I am happy to withdraw the amendment.

Amendment 160A withdrawn.
Lord Allan of Hallam (LD)

I am grateful for that intervention as well. That summarises the core questions that we have for the Minister. Of the three areas that we have for him, the first is the question of scope and the extent to which he can assure us that the Bill as drafted will be robust in covering the metaverse and bots, which are the issues that have been raised today. The second is on behaviours and relates to the two interventions that we have just had. We have been asking whether the criminality of behaviours that are criminal today will stretch to new, similar forms of behaviour taking place in new environments—let us put it that way. The behaviour, the intent and the harm are the same, but the environment is different. We want to understand the extent to which the Government are thinking about that, where that thinking is happening and how confident they are that they can deal with it.

Finally, on the question of agency, how do the Government expect to deal with the fact that we will have machines operating in a user-to-user environment when the connection between the machine and another individual user is qualitatively different from anything that we have seen before? Those are just some small questions for the Minister on this Thursday afternoon.

Lord Knight of Weymouth (Lab)

My Lords, the debate on this group has been a little longer, deeper and more important than I had anticipated. It requires all of us to reflect before Report on some of the implications of the things we have been talking about. It was introduced masterfully by the noble Baroness, Lady Harding, and her comments—and those from the noble Baronesses, Lady Finlay and Lady Berridge—were difficult to listen to at times. I also congratulate the Government Whip on the way he handled the situation so that innocent ears were not subject to some of that difficult listening. But the questions around the implications of virtual reality, augmented reality and haptic technology are really important, and I hope the Minister will agree to meet with the noble Baroness, Lady Berridge, and the people she referenced to reflect on some of that.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

I am happy to reassure my noble friend that the director of the Dawes Centre for Future Crime sits on the Home Office’s Science Advisory Council, whose work is very usefully fed into the work being done at the Home Office. Colleagues at the Ministry of Justice keep criminal law under constant review, in light of research by such bodies and what we see in the courts and society. I hope that reassures my noble friend that the points she raised, which are covered by organisations such as the Dawes Centre, are very much in the mind of government.

The noble Lord, Lord Allan of Hallam, explained very effectively the nuances of how behaviour translates to the virtual world. He is right that we will need to keep both offences and the framework under review. My noble friend Lady Berridge asked a good and clear question, to which I am afraid I do not have a similarly concise answer. I can reassure her that generated child sexual abuse and exploitation material is certainly illegal, but she asked about sexual harassment via a haptic suit; that would depend on the specific circumstances. I hope she will allow me to respond in writing, at greater length and more helpfully, to the very good question she asked.

Under Clause 56, Ofcom will also be required to undertake periodic reviews into the incidence and severity of content that is harmful to children on the in-scope services, and to recommend to the Secretary of State any appropriate changes to regulations based on its findings. Clause 141 also requires Ofcom to carry out research into users’ experiences of regulated services, which will likely include experiences of services such as the metaverse and other online spaces that allow user interaction. Under Clause 147, Ofcom may also publish reports on other online safety matters.

The questions posed by the noble Lord, Lord Russell of Liverpool, about international engagement are best addressed in a group covering regulatory co-operation, which I hope we will reach later today. I can tell him that we have introduced a new information-sharing gateway for the purpose of sharing information with overseas regulators, to ensure that Ofcom can collaborate effectively with its international counterparts. That builds on existing arrangements for sharing information that underpin Ofcom’s existing regulatory regimes.

The amendments tabled by the noble Lord, Lord Knight of Weymouth, relate to providers’ judgments about when content produced by bots is illegal content, or a fraudulent advertisement, under the Bill. Clause 170 sets out that providers will need to take into account all reasonably available relevant information about content when making a judgment about its illegality. As we discussed in the group about illegal content, providers will need to treat content as illegal when this information gives reasonable grounds for inferring that an offence was committed. Content produced by bots is in scope of providers’ duties under the Bill. This includes the illegal content duties, and the same principles for assessing illegal content will apply to bot-produced content. Rather than drawing inferences about the conduct and intent of the user who generated the content, the Bill specifies that providers should consider the conduct and the intent of the person who can be assumed to have controlled the bot at the point it created the content in question.

The noble Lord’s amendment would set out that providers could make judgments about whether bot-produced content is illegal, either by reference to the conduct or mental state of the person who owns the bot or, alternatively, by reference to the person who controls it. As he set out in his explanatory statement and outlined in his speech, I understand he has brought this forward because he is concerned that providers will sometimes not be able to identify the controller of a bot, and that this will impede providers’ duties to take action against illegal content produced by them. Even when the provider does not know the identity of the person controlling the bot, however, in many cases there will still be evidence from which providers can draw inferences about the conduct and intent of that person, so we are satisfied that the current drafting of the Bill ensures that providers will be able to make a judgment on illegality.

Lord Knight of Weymouth (Lab)

My concern is also whether or not the bot is out of control. Can the Minister clarify that issue?

Lord Parkinson of Whitley Bay (Con)

It depends on what the noble Lord means by “out of control” and what content the bot is producing. If he does not mind, this may be an issue which we should go through in technical detail and have a more free-flowing conversation with examples that we can work through.