Lord Parkinson of Whitley Bay debates involving the Department for Digital, Culture, Media & Sport during the 2019 Parliament

Tue 2nd May 2023: Online Safety Bill, Lords Chamber, Committee stage: Part 2
Tue 2nd May 2023: Online Safety Bill, Lords Chamber, Committee stage: Part 1
Thu 27th Apr 2023: Online Safety Bill, Lords Chamber, Committee stage: Part 1
Thu 27th Apr 2023: Online Safety Bill, Lords Chamber, Committee stage: Part 2
Tue 25th Apr 2023: Online Safety Bill, Lords Chamber, Committee stage: Part 2
Tue 25th Apr 2023: Online Safety Bill, Lords Chamber, Committee stage: Part 1
Wed 19th Apr 2023: Online Safety Bill, Lords Chamber, Committee stage

Lord Knight of Weymouth (Lab)

Yes; the noble Baroness is right. She has pointed out in other discussions I have been party to that, for example, gaming technology that looks at the movement of the player can quite accurately work out from their musculoskeletal behaviour, I assume, the age of the gamer. So there are alternative methods. Our challenge is to ensure that if they are to be used, we will get the equivalent of age verification or better. I now hand over to the Minister.

The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, I think those last two comments were what are known in court as leading questions.

As the noble Baroness, Lady Ritchie of Downpatrick, said herself, some of the ground covered in this short debate was covered in previous groups, and I am conscious that we have a later grouping where we will cover it again, including some of the points that were made just now. I therefore hope that noble Lords will understand if I restrict myself at this point to Amendments 29, 83 and 103, tabled by the noble Baroness, Lady Ritchie.

These amendments seek to mandate age verification for pornographic content on a user-to-user or search service, regardless of the size and capacity of a service provider. The amendments also seek to remove the requirement on Ofcom to have regard to proportionality and technical feasibility when setting out measures for providers on pornographic content in codes of practice. While keeping children safe online is the top priority for the Online Safety Bill, the principle of proportionate, risk-based regulation is also fundamental to the Bill’s framework. It is the Government’s considered opinion that the Bill as drafted already strikes the correct balance between these two.

The provisions in the Bill on proportionality are important to ensure that the requirements in the child safety duties are tailored to the size and capacity of providers. It is also essential that measures in codes of practice are technically feasible. This will ensure that the regulatory framework as a whole is workable for service providers and enforceable by Ofcom. I reassure your Lordships that smaller providers, or providers with less capacity, are still required to meet the child safety duties where their services pose a risk to children. They will need to put in place sufficiently stringent systems and processes that reflect the level of risk on their services, and will need to make sure that these systems and processes achieve the required outcomes of the child safety duty. Wherever in the Bill they are regulated, companies will need to take steps to ensure that they cannot offer pornographic content online to those who should not see it. Ofcom will set out in its code of practice the steps that companies in the scope of Part 3 can take to comply with their duties under the Bill, and will take a robust approach to sites that pose the greatest risk of harm to children, including sites hosting online pornography.

The passage of the Bill should be taken as a clear message to providers that they need to begin preparing for regulation now—indeed, many are. Responsible providers should already be factoring in regulatory compliance as part of their business costs. Ofcom will continue to work with providers to ensure that the transition to the new regulatory framework will be as smooth as possible.

The Government expect companies to use age-verification technologies to prevent children accessing services that pose the highest risk of harm to children, such as online pornography. The Bill will not mandate that companies use specific technologies to comply with new duties because, as noble Lords have heard me say before, what is most effective in preventing children accessing pornography today might not be equally effective in future. In addition, age verification might not always be the most appropriate or effective approach for user-to-user companies to comply with their duties. For instance, if a user-to-user service, such as a particular social medium, does not allow pornography under its terms of service, measures such as strengthening content moderation and user reporting would be more appropriate and effective for protecting children than age verification. This would allow content to be better detected and taken down, instead of restricting children from seeing content which is not allowed on the service in the first place. Companies may also use another approach if it is proportionate to the findings of the child safety risk assessment and a provider’s size and capacity. This is an important element to ensure that the regulatory framework remains risk-based and proportionate.

In addition, the amendments in the name of the noble Baroness, Lady Ritchie, risk inadvertently shutting children out of large swathes of the internet that are entirely appropriate for them to access. This is because it is impossible totally to eliminate the risk that a single piece of pornography or pornographic material might momentarily appear on a site, even if that site prohibits it and has effective systems in place to prevent it appearing. Her amendments would have the effect of essentially requiring every service to block children through the use of age verification.

Those are the reasons why the amendments before us are not ones that we can accept. Mindful of the fact that we will return to these issues in a future group, I invite the noble Baroness to withdraw her amendment.

Baroness Ritchie of Downpatrick (Lab)

My Lords, I thank all noble Lords who have participated in this wide-ranging debate, in which various issues have been raised.

The noble Baroness, Lady Benjamin, made the good point that there needs to be a level playing field between Parts 3 and 5, which I originally raised and which other noble Lords raised on Tuesday of last week. We keep coming back to this point, so I hope that the Minister will take note of it on further reflection before we reach Report. Pornography needs to be regulated on a consistent basis across the Bill.

--- Later in debate ---
Baroness Stowell of Beeston (Con)

My Lords, I wonder whether I can make a brief intervention—I am sorry to do so after the noble Lord, Lord Clement-Jones, but I want to intervene before my noble friend the Minister stands up, unless the Labour Benches are about to speak.

I have been pondering this debate and have had a couple of thoughts. Listening to the noble Lord, Lord Clement-Jones, I am reminded of something which was always very much a guiding light for me when I chaired the Charity Commission, and therefore working in a regulatory space: regulation is never an end in itself; you regulate for a reason.

I was struck by the first debate we had on day one of Committee about the purpose of the Bill. If noble Lords recall, I said in that debate that, for me, the Bill at its heart was about enhancing the accountability of the platforms and the social media businesses. I felt that the contribution from my noble friend Lady Harding was incredibly important. What we are trying to do here is to use enforcement to drive culture change, and to force the organisations not to stop thinking about profit altogether but to shift their focus from profit-making to child safety in the way in which they go about their work. That is really important when we start to consider the whole issue of enforcement.

It struck me at the start of this discussion that we have to be clear what our general approach and mindset is about this part of our economy that we are seeking to regulate. We have to be clear about the crimes we think are being committed or the offences that need to be dealt with. We need to make sure that Ofcom has the powers to tackle those offences and that it can do so in a way that meets Parliament’s and the public’s expectations of us having legislated to make things better.

I am really asking my noble friend the Minister, when he comes to respond on this, to give us a sense of clarity on the whole question of enforcement. At the moment, it is insufficiently clear. Even if we do not get that level of clarity today, when we come back later on and look at enforcement, it is really important that we know what we are trying to tackle here.

Lord Parkinson of Whitley Bay (Con)

My Lords, I will endeavour to give that clarity, but it may be clearer still if I flesh some points out in writing in addition to what I say now.

Lord Knight of Weymouth (Lab)

I am grateful, as ever, to the noble Baroness, and I hope that has assisted the noble Lord, Lord Vaizey.

Finally—just about—I will speak to Amendment 32A, tabled in my name, about VPNs. I was grateful to the noble Baroness for her comments. In many ways, I wanted to give the Minister the opportunity to put something on the record. I understand, and he can confirm whether my understanding is correct, that the duties on the platforms to be safe apply regardless of whether a VPN has been used to access the systems and the content. The platforms, the publishers of content that are user-to-user businesses, will have to detect whether a VPN is being used, one would suppose, in order to ensure that children are being protected and that a user is genuinely a child. Is that a correct interpretation of how the Bill works? If so, is it technically realistic for those platforms to be able to detect whether someone is landing on their site via a VPN or otherwise? In my mind, the anecdote that the noble Baroness, Lady Harding, related, about what Apple's App Store algorithm had done in pushing VPNs when someone was looking for porn, reinforces the need for app stores to come into scope, so that we can get some of that age filtering at the distribution point, rather than just relying on the platforms.

Substantially, this group is about platforms anticipating harms, not reviewing them and then fixing them despite their business model. If we can get the platforms themselves designing for children’s safety and then working out how to make the business models work, rather than the other way around, we will have a much better place for children.

The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, I join in the chorus of good wishes to the bungee-jumping birthday Baroness, Lady Kidron. I know she will not have thought twice about joining us today in Committee for scrutiny of the Bill, which is testament to her dedication to the cause of the Bill and, more broadly, to protecting children online. The noble Lord, Lord Clement-Jones, is right to note that we have already had a few birthdays along the way; I hope that we get only one birthday each before the Bill is finished.

Lord Clement-Jones (LD)

My birthday is in October, so I hope not.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

Very good—only one each, and hopefully fewer. I thank noble Lords for the points they raised in the debate on these amendments. I understand the concerns raised about how the design and operation of services can contribute to risk and harm online.

The noble Lord, Lord Russell, was right, when opening this debate, that companies are very successful indeed at devising and designing products and services that people want to use repeatedly, and I hope to reassure all noble Lords that the illegal and child safety duties in the Bill extend to how regulated services design and operate their services. Providers with services that are likely to be accessed by children will need to provide age-appropriate protections for children using their service. That includes protecting children from harmful content and activity on their service. It also includes reviewing children’s use of higher-risk features, such as live streaming or private messaging. Service providers are also specifically required to consider the design of functionalities, algorithms and other features when delivering the child safety duties imposed by the Bill.

I turn first to Amendments 23 and 76 in the name of the noble Lord, Lord Russell. These would require providers to eliminate the risk of harm to children identified in the service’s most recent children’s risk assessment, in addition to mitigating and managing those risks. The Bill will deliver robust and effective protections for children, but requiring providers to eliminate the risk of harm to children would place an unworkable duty on providers. As the noble Baroness, Lady Fox, my noble friend Lord Moylan and others have noted, it is not possible to eliminate all risk of harm to children online, just as it is not possible entirely to eliminate risk from, say, car travel, bungee jumping or playing sports. Such a duty could lead to service providers taking disproportionate measures to comply; for instance, as noble Lords raised, restricting children’s access to content that is entirely appropriate for them to see.

Lord Knight of Weymouth (Lab)

Does the Minister accept that that is not exactly what we were saying? We were not saying that they would have to eliminate all risk: they would have to design to eliminate risks, but we accept that other risks will apply.

Lord Parkinson of Whitley Bay (Con)

It is part of the philosophical ruminations that we have had, but the point here is that elimination is not possible through design, or through any drafting of legislation. I will come on to talk a bit more about how we seek to minimise, mitigate and manage risk, which is the focus.

Amendments 24, 31, 32, 77, 84, 85 and 295, from the noble Lord, Lord Russell, seek to ensure that providers do not focus just on content when fulfilling their duties to mitigate the impact of harm to children. The Bill already delivers on those objectives. As the noble Baroness, Lady Kidron, noted, it defines “content” very broadly in Clause 207 as

“anything communicated by means of an internet service”.

Under this definition, in essence, all communication and activity is facilitated by content.

Lord Clement-Jones (LD)

I hope that the Minister has in his brief a response to the noble Baroness’s point about Clause 11(14), which, I must admit, comes across extraordinarily in this context. She quoted it, saying:

“The duties set out … are to be taken to extend only to content that is harmful to children where the risk of harm is presented by the nature of the content (rather than the fact of its dissemination)”.


Is not that exception absolutely at the core of what we are talking about today? It is surely therefore very difficult for the Minister to say that this applies in a very broad way, rather than purely to content.

Lord Parkinson of Whitley Bay (Con)

I will come on to talk a bit about dissemination as well. If the noble Lord will allow me, he can intervene later on if I have not done that to his satisfaction.

I was about to talk about the child safety duties in Clause 11(5), which also specifies that they apply to the way that a service is designed, how it operates and how it is used, as well as to the content facilitated by it. The definition of content makes it clear that providers are responsible for mitigating harm in relation to all communications and activity on their service. Removing the reference to content would make service providers responsible for all risk of harm to children arising from the general operation of their service. That could, for instance, bring into scope external advertising campaigns, carried out by the service to promote its website, which could cause harm. This and other elements of a service’s operations are already regulated by other legislation.

Baroness Kidron (CB)

I apologise for interrupting. Is that the case, and could that not be dealt with by defining harm in the way that it is intended, rather than as harm from any source whatever? It feels like a big leap that, if you take out “content”, instead of it meaning the scope of the service in its functionality and content and all the things that we have talked about for the last hour and a half, the suggestion is that it is unworkable because harm suddenly means everything. I am not sure that that is the case. Even if it is, one could find a definition of harm that would make it not the case.

Lord Parkinson of Whitley Bay (Con)

Taking it out in the way that the amendment suggests throws up that risk. I am sure that it is not the intention of the noble Lord or the noble Baroness in putting it, but that is a risk of the drafting, which requires some further thought.

Clause 11(2), which is the focus of Amendments 32, 85 and 295, already means that platforms have to take robust action against content which is harmful because of the manner of its dissemination. However, it would not be feasible for providers to fulfil their duties in relation to content which is harmful only by the manner of its dissemination. This covers content which may not meet the definition of content which is harmful to children in isolation but may be harmful when targeted at children in a particular way. One example could be content discussing a mental health condition such as depression, where recommendations are made repeatedly or in an amplified manner through the use of algorithms. The nature of that content per se may not be inherently harmful to every child who encounters it, but, when aggregated, it may become harmful to a child who is sent it many times over. That, of course, must be addressed, and is covered by the Bill.

--- Later in debate ---
Lord Clement-Jones (LD)

Can the Minister assure us that he will take another look at this between Committee and Report? He has almost made the case for this wording to be taken out—he said that it is already covered by a whole number of different clauses in the Bill—but it is still here. There is still an exception which, if the Minister is correct, is highly misleading: it means that you have to go searching all over the Bill to find a way of attacking the algorithm, essentially, and the way that it amplifies, disseminates and so on. That is what we are trying to get to: how to address the very important issue not just of content but of the way that the algorithm operates in social media. This seems to be highly misleading, in the light of what the Minister said.

Lord Parkinson of Whitley Bay (Con)

I do not think so, but I will certainly look at it again, and I am very happy to speak to the noble Lord as I do. My point is that it would not be workable or proportionate for a provider to prevent or protect all children from encountering every single instance of the sort of content that I have just outlined, which would be the effect of these amendments. I will happily discuss that with the noble Lord and others between now and Report.

Amendment 27, by the noble Lord, Lord Stevenson, seeks to add a duty to prevent children encountering targeted paid-for advertising. As he knows, the Bill has been designed to tackle harm facilitated through user-generated content. Some advertising, including paid-for posts by influencers, will therefore fall under the scope of the Bill. Companies will need to ensure that systems for targeting such advertising content to children, such as the use of algorithms, protect them from harmful material. Fully addressing the challenges of paid-for advertising is a wider task than is possible through the Bill alone. The Bill is designed to reduce harm on services which host user-generated content, whereas online advertising poses a different set of problems, with different actors. The Government are taking forward work in this area through the online advertising programme, which will consider the full range of actors and sector-appropriate solutions to those problems.

Lord Stevenson of Balmacara (Lab)

I understand the Minister’s response, and I accept that there is a parallel stream of work that may well address this. However, we have been waiting for the report from the group that has been looking at that for some time. Rumours—which I never listen to—say that it has been ready for some time. Can the Minister give us a timescale?

Lord Parkinson of Whitley Bay (Con)

I cannot give a firm timescale today but I will seek what further information I can provide in writing. I have not seen it yet, but I know that the work continues.

Amendments 28 and 82, in the name of the noble Lord, Lord Russell, seek to remove the size and capacity of a service provider as a relevant factor when determining what is proportionate for services in meeting their child safety duties. This provision is important to ensure that the requirements in the child safety duties are appropriately tailored to the size of the provider. The Bill regulates a large number of service providers, which range from some of the biggest companies in the world to small voluntary organisations. This provision recognises that what it is proportionate to require of providers at either end of that scale will be different.

Removing this provision would risk setting a lowest common denominator. For instance, a large multinational company could argue that it is required only to take the same steps to comply as a smaller provider.

Amendment 32A from the noble Lord, Lord Knight of Weymouth, would require services to have regard to the potential use of virtual private networks and similar tools to circumvent age-restriction measures. He raised the use of VPNs earlier in this Committee when we considered privacy and encryption. As outlined then, service providers are already required to think about how safety measures could be circumvented and take steps to prevent that. This is set out clearly in the children’s risk assessment and safety duties. Under the duty at Clause 10(6)(f), all services must consider the different ways in which the service is used and the impact of such use on the level of risk. The use of VPNs is one factor that could affect risk levels. Service providers must ensure that they are effectively mitigating and managing risks that they identify, as set out in Clause 11(2). The noble Lord is correct in his interpretation of the Bill vis-à-vis VPNs.

Lord Knight of Weymouth (Lab)

Is this technically possible?

Lord Parkinson of Whitley Bay (Con)

Technical possibility is a matter for the sector—

Lord Knight of Weymouth (Lab)

I am grateful to the noble Lord for engaging in dialogue while I am in a sedentary position, but I had better stand up. It is relevant to this Committee whether it is technically possible for providers to fulfil the duties we are setting out for them in statute in respect of people’s ability to use workarounds and evade the regulatory system. At some point, could he give us the department’s view on whether there are currently systems that could be used —we would not expect them to be prescribed—by platforms to fulfil the duties if people are using their services via a VPN?

Lord Parkinson of Whitley Bay (Con)

This is the trouble with looking at legislation that is technologically neutral and future-proofed and has to envisage risks and solutions changing in years to come. We want to impose duties that can technically be met, of course, but this is primarily a point for companies in the sector. We are happy to engage and provide further information, but it is inherently part of the challenge of identifying evolving risks.

The provision in Clause 11(16) addresses the noble Lord’s concerns about the use of VPNs in circumventing age-assurance or age-verification measures. For it to apply, providers would need to ensure that the measures they put in place are effective and that children cannot normally access their services. They would need to consider things such as how the use of VPNs affects the efficacy of age-assurance and age-verification measures. If children were routinely using VPNs to access their service, they would not be able to conclude that Clause 11(16) applies. I hope that sets out how this is covered in the Bill.

Amendments 65, 65ZA, 65AA, 89, 90, 90B, 96A, 106A, 106B, 107A, 114A, 122, 122ZA, 122ZB and 122ZC from the noble Lord, Lord Russell of Liverpool, seek to make the measures Ofcom sets out in codes of practice mandatory for all services. I should make it clear at the outset that companies must comply with the duties in the Bill. They are not optional and it is not a non-statutory regime; the duties are robust and binding. It is important that the binding legal duties on companies are decided by Parliament and set out in legislation, rather than delegated to a regulator.

Codes of practice provide clarity on how to comply with statutory duties, but should not supersede or replace them. This is true of codes in other areas, including the age-appropriate design code, which is not directly enforceable. Following up on the point from my noble friend Lady Harding of Winscombe, neither the age-appropriate design code nor the SEND code is directly enforceable. The Information Commissioner’s Office or bodies listed in the Children and Families Act must take the respective codes into account when considering whether a service has complied with its obligations as set out in law.

As with these codes, what will be directly enforceable in this Bill are the statutory duties by which all sites in scope of the legislation will need to abide. We have made it clear in the Bill that compliance with the codes will be taken as compliance with the duties. This will help small companies in particular. We must also recognise the diversity and innovative nature of this sector. Requiring compliance with prescriptive steps rather than outcomes may mean that companies do not use the most effective or efficient methods to protect children.

I reassure noble Lords that, if companies decide to take a different route to compliance, they will be required to document what their own measures are and how they amount to compliance. This will ensure that Ofcom has oversight of how companies comply with their duties. If the alternative steps that providers have taken are insufficient, they could face enforcement action. We expect Ofcom to take a particularly robust approach to companies which fail to protect their child users.

My noble friend Lord Vaizey touched on the age-appropriate design code in his remarks—

Baroness Harding of Winscombe (Con)

My noble friend the Minister did not address the concern I set out that the Bill’s approach will overburden Ofcom. If Ofcom has to review the suitability of each set of alternative measures, we will create an even bigger monster than we first thought.

Lord Parkinson of Whitley Bay (Con)

I do not think that it will. We have provided further resource for Ofcom to take on the work that this Bill will give it; it has been very happy to engage with noble Lords to talk through how it intends to go about that work and, I am sure, would be happy to follow up on that point with my noble friend to offer her some reassurance.

Responding to the point from my noble friend Lord Vaizey, the Bill is part of the UK’s overall digital regulatory landscape, which will deliver protections for children alongside the data protection requirements for children set out in the Information Commissioner’s age-appropriate design code. Ofcom has strong existing relationships with other bodies in the regulatory sphere, including through the Digital Regulation Co-operation Forum. The Information Commissioner has been added to this Bill as a statutory consultee for Ofcom’s draft codes of practice and relevant pieces of guidance formally to provide for the ICO’s input into its areas of expertise, especially relating to privacy.

Amendment 138 from the noble Lord, Lord Russell of Liverpool, would amend the criteria for non-designated content which is harmful to children to bring into scope content whose risk of harm derives from its potential financial impact. The Bill already requires platforms to take measures to protect all users, including children, from financial crime online. All companies in scope of the Bill will need to design and operate their services to reduce the risk of users encountering content amounting to a fraud offence, as set out in the list of priority offences in Schedule 7. This amendment would expand the scope of the Bill to include broader commercial harms. These are dealt with by a separate legal framework, including the Consumer Protection from Unfair Trading Regulations. This amendment therefore risks creating regulatory overlap, which would cause confusion for business while not providing additional protections to consumers and internet users.

Amendment 261 in the name of the right reverend Prelate the Bishop of Oxford seeks to modify the existing requirements for the Secretary of State’s review into the effectiveness of the regulatory framework. The purpose of the amendment is to ensure that all aspects of a regulated service are taken into account when considering the risk of harm to users and not just content.

As we have discussed already, the Bill defines “content” very broadly and companies must look at every aspect of how their service facilitates harm associated with the spread of content. Furthermore, the review clause makes explicit reference to the systems and processes which regulated services use, so the review can already cover harm associated with, for example, the design of services.

--- Later in debate ---
Lord Knight of Weymouth (Lab)

My Lords, we too support the spirit of these amendments very much and pay tribute to the noble Lord, Lord Russell, for tabling them.

In many ways, I do not need to say very much. I think the noble Baroness, Lady Kidron, made a really powerful case, alongside the way the group was introduced in respect of the importance of these things. We do want the positivity that the noble Baroness, Lady Harding, talked about in respect of the potential and opportunity of technology for young people. We want them to have the right to freedom of expression, privacy and reliable information, and to be protected from exploitation by the media. Those happen to be direct quotes from the UN Convention on the Rights of the Child, as some of the rights they would enjoy. Amendments 30 and 105, which the noble Lord, Lord Clement-Jones, tabled—I attached my name to Amendment 30—are very much in that spirit of trying to promote well-being and trying to say that there is something positive that we want to see here.

In particular, I would like to see that in respect of Ofcom. Amendment 187 is, in some ways, the more significant amendment and the one I most want the Minister to reflect on. That is the one that applies to Ofcom: that it should have reference to the UN Convention on the Rights of the Child. I think even the noble Lord, Lord Weir, could possibly agree. I understand his thoughtful comments around whether or not it is right to encumber business with adherence to the UN convention, but Ofcom is a public body in how it carries out its duties as a regulator. There are choices for regulation. Regulation can just be about minimum standards, but it can also be about promoting something better. What we are seeking here in trying to have reference to the UN convention is for Ofcom to regulate for something more positive and better, as well as police minimum standards. On that basis, we support the amendments.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

My Lords, I will start in the optimistic spirit of the debate we have just had. There are many benefits to young people from the internet: social, educational and many other ways that noble Lords have mentioned today. That is why the Government’s top priority for this legislation has always been to protect children and to ensure that they can enjoy those benefits by going online safely.

Once again, I find myself sympathetic to these amendments, but in a position of seeking to reassure your Lordships that the Bill already delivers on their objectives. Amendments 25, 78, 187 and 196 seek to add references to the United Nations Convention on the Rights of the Child and general comment 25 on children’s rights in relation to the digital environment to the duties on providers and Ofcom in the Bill.

As I have said many times before, children’s rights are at the heart of this legislation, even if the phrase itself is not mentioned in terms. The Bill already reflects the principles of the UN convention and the general comment. Clause 207, for instance, is clear that a “child” means a person under the age of 18, which is in line with the convention. All providers in scope of the Bill need to take robust steps to protect users, including children, from illegal content or activity on their services and to protect children from content which is harmful to them. They will need to ensure that children have a safe, age-appropriate experience on services designed for them.

Both Ofcom and service providers will also have duties in relation to users’ rights to freedom of expression and privacy. The safety objectives will require Ofcom to ensure that services protect children to a higher standard than adults, while also making sure that these services account for the different needs of children at different ages, among other things. Ofcom must also consult bodies with expertise in equality and human rights, including those representing the interests of children, for instance the Children’s Commissioner. While the Government fully support the UN convention and its continued implementation in the UK, it would not be appropriate to place obligations on regulated services to uphold an international treaty between state parties. We agree with the reservations that were expressed by the noble Lord, Lord Weir of Ballyholme, in his speech, and his noble friend Lady Foster.

The convention’s implementation is a matter for the Government, not for private businesses or voluntary organisations. Similarly, the general comment acts as guidance for state parties and it would not be appropriate to refer to that in relation to private entities. The general comment is not binding and it is for individual states to determine how to implement the convention. I hope that the noble Lord, Lord Russell, will feel reassured that children’s rights are baked into the Bill in more ways than a first glance may suggest, and that he will be content to withdraw his amendment.

The noble Lord, Lord Clement-Jones, in his Amendments 30 and 105, seeks to require platforms and Ofcom to consider a service’s benefits to children’s rights and well-being when considering what is proportionate to fulfil the child safety duties of the Bill. They also add children’s rights and well-being to the online safety objectives for user-to-user services. The Bill as drafted is focused on reducing the risk of harm to children precisely so that they can better enjoy the many benefits of being online. It already requires companies to take a risk-based and proportionate approach to delivering the child safety duties. Providers will need to address only content that poses a risk of harm to children, not that which is beneficial or neutral. The Bill does not require providers to exclude children or restrict access to content or services that may be beneficial for them.

Children’s rights and well-being are already a central feature of the existing safety objectives for user-to-user services in Schedule 4 to the Bill. These require Ofcom to ensure that services protect children to a higher standard than adults, while making sure that these services account for the different needs of children at different ages, among other things. On this basis, while I am sympathetic to the aims of the amendments the noble Lord has brought forward, I respectfully say that I do not think they are needed.

More pertinently, Amendment 30 could have unintended consequences. By introducing a broad balancing exercise between the harms and benefits that children may experience online, it would make it more difficult for Ofcom to follow up instances of non-compliance. For example, service providers could take less effective safety measures to protect children, arguing that, as their service is broadly beneficial to children’s well-being or rights, the extent to which they need to protect children from harm is reduced. This could mean that children are more exposed to more harmful content, which would reduce the benefits of going online. I hope that this reassures the noble Lord, Lord Russell, of the work the Bill does in the areas he has highlighted, and that it explains why I cannot accept his amendments. I invite him to withdraw Amendment 25.

Lord Russell of Liverpool Portrait Lord Russell of Liverpool (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I thank all noble Lords for taking part in this discussion. I thank the noble Lord, Lord Weir, although I would say to him that his third point—that, in his experience, the UNCRC is open to different interpretations by different departments—is my experience of normal government. Name me something that has not been interpreted differently by different departments, as it suits them.

--- Later in debate ---
Moved by
27A: Clause 11, page 11, line 19, at end insert—
“(10A) A duty to summarise in the terms of service the findings of the most recent children’s risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to children).”
Member’s explanatory statement
This amendment requires providers of Category 1 services to summarise (in their terms of service) the findings of their latest children’s risk assessment. The limitation to Category 1 services is achieved by an amendment in the name of the Minister to clause 6.

BBC: Appointment and Resignation of Chair

Lord Parkinson of Whitley Bay Excerpts
Tuesday 2nd May 2023


Lords Chamber
Lord Parkinson of Whitley Bay Portrait The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)
- View Speech - Hansard - -

My Lords, the BBC is a world-class broadcaster and cultural institution which produces some of the very best television and radio in the world. We understand and respect Richard Sharp’s decision to stand down. His Majesty’s Government and the BBC board both want to see stability for the corporation. We want to ensure an orderly transition and will launch a process to identify and appoint a new permanent chairman.

Baroness Merron Portrait Baroness Merron (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, last week’s report found Richard Sharp to be wrong in not declaring his close links with Boris Johnson when applying for the job of BBC chair. The facts have been clear for some time, so while we welcome the report, this matter could and should have been resolved much earlier. Does the Minister accept that this sorry episode has caused damage both to the BBC’s reputation and to confidence in the public appointments process? With Prime Minister Rishi Sunak promising integrity at every level of his Government, why was it left to Mr Sharp to resign rather than him being dismissed weeks ago?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

It is right that an independent process was commissioned and allowed the time to run. Mr Sharp himself has said that he regrets the impact this has had on the corporation he has faithfully served. Mr Heppinstall’s report says:

“Overall, DCMS officials conducted a good and thorough process”.


There are some helpful lessons for all in his investigation, which we will look at and take forward as appropriate.

Lord Birt Portrait Lord Birt (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I declare an interest as a former director-general of the BBC. This episode will not damage the BBC—there, I agree with the Minister. It has been around for 100 years, and it is a wonderful institution. It will quickly ride through this sorry affair. The damage that has been done is to the Government’s own process for making public appointments. The Heppinstall report is a truly shocking read. Will the Government now overhaul the process for making public appointments?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

I agree with the first part of what the noble Lord says. The news today about the BBC’s work launching the emergency radio service in Sudan is another testament to the fantastic work it does not just in this country but around the world. As I have said, Mr Heppinstall’s report concluded that:

“Overall, DCMS officials conducted a good and thorough process”.


There are some lessons in his report. We will carefully consider its findings and respond in due course.

Baroness Kingsmill Portrait Baroness Kingsmill (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, would it be preferable if Ministers and holders of public office were, in fact, suspended when being investigated for various situations, such as bullying or arranging loans and things like that for the Prime Minister? Should they not be suspended rather than being allowed to continue with their employment while an investigation takes place?

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

It is important that there is a thorough and swift investigation in cases such as this, and that is what has happened here; Adam Heppinstall has produced a thorough report. He has looked into this carefully and brought forward his conclusions. Richard Sharp has resigned, and we understand and respect his reasons for doing so.

Lord Vaizey of Didcot Portrait Lord Vaizey of Didcot (Con)
- View Speech - Hansard - - - Excerpts

My Lords, it is very nice to have a Tory voice in this debate. I declare my interest as a presenter on Times Radio. Richard Sharp was an excellent chair of the BBC, and he has been extremely harshly treated—not least by that terrible cartoon in the Guardian over the weekend. However, I echo the noble Lord, Lord Birt: one of the things that is clear from this report, and something we all knew at the time, is who the Government’s favoured candidate for the position was. This does a disservice to the Government because it prevents excellent candidates putting themselves forward and giving them a genuine choice. I know the Minister will simply play a completely straight bat as he answers this Question, but he must know that the Government should have a much more open process for the appointment of the next chair of the BBC.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

I completely agree with what my noble friend says about the brilliant work done by Richard Sharp during his time as chairman of the BBC and with the comments he made about the deplorable cartoon in the Guardian, which I am glad was pulled. The Adam Heppinstall report rightly points to the impact that the publication of candidates’ names in the media can have on the public appointments process, and we echo the concerns he raised there. The process to appoint a new permanent chairman will be run in a robust, fair and open manner, in accordance with the governance code.

Baroness Armstrong of Hill Top Portrait Baroness Armstrong of Hill Top (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, when I was a councillor and somebody knocked on my door to say that they were applying for a school caretaker’s job or a dinner assistant’s job, I would say, “Congratulations; I hope you do well. I will now take no part in the selection because I now have an interest: I know who you are”. The noble Lord opposite is right: the Government must make sure that the appointments process is open and that lobbying will actually be a disadvantage rather than the way you get on, which is the way the Government have been behaving.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

Ministerial responsibility is a core principle of the public appointments system. It is important that the process is run and is seen to be run in accordance with that code, and that people declare the things they are required to declare, so that people know. However, there are other independent panel members who are appointed to appointment panels to make sure that there is independence in the system. These are decisions on which Ministers are entitled to take a view, in line with the Government’s code.

Lord McNally Portrait Lord McNally (LD)
- View Speech - Hansard - - - Excerpts

My Lords, nothing the Minister has said so far can give us any confidence that the process is not going to still be influenced by No. 10 Downing Street. Therefore, is it not absolutely imperative that a system of selection be produced that makes it clear that whoever the incumbent is in No. 10, they will not have undue or improper influence on this appointment? I say this as someone who was once head of the political office in No. 10, so I know that, under successive Governments, there is a desire to interfere. The Government have an opportunity now to create a really transparent, open system, but they have to have the will to do it as well.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

The process for appointing the chair of the BBC is set out in the BBC’s royal charter. It requires an appointment to be made by Order in Council following a fair and open competition. By convention, the Secretary of State for Culture, Media and Sport recommends the appointment to the Lord President of the Council, and the Prime Minister recommends the appointment to His Majesty the King. It is important that the process be followed and that all public appointments be set out and conducted in accordance with the Government’s code.

Lord Berkeley of Knighton Portrait Lord Berkeley of Knighton (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I declare my interest as a freelance broadcaster for the BBC. Does the Minister agree that there is a parallel here with your Lordships’ House? For example, we read endless headlines about prime ministerial appointments to the House but very little about the hours and hours of scrutiny that go into legislation. So it is with the BBC, but this has very little to do with the workforce, who produce programmes day in, day out. It has more to do, as we heard from the noble Lord, Lord Birt, with the selection and appointment process.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

I agree with the noble Lord. Indeed, Mr Sharp pointed in his own resignation statement and letters to his regret at the distraction this has caused to the corporation. We are very lucky indeed to have the BBC in this country, producing the world-class television and radio content I mentioned in my first response.

Lord Clarke of Nottingham Portrait Lord Clarke of Nottingham (Con)
- View Speech - Hansard - - - Excerpts

My Lords, we know that he disclosed the possible conflict to the Cabinet Secretary. Why did the Cabinet Secretary not disclose or tell him to disclose that conflict to those responsible for making the recommendation? When the Government are reviewing the process for these public appointments, will they ensure that the rules on potential conflicts of interest say that all those people involved in making recommendations or making the eventual choice need to have the declaration of interest made known to them? It seems to be an obvious point, which was overlooked by some otherwise perfectly sensible people on this occasion.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

Adam Heppinstall’s report makes it clear that the governance code puts the obligation to make a disclosure on the candidate and not on others. He has looked into this matter and concluded that Mr Sharp accepted that he should have disclosed the matter to the panel and apologised for his error. Given that error, he tendered his resignation, and the Government understand and respect his reason for doing so.

Online Safety Bill

Lord Parkinson of Whitley Bay Excerpts
Baroness Merron Portrait Baroness Merron (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, I welcome this debate, which revisits some of the areas discussed in earlier debates about the scope of the Bill, as many noble Lords said. It allows your Lordships’ House to consider what has to be the primary driver for assessment. In my view and as others said, it ought to be about risk, which has to be the absolute driver in all this. As the noble Baroness, Lady Harding, said, businesses do not remain static: they start at a certain size and then change. Of course, we hope that many of the businesses we are talking about will grow, so this is about preparation for growth and the reality of doing business.

As we discussed, there certainly are cases where search providers may, by their very nature, be almost immune from presenting users with content that could be considered either harmful or illegal under this legislative framework. The new clause proposed by the noble Lord, Lord Moylan—I am grateful to him for allowing us to explore these matters—and its various consequential amendments, would limit the duty to prevent access to illegal content to core category 2A search providers, rather than all search providers, as is currently the case under Clause 23(3).

The argument that I believe the noble Lord, Lord Moylan, put forward is that the illegal content duty is unduly wide, placing a disproportionate and otherwise unacceptable burden on smaller and/or supposedly safer search providers. He clearly said he was not saying that small was safe—that is now completely understood—but he also said that absolute safety is not achievable. As the noble Baroness, Lady Kidron, said, that is indeed so. If this legislation is too complex and creates the wrong provisions, we will clearly be a long way away from our ambition, which here has to be to have in place the best legislative framework, one that everyone can work with and that provides the maximum opportunity for safety and what we all seek to achieve.

Of course, the flip side of the argument about an unacceptable burden on smaller, or on supposedly safer, search providers may be that they would in fact have very little work to do to comply with the illegal content duty, at least in the short term. But the duty would act as an important safeguard, should the provider’s usual systems prove ineffective with the passage of time. Again, that point was emphasised in this and the previous debate by the noble Baroness, Lady Harding.

We look forward to the Minister’s response to find out which view he and his department subscribe to or, indeed, whether they have another view they can bring to your Lordships’ House. But, on the face of it, the current arrangements do not appear unacceptably onerous.

Amendment 157 in the name of the noble Lord, Lord Pickles, and introduced by the noble Baroness, Lady Deech, deals with search by a different approach by inserting requirements about search services’ publicly available statements into Clause 65. In the debate, the noble Baroness and the noble Lord, Lord Weir, raised very important, realistic examples of where search engines can take us, including to material that encourages racism directed at Jews and other groups and encourages hatred of various groups, including Jews. The amendment talks about issues such as the changing of algorithms or the hiding of content and the need to ensure that the terms of providers’ publicly available statements are applied consistently.

I look forward to hearing from the Minister in response to Amendment 157, as the technology certainly moves us beyond questions of scope and towards discussion of the conduct of platforms when harm is identified.

Lord Parkinson of Whitley Bay Portrait The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)
- View Speech - Hansard - -

My Lords, I must first apologise for my slightly dishevelled appearance as I managed to spill coffee down my shirt on my way to the Chamber. I apologise for that—as the fumes from the dried coffee suffuse the air around me. It will certainly keep me caffeinated for the day ahead.

Search services play a critical role in users’ online experience, allowing them easily to find and access a broad range of information online. Their gateway function, as we have discussed previously, means that they also play an important role in keeping users safe online because they have significant influence over the content people encounter. The Bill therefore imposes stringent requirements on search services to tackle the risks from illegal content and to protect children.

Amendments 13, 15, 66 to 69 and 73 tabled by my noble friend Lord Moylan seek to narrow the scope of the Bill so that its safety search duties apply only to the largest search services—categorised in the Bill as category 2A services—rather than to all search services. Narrowing the scope in this way would have an adverse impact on the safety of people using search services, including children. Search services, including combined services, below the category 2A threshold would no longer have a duty to minimise the risk of users encountering illegal content or children encountering harmful content in or via search results. This would increase the likelihood of users, including children, accessing illegal content and children accessing harmful content through these services.

The Bill already takes a targeted approach and the duties on search services will be proportionate to the risk of harm and the capacity of companies. This means that services which are smaller and lower-risk will have a lighter regulatory burden than those which are larger and higher-risk. All search services will be required to conduct regular illegal content risk assessments and, where relevant, children’s risk assessments, and then implement proportionate mitigations to protect users, including children. Ofcom will set out in its codes of practice specific steps search services can take to ensure compliance and must ensure that these are proportionate to the size and capacity of the service.

The noble Baroness, Lady Kidron, and my noble friend Lady Harding of Winscombe asked how search services should conduct their risk assessments. Regulated search services will have a duty to conduct regular illegal content risk assessments, and where a service is likely to be accessed by children it will have a duty to conduct regular children’s risk assessments, as I say. They will be required to assess the level and nature of the risk of individuals encountering illegal content on their service, to implement proportionate mitigations to protect people from illegal content, and to monitor them for effectiveness. Services likely to be accessed by children will also be required to assess the nature and level of risk of their service specifically for children to identify and implement proportionate mitigations to keep children safe, and to monitor them for effectiveness as well.

Companies will also need to assess how the design and operation of the service may increase or reduce the risks identified and Ofcom will have a duty to issue guidance to assist providers in carrying out their risk assessments. That will ensure that providers have, for instance, sufficient clarity about what an appropriate risk assessment looks like for their type of service.

The noble Lord, Lord Allan, and others asked about definitions and I congratulate noble Lords on avoiding the obvious

“To be, or not to be”


pun in the debate we have just had. The noble Lord, Lord Allan, is right in the definition he set out. On the rationale for it, it is simply that we have designated as category 1 the largest and riskiest services and as category 2 the smaller and less risky ones, splitting them between 2A, search services, and 2B, user-to-user services. We think that is a clear framework. The definitions are set out a bit more in the Explanatory Notes but that is the rationale.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

I am grateful to the Minister for that clarification. I take it then that the Government’s working assumption is that all search services, including the biggest ones, are by definition less risky than the larger user-to-user services. It is just a clarification that that is their thinking that has informed this.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

As I said, the largest and riskiest sites may involve some which have search functions, so the test of large and most risky applies. Smaller and less risky search services are captured in category 2A.

Amendment 157 in the name of my noble friend Lord Pickles, and spoken to by the noble Baroness, Lady Deech, seeks to apply new duties on the largest search services. I agree with the objectives in my noble friend’s amendment of increasing transparency about the search services’ operations and enabling users to hold them to account. It is not, however, an amendment I can accept because it would duplicate existing duties while imposing new duties which we do not think are appropriate for search services.

As I say, the Bill will already require search services to set out how they are fulfilling their illegal content and child safety duties in publicly available statements. The largest search services—category 2A—will also be obliged to publish a summary of their risk assessments and to share this with Ofcom. That will ensure that users know what to expect on those search services. In addition, they will be subject to the Bill’s requirements relating to user reporting and redress. These will ensure that search services put in place effective and accessible mechanisms for users to report illegal content and content which is harmful to children.

My noble friend’s amendment would ensure that the requirements to comply with its publicly available statements applied to all actions taken by a search service to prevent harm, not just those relating to illegal content and child safety. This would be a significant expansion of the duties, resulting in Ofcom overseeing how search services treat legal content which is accessed by adults. That runs counter to the Government’s stated desire to avoid labelling legal content which is accessed by adults as harmful. It is for adult users themselves to determine what legal content they consider harmful. It is not for us to put in place measures which could limit their access to legal content, however distasteful. That is not to say, of course, that where material becomes illegal in its nature we do not share the determination of the noble Baroness, my noble friend and others to make sure that it is properly tackled. The Secretary of State and Ministers have had extensive meetings with groups making representations on this point and I am very happy to continue speaking to my noble friend, the noble Baroness and others if they would welcome it.

I hope that that provides enough reassurance for the amendment to be withdrawn at this stage.

--- Later in debate ---
Moved by
13A: Clause 6, page 5, line 35, after “service” insert “is not a Category 2A service and”
Member’s explanatory statement
This technical amendment ensures that the duties imposed on providers of combined services in relation to the search engine are correct following the changes to clause 20 arising from the new duties in clauses 23, 25 and 29 which are imposed on providers of Category 2A services only.
--- Later in debate ---
Finally, I know this is unpopular as far as the Government are concerned, but is there not a concern that we are running a coach and horses through some of our well thought-through and important issues relating to human rights? The EHRC’s paper says that the provisions in Clause 110 may be disproportionate and an infringement of millions of individuals’ rights to privacy where those individuals are not suspected of any wrongdoing. This is not a right or wrong issue; it is a proportionality issue. We need to balance that. I do not know if we have heard the Minister set out exactly why the measures in the Bill meet that set of conditions, so I would be grateful if he could talk about that or, if not, write to us. If we are in danger of heading into issues which are raised by Article 8 of the ECHR—I know the noble Lord opposite may not be a huge supporter of it, but it is an important part of our current law, and senior Ministers have said how important it will be in the future—surely we must have safeguards which will protect it.
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

My Lords, this has indeed been a very good debate on a large group of amendments. We have benefited from two former Ministers, the noble Lord, Lord McNally, and my noble friend Lord Kamall. I hope it is some solace to my noble friend that, such a hard act is he to follow, his role has been taken on by two of us on the Front Bench—myself at DCMS and my noble friend Lord Camrose at the new Department for Science, Innovation and Technology.

The amendments in this group are concerned with the protection of user privacy under the Bill and the maintenance of end-to-end encryption. As noble Lords have noted, there has been some recent coverage of this policy in the media. That reporting has not always been accurate, and I take this opportunity to set the record straight in a number of areas and seek to provide the clarity which the noble Lord, Lord Stevenson of Balmacara, asked for just now.

Encryption plays a crucial role in the digital realm, and the UK supports its responsible use. The Bill does not ban any service design, nor will it require services materially to weaken any design. The Bill contains strong safeguards for privacy. Broadly, its safety duties require platforms to use proportionate systems and processes to mitigate the risks to users resulting from illegal content and content that is harmful to children. In doing so, platforms must consider and implement safeguards for privacy, including ensuring that they are complying with their legal responsibilities under data protection law.

With regard to private messaging, Ofcom will set out how companies can comply with their duties in a way that recognises the importance of protecting users’ privacy. Importantly, the Bill is clear that Ofcom cannot require companies to use proactive technology, such as automated scanning, on private communications in order to comply with their safety duties.

In addition to these cross-cutting protections, there are further safeguards concerning Ofcom’s ability to require the use of proactive technology, such as content identification technology on public channels. That is in Clause 124(6) of the Bill. Ofcom must consider a number of matters, including the impact on privacy and whether less intrusive measures would have the equivalent effect, before it can require a proactive technology.

The implementation of end-to-end encryption in a way that intentionally blinds companies to criminal activity on their services, however, has a disastrous effect on child safety. The National Center for Missing & Exploited Children in the United States of America estimates that more than half its reports could be lost if end-to-end encryption were implemented without preserving the ability to tackle child sexual abuse—a conundrum with which noble Lords grappled today. That is why our new regulatory framework must encourage technology companies to ensure that their safety measures keep pace with this evolving and pernicious threat, including minimising the risk that criminals are able to use end-to-end encrypted services to facilitate child sexual abuse and exploitation.

Given the serious risk of harm to children, the regulator must have appropriate powers to compel companies to take the most effective action to tackle such illegal and reprehensible content and activity on their services, including in private communications, subject to stringent legal safeguards. Under Clause 110, Ofcom will have a stand-alone power to require a provider to use, or make best endeavours to develop, accredited technology to tackle child sexual exploitation and abuse, whether communicated publicly or privately, by issuing a notice. Ofcom will use this power as a last resort only when all other measures have proven insufficient adequately to address the risk. The only other type of harm for which Ofcom can use this power is terrorist content, and only on public communications.

The use of the power in Clause 110 is subject to additional robust safeguards to ensure appropriate protection of users’ rights online. Ofcom will be able to require the use of technology accredited as being highly accurate only in specifically detecting illegal child sexual exploitation and abuse content, ensuring a minimal risk that legal content is wrongly identified. In addition, under Clause 112, Ofcom must consider a number of matters, including privacy and whether less intrusive means would have the same effect, before deciding whether it is necessary and proportionate to issue a notice.

The Bill also includes vital procedural safeguards in relation to Ofcom’s use of the power. If Ofcom concludes that issuing a notice is necessary and proportionate, it will need to publish a warning notice to provide the company an opportunity to make representations as to why the notice should not be issued or why the detail contained in it should be amended. In addition, the final notice must set out details of the rights of appeal under Clause 149. Users will also be able to complain to and seek action from a provider if the use of a specific technology results in their content incorrectly being removed and if they consider that technology is being used in a way that is not envisaged in the terms of service. Some of the examples given by the noble Baroness, Lady Fox of Buckley, pertain in this instance.

The Bill also recognises that in some cases there will be no available technology compatible with the particular service design. As I set out, this power cannot be used by Ofcom to require a company to take any action that is not proportionate, including removing or materially weakening encryption. That is why the Bill now includes an additional provision for this scenario, to allow Ofcom to require technology companies to use their best endeavours to develop or find new solutions that work on their services while meeting the same high standards of accuracy and privacy protection. Given the ingenuity and resourcefulness of the sector, it is reasonable to ask it to do everything possible to protect children from abuse and exploitation. I echo the comments made by the noble Lord, Lord Allan, about the work being done across the sector to do that.

More broadly, the regulator must uphold the right to privacy under its Human Rights Act obligations when implementing the new regime. It must ensure that its actions interfere with privacy only where it is lawful, necessary and proportionate to do so. I hope that addresses the question posed by the noble Lord, Lord Stevenson. In addition, Ofcom will be required to consult the Information Commissioner’s Office when developing codes of practice and relevant pieces of guidance.

I turn now to Amendments 14—

Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - - - Excerpts

Before the Minister does so, can he give a sense of what he means by “best endeavours” for those technology companies? If it is not going to be general monitoring of what is happening as the message moves from point to point—we have had some discussions about the impracticality and issues attached to monitoring at one end or the other—what, theoretically, could “best endeavours” possibly look like?

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I am hesitant to give too tight a definition, because we want to remain technology neutral and make sure that we are keeping an open mind to developing changes. I will think about that and write to the noble Lord. The best endeavours will inevitably change over time as new technological solutions present themselves. I point to the resourcefulness of the sector in identifying those, but I will see whether there is anything more I can add.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

While the Minister is reflecting, I note that the words “best endeavours” are always a bit of a worry. The noble Lord, Lord Allan, made the good point that once it is on your phone, you are in trouble and you must report it. But the frustration of many people outside this Chamber, if it has been on a phone and you cannot deal with it, is what comes next: how to trace the journey of that piece of material without breaking encryption. I speak to the tech companies very often—indeed, I used to speak to the noble Lord, Lord Allan, when he was in post at the then Facebook—but that is the question that we would like answered in this Committee, because the response that “It is nothing to do with us” is where our sympathy stops.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

The noble Baroness’s intervention has given me an opportunity to note that I am about to say a little more on best endeavours, which will not fully answer the question from the noble Lord, Lord Knight, but I hope fleshes it out a little more.

I do that in turning to Amendments 14, 108 and 205, which seek to clarify that companies will not be required to undertake fundamental changes to the nature of their service, such as the removal or weakening of end-to-end encryption. As I previously set out, the Bill does not require companies to weaken or remove any design and there is no requirement for them to do so as part of their risk assessments or in response to a notice. Instead, companies will need to undertake risk assessments, including consideration of risks arising from the design of their services, before taking proportionate steps to mitigate and manage these risks. Where relevant, assessing the risks arising from end-to-end encryption will be an integral part of this process.

This risk management approach is well established in almost every other industry and it is right that we expect technology companies to take user safety into account when designing their products and services. We understand that technologies used to identify child sexual abuse and exploitation content, including on private communications, are in some cases nascent and complex. They continue to evolve, as I have said. That is why Ofcom has the power through the Bill to issue a notice requiring a company to make best endeavours to develop or source technology.

This notice will include clear, proportionate and enforceable steps that the company must take, based on the relevant information of the specific case. Before issuing a warning notice, Ofcom is expected to enter into informal consultation with the company and/or to exercise information-gathering powers to determine whether a notice is necessary and proportionate. This consultation period will assist in establishing what a notice to develop a technology may require and appropriate steps for the company to take to achieve best endeavours. That dialogue with Ofcom is part of the process.

Lord Kamall Portrait Lord Kamall (Con)
- Hansard - - - Excerpts

There are a lot of phrases here—best endeavour, proportionate, appropriate steps—that are rather subjective. The concern of a number of noble Lords is that we want to address this issue but it is a matter of how it is applied. That is one of the reasons why noble Lords were asking for some input from the legal profession, a judge or otherwise, to make those judgments.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

All the phrases used in the Bill are subject to the usual scrutiny through the judicial process—that is why we debate them now and think about their implications—but of course they can, and I am sure will, be tested in the usual legal ways. Once a company has developed a new technology that meets minimum standards of accuracy, Ofcom may require its use but not before considering matters including the impact on user privacy, as I have set out. The Bill does not specify which tools are likely to be required, as we cannot pre-empt Ofcom’s evidence-based and case-by-case assessment.

Amendment 285 intends to clarify that social media platforms will not be required to undertake general monitoring of the activity of their users. I agree that the protection of privacy is of utmost importance. I want to reassure noble Lords, in particular my noble friend Lady Stowell of Beeston, who asked about it, that the Bill does not require general monitoring of all content. The clear and strong safeguards for privacy will ensure that users’ rights are protected.

Setting out clear and specific safeguards will be more effective in protecting users’ privacy than adopting the approach set out in Amendment 285. Ofcom must consider a number of matters, including privacy, before it can require the use of proactive technology. The government amendments in this group, Amendments 290A to 290G, further clarify that technology which identifies words, phrases or images that indicate harm is subject to all of these restrictions. General monitoring is not a clearly defined concept—a point made just now by my noble friend Lord Kamall. It is used in EU law but is not defined clearly in that, and it is not a concept in UK law. This lack of clarity could create uncertainty that some technology companies might attempt to exploit in order to avoid taking necessary and proportionate steps to protect their users. That is why we resist Amendment 285.

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Hansard - - - Excerpts

I understand the point the Minister is making, but it is absolutely crystal clear that, whatever phrase is used, the sensibility is quite clear that the Government are saying on record, at the Dispatch Box, that the Bill can in no way be read as requiring anybody to provide a view into private messaging or encrypted messaging unless there is good legal cause to suspect criminality. That is a point that the noble Baroness, Lady Stowell, made very clearly. One may not like the phrasing used in other legislatures, but could we find a form of words that will make it clear that those who are operating in this legal territory are absolutely certain about where they stand on that?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

My Lords, I want to give clear reassurance that the Bill does not require general monitoring of all content. We have clear and strong safeguards for privacy in the Bill to ensure that users’ rights are protected. I set out the concerns about use of the phrase “general monitoring”. I hope that provides clarity, but I may have missed the noble Lord’s point. The brief answer to the question I think he was asking is yes.

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Hansard - - - Excerpts

Let the record stand clear: yes. It was the slight equivocation around how the Minister approached and left that point that I was worried about, and that people might seek to use that later. Words from the Dispatch Box are never absolute and they are never meant to be, but the fact that they have been said is important. I am sure that everybody understands that point, and the Minister did say “yes” to my question.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I did, and I am happy to say it again: yes.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Hansard - - - Excerpts

Perhaps I might go back to an earlier point. When the Minister said the Government want to make sure, I think he was implying that certain companies would try to avoid obligations to keep their users safe by threatening to leave or whatever. I want it to be clear that the obligations to the users of the service are, in the instance of encrypted services, to protect their privacy, and they see that as keeping them safe. It would be wrong to make that a polar opposite. I think that companies that run unencrypted services believe that to be what their duties are—so that in a way is a clash.

Secondly, I am delighted by the clarity in the Minister’s “yes” answer, but I think that maybe there needs to be clearer communication with people outside this Chamber. People are worried about whether duties placed on Ofcom to enact certain things would lead to some breach of encryption. No one thinks that the Government intend to do this or want to spy on anyone, but that the unintended consequences of the duty on Ofcom might have that effect. If that is not going to be the case, and that can be guaranteed by the Government, and they made that clear, it would reassure not just the companies but the users of messaging services, which would be helpful.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

The points the noble Baroness has just made bring me neatly to what I was about to say in relation to the question raised earlier by the noble Lord, Lord Knight of Weymouth. But first, I would say that Ofcom as a public body is subject to public law principles already, so those apply in this case.

The noble Lord, Lord Knight, asked about virtual private networks and the risk of displacing people on to VPNs or other similar alternatives. That is a point worth noting, not just in this group but as we consider all these amendments, particularly when we talk later on about age verification, pornography and so on. Services will need to think about how safety measures could be circumvented and take steps to prevent that, because they need to mitigate risk effectively. There may also be a role in enforcement action, too; Ofcom will be able to apply to the courts to require these services where appropriate to apply business disruption measures. We should certainly be mindful of the incentives for people to do that, and the example the noble Lord, Lord Knight, gave earlier is a useful lesson in the old adage “Caveat emptor” when looking at some of these providers.

I want to say a little bit about Amendments 205A and 290H in my name. Given the scale of child sexual abuse and exploitation that takes place online, and the reprehensible nature of these crimes, it is important that Ofcom has effective powers to require companies to tackle it. This brings me to these government amendments, which make small changes to the powers in Clause 110 to ensure that they are effective. I will focus particularly, in the first instance, on Amendment 290H, which ensures that Ofcom considers whether a service has features that allow content to be shared widely via another service when deciding whether content has been communicated publicly or privately, including for the purposes of issuing a notice. This addresses an issue highlighted by the Independent Reviewer of Terrorism Legislation, Jonathan Hall, and Professor Stuart Macdonald in a recent paper. The separate, technical amendment, Amendment 205A, clarifies that Clause 110(7) refers only to a notice on a user-to-user service.

Amendment 190 in the name of the noble Lord, Lord Clement-Jones, seeks to introduce a new privacy duty on Ofcom when considering whether to use any of its powers. The extensive privacy safeguards that I have already set out, along with Ofcom’s human rights obligations, would make this amendment unnecessary. Ofcom must also explicitly consult persons whom it considers to have expertise in the enforcement of the criminal law and the protection of national security, so far as relevant to online safety matters, in the course of preparing its draft codes. This may include the integrity and security of internet services where relevant.

Amendments 202 and 206, in the name of the noble Lord, Lord Stevenson of Balmacara, and Amendments 207, 208, 244, 246, 247, 248, 249 and 250 in the name of the noble Lord, Lord Clement-Jones, all seek to deliver privacy safeguards to notices issued under Clause 110 through additional review and appeals processes. There are already strong safeguards concerning this power. As part of the warning notice process, companies will be able to make representations to Ofcom which it is bound to consider before issuing a notice. Ofcom must also review any notice before the end of the period for which it has effect.

Amendment 202 proposes mirroring the safeguards of the Investigatory Powers Act when issuing notices to encrypted messaging services under this power. First, this would be inappropriate, because the powers in the Investigatory Powers Act serve different purposes from those in this Bill. The different legal safeguards in the Investigatory Powers Act reflect the potential intrusion by the state into an individual’s private communications; that is not the case with this Bill, which does not grant investigatory powers to state bodies, such as the ability to intercept private communications. Secondly, making a reference to encryption would be—

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Hansard - - - Excerpts

Is that right? I do not need a yes or no answer. It was rhetorical; I am just trying to frame the right question. The Minister is making a very strong point about the difference between RIPA requirements and those that might be brought in under this Bill. But it does not really get to the bottom of the questions we were asking. In this situation, whatever the exact analogy between the two systems is, it is clear that Ofcom is marking its own homework—which is fair enough, as there are representations, but it is not getting external advice or seeking judicial approval.

The Minister’s point was that that was okay because it was private companies that were involved. But we are saying here that these would be criminal offences taking place and therefore there is bound to be interest from the police and other agencies, including anti-terrorism agencies. It is clearly similar to the RIPA arrangements, so could he just revisit that?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

Yes, I think it is right. The Investigatory Powers Act is a tool for law enforcement and intelligence agencies, whereas the Bill is designed to regulate technology companies—an important high-level distinction. As such, the Bill does not grant investigatory powers to state bodies. It does not allow the Government or the regulator to access private messages. Instead, it requires companies to implement proportionate systems and processes to tackle illegal content on their platforms. I will come on to say a little about legal redress and the role of the courts in looking at Ofcom’s decisions so, if I may, I will respond to that in a moment.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - - - Excerpts

The investigatory powers Act includes a different form of technical notice, which is to put in place surveillance equipment. The noble Lord, Lord Stevenson, has a good point: we need to ensure that we do not have two regimes, both requiring companies to put in place technical equipment but with quite different standards applying.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I will certainly take that point away and I understand, of course, that different Acts require different duties of the same platforms. I will take that away and discuss it with colleagues in other departments who lead on investigatory powers.

Baroness Stowell of Beeston Portrait Baroness Stowell of Beeston (Con)
- Hansard - - - Excerpts

Before my noble friend moves on, when he is reviewing that back in the office, could he also satisfy himself that the concerns coming from the journalism and news organisations in the context of RIPA are also understood and have been addressed? That is another angle which, from what my noble friend has said so far, I am not sure has really been acknowledged. That is not a criticism but it is worth him satisfying himself on it.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I am about to talk about the safeguards for journalists in the context of the Bill and the questions posed by the noble Baroness, Lady Bennett. However, I take my noble friend’s point about the implications of other Acts that are already on the statute book in that context as well.

Just to finish the train of thought of what I was saying on Amendment 202, making a reference to encryption, as it suggests, would be out of step with the wider approach of the Bill, which is to remain technology-neutral.

I come to the safeguards for journalistic protections, as touched on by the noble Baroness, Lady Bennett. The Government are fully committed to protecting the integrity of journalistic sources, and there is no intention or expectation that the tools required to be used under this power would result in a compromising of those sources. Any tools required on private communications must be accredited by Ofcom as highly accurate only in detecting child sexual abuse and exploitation content. These minimum standards of accuracy will be approved and published by the Secretary of State, following advice from Ofcom. We therefore expect it to be very unlikely that journalistic content will be falsely detected by the tools being required.

Under Clause 59, companies are obliged to report child sexual abuse material which is detected on their service to the National Crime Agency; this echoes a point made by the noble Lord, Lord Allan, in an earlier contribution. That would include child sexual abuse and exploitation material identified through tools required by a notice and, even in this event, the appropriate protections in relation to journalistic sources would be applied by the National Crime Agency if it were necessary to identify individuals involved in sharing illegal material.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

I want to flag that in the context of terrorist content, this is quite high risk for journalists. It is quite common for them, for example, to be circulating a horrific ISIS video not because they support ISIS but because it is part of a news article they are putting together. We should flag that terrorist content in particular is commonly distributed by journalists and it could be picked up by any system that is not sufficiently sophisticated.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

I see that my noble friend Lord Murray of Blidworth has joined the Front Bench in anticipation of the lunch-break business for the Home Office. That gives me the opportunity to say that I will discuss some of these points with him, my noble friend Lord Sharpe of Epsom and others at the Home Office.

Amendment 246 aims to ensure that there is no requirement for a provider to comply with a notice until the High Court has determined the appeal. The Government have ensured that, in addition to judicial review through the High Court, there is an accessible and relatively affordable alternative means of appealing Ofcom’s decisions via the Upper Tribunal. We cannot accept amendments such as this, which could unacceptably delay Ofcom’s ability to issue a notice, because that would leave children vulnerable.

To ensure that Ofcom’s use of its powers under Clause 110, and the technology that underpins it, are transparent, Ofcom will produce an annual report about the exercise of its functions using these powers. This must be submitted to the Secretary of State and laid before Parliament. The report must also provide the details of technology that has been assessed as meeting minimum standards of accuracy, and Ofcom may also consider other factors, including the impact of technologies on privacy. That will be separate to Ofcom’s annual report to allow for full scrutiny of this power.

The legislation also places a statutory requirement on Ofcom to publish guidance before its functions with regard to Clause 110 come into force. This will be after Royal Assent, given that the legislation is subject to change until that point. Before producing the guidance, Ofcom must consult the Information Commissioner. As I said, there are already strong safeguards regarding Ofcom’s use of these powers, so we think that this additional oversight is unnecessary.

Amendments 203 and 204, tabled by the noble Lord, Lord Clement-Jones, seek to probe the privacy implications of Ofcom’s powers to require technology under Clause 110. I reiterate that the Bill will not ban or weaken any design, including end-to-end encryption. But, given the scale of child sexual abuse and exploitation taking place on private communications, it is important that Ofcom has effective powers to require companies to tackle this abhorrent activity. Data from the Office for National Statistics show that in nearly three-quarters of cases where children are contacted online by someone they do not know, this takes place by private message. This highlights the scale of the threat and the importance of technology providers taking steps to safeguard children in private spaces online.

As already set out, there are already strong safeguards regarding the use of this power, and these will prevent Ofcom from requiring the use of any technology that would undermine a platform’s security and put users’ privacy at risk. These safeguards will also ensure that platforms will not be required to conduct mass scanning of private communications by default.

Until the regime comes into force, it is of course not possible to say with certainty which tools would be accredited. However, some illustrative examples of the kinds of current tools we might expect to be used—providing that they are highly accurate and compatible with a service’s design—are machine learning or artificial intelligence, which assess content to determine whether it is illegal, and hashing technology, which works by assigning a unique number to an image that has been identified as illegal.
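The hash-matching approach described here can be sketched in a few lines. This is a simplified illustration only: the function names and the hash database are hypothetical, and it uses a cryptographic hash, whereas deployed systems such as PhotoDNA use perceptual hashes so that resized or re-encoded copies of a known image still match.

```python
import hashlib

def image_hash(data: bytes) -> str:
    """Assign a unique fingerprint to an image's bytes.

    SHA-256 stands in here for a perceptual hash; unlike a perceptual
    hash, it matches only byte-identical files.
    """
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of hashes of images already identified as illegal
known_hashes = {image_hash(b"example-known-image-bytes")}

def flag_if_known(data: bytes) -> bool:
    """Return True if uploaded content matches a hash in the database."""
    return image_hash(data) in known_hashes

assert flag_if_known(b"example-known-image-bytes")
assert not flag_if_known(b"some-unrelated-image")
```

The appeal of this design in the privacy debate is that the service compares fingerprints against a fixed list of known material rather than interpreting the content of every message, which is why accuracy standards for the matching technology matter so much.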

Given the particularly abhorrent nature of the crimes we are discussing, it is important that services giving rise to a risk of child sexual abuse and exploitation in the UK are covered, wherever they are based. The Bill, including Ofcom’s ability to issue notices in relation to this or to terrorism, will therefore have extraterritorial effect. The Bill will apply to any relevant service that is linked to the UK. A service is linked to the UK if it has a significant number of UK users, if UK users form a target market or if the service is capable of being used in the UK and there is a material risk of significant harm to individuals in the UK arising from the service. I hope that that reassures the noble Lord, on behalf of his noble friend, about why that amendment is not needed.

Amendments 209 to 214 seek to place additional requirements on Ofcom to consider the effect on user privacy when using its powers under Clause 110. I agree that tackling online harm needs to take place while protecting privacy and security online, which is why Ofcom already has to consider user privacy before issuing notices under Clause 110, among the other stringent safeguards I have set out. Amendment 202A would impose a duty on Ofcom to issue a notice under Clause 110, where it is satisfied that it is necessary and proportionate to do so—this will have involved ensuring that the safeguards have been met.

Ofcom will have access to a wide range of information and must have the discretion to decide the most appropriate course of action in any particular scenario, including where this action lies outside the powers and procedures conferred by Clause 110; for instance, an initial period of voluntary engagement. This is an in extremis power. It is essential that we balance users’ rights with the need to enable a strong response, so Ofcom must be able to assess whether any alternative, less intrusive measures would effectively reduce the level of child sexual exploitation and abuse or terrorist content occurring on a service before issuing a notice.

I hope that that provides reassurance to noble Lords on the amendments in this group, and I invite the noble Lord to withdraw Amendment 14.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

My Lords, this has been a very useful debate and serves as a good appetite builder for lunch, which I understand we will be able to take shortly.

I am grateful to the Minister for his response and to all noble Lords who have taken part in the debate. As always, the noble Baroness, Lady Kidron, gave us a balanced view of digital rights—the right to privacy and to security—and the fact that we should be trying to advance these two things simultaneously. She was right again to remind us that this is a real problem and there is a lot we can do. I know she has worked on this through things such as metadata—understanding who is communicating with whom—which might strike that nice balance where we are not infringing on people’s privacy too grossly but are still able to identify those who wish harm on our society and in particular on our children.

The noble Baroness, Lady Bennett, was right to pick up this tension between everything, everywhere, all at once and targeted surveillance. Again, that is really interesting to tease out. I am personally quite comfortable with quite intrusive targeted surveillance. I do not know whether noble Lords have been reading the Pegasus spyware stories: I am not comfortable with some Governments placing such spyware on the phones of human rights defenders but I would be much more relaxed about the British authorities placing something similar on the phones of people who are going to plant bombs in Manchester. We need to be really honest about where we are drawing our red lines if we want to go in the direction of targeted surveillance.

The noble Lord, Lord Moylan, was right again to remind us about the importance of private conversations. I cited the example of police officers whose conversations have been exposed. Although it is hard, we should remember that if ordinary citizens want to exchange horrible racist jokes with each other and so on in private groups that is not a matter for the state, but it is when it is somebody in a position of public authority; we have a right to intervene there. Again, we have to remember that as long as it is not illegal people can say horrible things in private, and we should not encourage any situation where we suggest that the state would interfere unless there are legitimate grounds—for example, it is a police officer or somebody is doing something that crosses the line of legality.

The noble Baroness, Lady Fox, reminded us that it is either encrypted or it is not. That is really helpful, as things cannot be half encrypted. If a service provider makes a commitment, it is critical that it is truthful. That is what our privacy law tells us. If I say, “This service is encrypted between you and the person you send the message to”, and I know that there is somebody in between who could access it, I am lying. I cannot say it is a private service unless it is truly private. We have to bear that in mind. Historically, people might have been more comfortable with fudging it, but not in 2023, when we have this raft of privacy legislation.

The noble Baroness is also right to remind us that privacy can be safety. There is almost nothing more devastating than the leaking of intimate images. When services such as iCloud move to encrypted storage that dramatically reduces the risk that somebody will get access to your intimate images if you store them there, which you are legally entitled to do. Privacy can be a critical part of an individual maintaining their own security and we should not lose that.

The noble Baroness, Lady Stowell, was right again to talk about general monitoring. I am pleased that she found the WhatsApp briefing useful. I was unable to attend but I know from previous contact that there are people doing good work and it is sad that that often does not come out. We end up with this very polarised debate, which my noble friend Lord McNally was right to remind us is unhelpful. The people south of the river are often working very closely in the public interest with people in tech companies. Public rhetoric tends to focus on why more is not being done; there are very few thanks for what is being done. I would like to see the debate move a little more in that direction.

The noble Lord, Lord Knight, opened up a whole new world of pain with VPNs, which I am sure we will come back to. I say simply that if we get the regulatory frameworks right, most people in Britain will continue to use mainstream services as long as they are allowed to be offered. If those services are regulated by the European Union under its Digital Services Act and pertain to the UK and the US in a similar way, they will in effect have global standards, so it will not matter where you VPN from. The scenario the noble Lord painted, which I worry about, is where those mainstream services are not available and we drive people into small, new services that are not regulated by anyone. We would then end up inadvertently driving people back to the wild west that we complain about, when most of them would prefer to use mainstream services that are properly regulated by Ofcom, the European Commission and the US authorities.

Online Safety Bill

So having different terms of service for different types of service is healthy, but I also think that Ofcom making sure that people do what they say they do is a reasonably healthy development, as long as we recognise and accept the consequences of that.
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, I am grateful for this short and focused debate, which has been helpful, and for the points made by the noble Lords, Lord Stevenson and Lord Allan, and the noble Baroness, Lady Kidron. I think we all share the same objective: ensuring that terms of service promote accountability and transparency, and empower users.

One of the Bill’s key objectives is to ensure that the terms of service of user-to-user platforms are suitable and effective. Under the Bill, companies will be required both to set out clearly how they will tackle illegal content and protect children and to ensure that their terms of service are properly enforced. The additional transparency and accountability duties on category 1 services will further ensure that users know what to expect on the largest platforms. This will put an end to these services arbitrarily removing content or, conversely, failing to remove content that they profess to prohibit.

The Bill will also ensure that search services are clear to their users about how they are complying with their adult and child safety duties under this new law. Given the very different way in which search services operate, however, this will be achieved through a publicly available statement rather than through terms of service. The two are intentionally distinct.

Noble Lords are right to point to the question of intelligibility. It struck me that, if it takes 10 days to read terms of service, perhaps we should have a race during the 10 days allotted to this Committee stage to see which is quicker—but I take the point. The noble Lord, Lord Allan, is also right that the further requirements imposed through this Bill will only add to that.

The noble Baroness, Lady Kidron, asked a fair question about what “accessibility” means. The Bill requires all platforms’ terms of service for illegal content and child safety duties to be clear and accessible. Ofcom will provide guidance on what that means, including ensuring that they are suitably prominent. The same applies to terms of service for category 1 services relating to content moderation.

I will focus first on Amendments 16, 21, 66DA, 75 and 197, which seek to ensure that both Ofcom and platforms consider the risks associated with platforms’ terms of service with regard to the illegal content and child safety duties in the Bill. We do not think that these amendments are needed. User-to-user services will already be required to assess the risks regarding their terms of service for illegal content. Clause 8 requires companies to assess the “design and operation” of a service in relation to illegal content. As terms of service are integral to how a service operates, they would be covered by this provision. Similarly, Clause 10 sets out that companies likely to be accessed by children will be required to assess the “design and operation” of a service as part of their child risk assessments, which would include the extent to which their terms of service may reduce or increase the risk of harm to children.

In addition to those risk assessment duties, the safety duties will require companies to take proportionate measures effectively to manage and mitigate the risk of harm to people whom they have identified through risk assessments. This will include making changes to their terms of service, if appropriate. The Bill does not impose duties on search services relating to terms of service, as search services’ terms of service play a less important role in determining how users can engage on a platform. I will explain this point further when responding to specific amendments relating to search services but I can assure the noble Lord, Lord Stevenson, that search services will have comprehensive duties to understand and mitigate how the design and operation of their service affects risk.

Amendment 197 would require Ofcom to assess how platforms’ terms of service affect the risk of harm to people that the sector presents. While I agree that this is an important risk factor which Ofcom must consider, it is already provided for in Clause 89, which requires Ofcom to undertake an assessment of risk across regulated services. That requires Ofcom to consider which characteristics of regulated services give rise to harm. Given how integral terms of service are to how many technology companies function, Ofcom will necessarily consider the risk associated with terms of service when undertaking that risk assessment.

However, elevating terms of service above other systems and processes, as mentioned in Clause 89, would imply that Ofcom needs to take account of the risk of harm on the regulated service, more than it needs to do so for other safety-by-design systems and processes or for content moderation processes, for instance. That may not be suitable, particularly as the service delivery methods will inevitably change over time. Instead, Clause 89 has been written to give Ofcom scope to organise its risk assessment, risk register and risk profiles as it thinks suitable. That is appropriate, given that it is best placed to develop detailed knowledge of the matters in question as they evolve over time.

Amendments 70, 71, 72, 79, 80, 81, 174 and 302 seek to replace the Bill’s references to publicly available statements, in relation to search services, with terms of service. This would mean that search services would have to publish how they are complying with their illegal content and child protection duties in terms of service rather than in publicly available statements. I appreciate the spirit in which the noble Lord has tabled and introduced these amendments. However, they do not consider the very different ways in which search services operate.

User-to-user services’ terms of service fulfil a very specific purpose. They govern a user’s behaviour on the service and set rules on what a user is allowed to post and how they can interact with others. If a user breaks these terms, a service can block his or her access or remove his or her content. Under the status quo, users have very few mechanisms by which to hold user-to-user platforms accountable to these terms, meaning that users can arbitrarily see their content removed with few or no avenues for redress. Equally, a user may choose to use a service because its terms and conditions lead them to believe that certain types of content are prohibited while in practice the company does not enforce the relevant terms.

The Bill’s duties relating to user-to-user services’ terms of service seek to redress this imbalance. They will ensure that people know what to expect on a platform and enable them to hold platforms accountable. In contrast, users of search services do not create content or interact with other users. Users can search for anything without restriction from the search service provider, although a search term may not always return results. It is therefore not necessary to provide detailed information on what a user can and cannot do on a search service. The existing duties on such services will ensure that search engines are clear to users about how they are complying with their safety duties. The Bill will require search services to set out how they are fulfilling them, in publicly available statements. Their actions must meet the standards set by Ofcom. Using these statements will ensure that search services are as transparent as user-to-user services about how they are complying with their safety duties.

The noble Lord’s Amendment 174 also seeks to expand the transparency reporting requirements to cover the scope and application of the terms of service set out by search service providers. This too is unnecessary because, via Schedule 8, the Bill already ensures transparency about the scope and application of the provisions that search services must make publicly available. I hope that gives the noble Lord some reassurance that the concerns he has raised are already covered. With that, I invite him to withdraw Amendment 16.

Lord Stevenson of Balmacara (Lab)

My Lords, I am very grateful to the Minister for that very detailed response, which I will have to read very carefully because it was quite complicated. That is the answer to my question. Terms of service will not be very easy to identify because to answer my questions he has had to pray in aid issues that Ofcom will necessarily have to assess—terms of services—to get at whether the companies are performing the duties that the Bill requires of them.

I will not go further on that. We know that there will be enough there to answer the main questions I had about this. I take the point about search being distinctively different in this area, although a tidy mind like mine likes to see all these things in one place and understand all the words. Every time I see “publicly available statement”, I do not know why but I think about people being hanged in public rather than a term of service or a contract.

--- Later in debate ---
Moved by
16A: Clause 8, page 7, line 23, after “19(2)” insert “and (8A)”
Member’s explanatory statement
This amendment inserts a signpost to the new duty in clause 19 about supplying records of risk assessments to OFCOM.
--- Later in debate ---
Moved by
16B: Clause 9, page 7, line 27, leave out “all”
Member’s explanatory statement
This is a technical amendment needed because the new duty to summarise illegal content risk assessments in the terms of service (see the amendment in the Minister’s name inserting new subsection (8A) below) is imposed only on providers of Category 1 services.
--- Later in debate ---
Lord Stevenson of Balmacara (Lab)

My Lords, we seem to have done it again—a very long list of amendments in a rather ill-conceived group has generated a very interesting discussion. We are getting quite good at this, exchanging views across the table, across the Committee, even within the Benches—Members who perhaps have not often talked together are sharing ideas and thoughts, and that is a wonderful feeling.

I want to start with an apology. I think I may be the person who got the noble Baroness, Lady Kidron, shopped by the former leader—once a leader, always a leader. What I thought I was being asked was whether the Committee would be interested in hearing the views of the noble Viscount who could not be present, and I was very keen, because when he does speak it is from a point of view that we do not often hear. I did not know that it was a transgression of the rules—but of course it is not, really, because we got round it. Nevertheless, I apologise for anything that might have upset the noble Baroness’s blood pressure—it did not stop her making a very good contribution later.

We have covered so much ground that I do not want to try and summarise it in one piece, because you cannot do that. The problem with the group as it stands is that the right reverend Prelate the Bishop of Derby and myself must have some secret connection, because we managed to put down almost the same amendments. They were on issues that then got overtaken by the Minister, who finally got round to—I mean, who put down a nice series of amendments which exactly covered the points we made, so we can lose all those. But this did not stop the right reverend Prelate the Bishop of Guildford making some very good additional points which I think we all benefited from.

I welcome back the noble Baroness, Lady Buscombe, after her illness; she gave us a glimpse of what is to come from her and her colleagues, but I will leave the particular issue that she raised for the Minister to respond to. It raises an issue that I am not competent on, but it is a very important one—we need to get the right balance between what is causing the alarm and difficulty outside in relation to what is happening on the internet, and I think we all agree with her that we should not put any barrier in the way of dealing with that.

Indeed, that was the theme of a number of the points that have been raised on the question of what is or can constitute illegal content, and how we judge it. It is useful to hear again from the master about how you do it in practice. I cannot imagine being in a room of French lawyers and experts and retaining my sanity, let alone making decisions that affect the ability of people to carry on, but the noble Lord did it; he is still here and lives to tell the tale—bearded or otherwise.

The later amendments, particularly from the noble Lord, Lord Clement-Jones, are taking us round in a circle towards the process by which Ofcom will exercise the powers that it is going to get in this area. These are probably worth another debate on their own, and maybe it will come up in a different form, because—I think the noble Baroness, Lady Stowell, made this point as well—there is a problem in having an independent regulator that is also the go-to function for getting advice on how others have to make decisions that are theirs to rule on at the end if they go wrong. That is a complicated way of saying that we may be overloading Ofcom if we also expect it to provide a reservoir of advice on how you deal with the issues that the Bill puts firmly on the companies—I agree that this is a problem that we do not really have an answer to.

My amendments were largely overtaken by the Government’s amendments, but the main one I want to talk about was Amendment 272. I am sorry that the noble Baroness, Lady Morgan, is not here, because her expertise is in an area that I want to talk about, which is fraud—cyber fraud in particular—and how that is going to be brought into the Bill. The issue, which I think has been raised by Which? and by a number of other people who have written to us, is that the Bill in Clauses 170 and 171 is trying to establish how a platform should identify illegal content in relation to fraud—but it is quite prescriptive. In particular, it goes into some detail which I will leave for the Minister to respond to, but uniquely it sets out a specific way for gathering information to determine whether content is illegal in this area, although it may have applicability in other areas.

One of the points that have to be taken into account is whether the platform is using human moderators, automated systems or a combination of the two. I am not quite sure why that is there in the Bill; that is really the basis for the tabling of our amendments. Clearly, one would hope that the end result is whether or not illegality has taken place, not how that information has been gathered. If one must make concessions to the process of law because a judgment is made that, because it is automated, it is in some way not as valid as if it had been done by a human moderator, there seems to be a whole world there that we should not be going into. I certainly hope that that is not going to be the case if we are talking about illegality concerning children or other vulnerable people, but that is how the Bill reads at present; I wonder whether the Minister can comment on that.

There is a risk of consumers being harmed here. The figures on fraud in the United Kingdom are extraordinary; the fact that it is not the top priority for everybody, let alone the Government, is extraordinary. It is something like the equivalent of consumers being scammed at the rate of around £7.5 billion per year. A number of awful types of scamming have emerged only because of the internet and social media. They create huge problems of anxiety and emotional distress, with lots of medical care and other things tied in if you want to work out the total bill. So we have a real problem here that we need to settle. It is great that it is in the Bill, but it would be a pity if the movement towards trying to resolve it is in any way infringed on by there being imperfect instructions in the Bill. I wonder whether the Minister would be prepared to respond to that; I would be happy to discuss it with him later, if that is possible.

As a whole, this is an interesting question as we move away from what a crime is towards how people judge how to deal with what they think is a crime but may not be. The noble Lord, Lord Allan, commented on how to do it in practice but one hopes that any initial problems will be overcome as we move forward and people become more experienced with this.

When the Joint Committee considered this issue, we spent a long time talking about why we were concerned about having certainty on the legal prescription in the Bill; that is why we were very much against the idea of “legal but harmful” because it seemed too subjective and too subject to difficulties. Out of that came another thought, which answers the point made by the noble Baroness, Lady Stowell: so much of this is about fine judgments on certain things that are there in stone and that you can work to but you then have to interpret them.

There is a role for Parliament here, I think; we will come on to this in later amendments but, if there is a debate to be had on this, let us not forget the points that have been made here today. If we are going to think again about Ofcom’s activity in practice, that is the sort of thing that either a Joint Committee or Select Committees of the two Houses could easily take on board as an issue that needs to be reflected on, with advice given to Parliament about how it might be taken forward. This might be the answer in the medium term.

In the short term, let us work to the Bill and make sure that it works. Let us learn from the experience but let us then take time out to reflect on it; that would be my recommendation but, obviously, that will be subject to the situation after we finish the Bill. I look forward to hearing the Minister’s response.

Lord Parkinson of Whitley Bay (Con)

My Lords, as well as throwing up some interesting questions of law, this debate has provoked some interesting tongue-twisters. The noble Lord, Lord Allan of Hallam, offered a prize to the first person to pronounce the Netzwerkdurchsetzungsgesetz; I shall claim my prize in our debate on a later group when inviting him to withdraw his amendment.

Lord Parkinson of Whitley Bay (Con)

Yes, that would be welcome.

Lord Allan of Hallam (LD)

Can I suggest one of mine?

Lord Parkinson of Whitley Bay (Con)

I thank the noble Lord.

I was pleased to hear about Wicipedia Cymraeg—there being no “k” in Welsh. As the noble Lord, Lord Stevenson, said, there has been a very good conversational discussion in this debate, as befits Committee and a self-regulating House. My noble friend Lady Stowell is right to point out matters of procedure, although we were grateful to know why the noble Viscount, Lord Colville, supports the amendments in question.

--- Later in debate ---
Lord Moylan (Con)

Or indeed any evidence.

Lord Parkinson of Whitley Bay (Con)

I take the noble Lord’s point and my noble friend’s further contribution. I will see whether I can give a clearer and more succinct description in writing to flesh that out, but that is the reason we have alighted on the words that we have.

The noble Lord, Lord Allan, also asked about jurisdiction. If an offence has been committed in the UK and viewed by a UK user, it can be treated as illegal content. That is set out in Clause 53(11), which says:

“For the purposes of determining whether content amounts to an offence, no account is to be taken of whether or not anything done in relation to the content takes place in any part of the United Kingdom”.


I hope that that bit, at least, is clearly set out to the noble Lord’s satisfaction. It looks like it may not be.

Lord Allan of Hallam (LD)

Again, I think that that is clear. I understood from the Bill that, if an American says something that would be illegal were they to be in the United Kingdom, we would still want to exclude that content. But that still leaves it open, and I just ask the question again, for confirmation. If all of the activities are outside the United Kingdom—Americans talking to each other, as it were—and a British person objects, at what point would the platform be required to restrict the content of the Americans talking to each other? Is it pre-emptively or only as and when somebody in the United Kingdom objects to it? We should flesh out that kind of practical detail before this becomes law.

Lord Parkinson of Whitley Bay (Con)

If it has been committed in the UK and is viewed by a UK user, it can be treated as illegal. I will follow up on the noble Lord’s further points ahead of the next stage.

Amendment 272 explicitly provides that relevant information that is reasonably available to a provider includes information submitted by users in complaints. Providers will already need to do this when making judgments about content, as it will be both relevant and reasonably available.

My noble friend Lord Moylan returned to the question that arose on day 2 in Committee, querying the distinction between “protect” and “prevent”, and suggesting that a duty to protect would or could lead to the excessive removal of content. To be clear, the duty requires platforms to put in place proportionate systems and processes designed to prevent users encountering content. I draw my noble friend’s attention to the focus on systems and processes in that. This requires platforms to design their services to achieve the outcome of preventing users encountering such content. That could include upstream design measures, as well as content identification measures, once content appears on a service. By contrast, a duty to protect is a less stringent duty and would undermine the proactive nature of the illegal content duties for priority offences.

Lord Moylan (Con)

Before he moves on, is my noble friend going to give any advice to, for example, Welsh Wikipedia, as to how it will be able to continue, or are the concerns about smaller sites simply being brushed aside, as my noble friend explicates what the Bill already says?

Lord Parkinson of Whitley Bay (Con)

I will deal with all the points in the speech. If I have not done so by the end, and if my noble friend wants to intervene again, I would be more than happy to hear further questions, either to answer now or write to him about.

Amendments 128 to 133 and 143 to 153, in the names of the right reverend Prelate the Bishop of Derby and the noble Lord, Lord Stevenson of Balmacara, seek to ensure that priority offences relating to modern slavery and human trafficking, where they victimise children, are included in Schedule 6. These amendments also seek to require technology companies to report content which relates to modern slavery and the trafficking of children—including the criminal exploitation of children—irrespective of whether it is sexual exploitation or not. As noble Lords know, the strongest provisions in the Bill relate to children’s safety, and particularly to child sexual exploitation and abuse content. These offences are captured in Schedule 6. The Bill includes a power for Ofcom to issue notices to companies requiring them to use accredited technology or to develop new technology to identify, remove and prevent users encountering such illegal content, whether communicated publicly or privately.

These amendments would give Ofcom the ability to issue such notices for modern slavery content which affects children, even when there is no child sexual exploitation or abuse involved. That would not be appropriate for a number of reasons. The power to tackle illegal content on private communications has been restricted to the identification of content relating to child sexual exploitation and abuse because of the particular risk to children posed by content which is communicated privately. Private spaces online are commonly used by networks of criminals to share illegal images—as we have heard—videos, and tips on the commission of these abhorrent offences. This is highly unlikely to be reported by other offenders, so it will go undetected if companies do not put in place measures to identify it. Earlier in Committee, the noble Lord, Lord Allan, suggested that those who receive it should report it, but of course, in a criminal context, a criminal recipient would not do that.

Extending this power to cover the identification of modern slavery in content which is communicated privately would be challenging to justify and could represent a disproportionate intrusion into someone’s privacy. Furthermore, modern slavery is usually identified through patterns of behaviour or by individual reporting, rather than through content alone. This reduces the impact that any proactive technology required under this power would have in tackling such content. Schedule 6 already sets out a comprehensive list of offences relating to child sexual exploitation and abuse which companies must tackle. If these offences are linked to modern slavery—for example, if a child victim of these offences has been trafficked—companies must take action. This includes reporting content which amounts to an offence under Schedule 6 to the National Crime Agency or another reporting body outside of the UK.

My noble friend Lord Moylan’s Amendment 135 seeks to remove the offence in Section 5 of the Public Order Act 1986 from the list of priority offences. His amendment would mean that platforms were not required to take proactive measures to reduce the risk of content which is threatening or abusive, and intended to cause a user harassment, alarm or distress, from appearing on their service. Instead, they would be obliged to respond only once they are made aware of the content, which would significantly reduce the impact of the Bill’s framework for tackling such threatening and abusive content. Given the severity of the harm which can be caused by that sort of content, it is right that companies tackle it. Ofcom will have to include the Public Order Act in its guidance about illegal content, as provided for in Clause 171.

Government Amendments 136A to 136C seek to strengthen the illegal content duties by adding further priority offences to Schedule 7. Amendments 136A and 136B will add human trafficking and illegal entry offences to the list of priority offences in the Bill. Crucially, this will mean that platforms will need to take proactive action against content which encourages or assists others to make dangerous, illegal crossings of the English Channel, as well as those who use social media to arrange or facilitate the travel of another person with a view to their exploitation.

The noble Lord, Lord Allan, asked whether these amendments would affect the victims of trafficking themselves. This is not about going after the victims. Amendment 136B addresses only content which seeks to help or encourage the commission of an existing immigration offence; it will have no impact on humanitarian communications. Indeed, to flesh out a bit more detail, Section 2 of the Modern Slavery Act makes it an offence to arrange or facilitate the travel of another person, including through recruitment, with a view to their exploitation. Facilitating a victim’s travel includes recruiting them. This offence largely appears online in the form of advertisements to recruit people into being exploited. Some of the steps that platforms could put in place include setting up trusted flagger programmes, signposting users to support and advice, and blocking known bad actors. Again, I point to some of the work which is already being done by social media companies to help tackle both illegal channel crossings and human trafficking.

--- Later in debate ---
Moved by
18A: Clause 9, page 8, line 23, at end insert—
“(8A) A duty to summarise in the terms of service the findings of the most recent illegal content risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to individuals).”
Member’s explanatory statement
This amendment requires providers of Category 1 services to summarise (in their terms of service) the findings of their latest risk assessment regarding illegal content and activity. The limitation to Category 1 services is achieved by an amendment in the name of the Minister to clause 6.
--- Later in debate ---
Lord Stevenson of Balmacara (Lab)

My Lords, over the last few hours I have praised us for having developed a style of discussion and debate that is certainly relatively new and not often seen in the House, where we have tried to reach out to each other and find common ground. That was not a problem in this last group of just over an hour; I think we are united around the themes that were so brilliantly introduced in a very concise and well-balanced speech by the noble Baroness, Lady Kidron, who has been a leading and inspirational force behind this activity for so long.

Although different voices have come in at different times and asked questions that still need to be answered, I sense that we have reached a point in our thinking, if not in our actual debates, where we need a plan. I too reached this point; that was exactly the motivation I had in tabling Amendment 1, which was discussed on the first day. Fine as the Bill is—it is a very impressive piece of work in every way—it lacks what we need as a Parliament to convince others that we have understood the issues and have the answers to their questions about what this Government, or this country as a whole, are going to do about this tsunami of difference, which has arrived in the wake of the social media companies and search engines, in the way we do our business and live our lives these days. There is consensus, but it is slightly different to the consensus we had in earlier debates, where we were reassuring ourselves about the issues we were talking about but were not reaching out to the Government to change anything so much as being happy that we were speaking the same language and that they were in the same place as we are gradually coming to as a group, in a way.

Just before we came back in after the lunch break, I happened to talk to the noble Lord, Lord Grade, who is the chair of Ofcom and is listening to most of our debates and discussions when his other duties allow. I asked him what he thought about it, and he said that it was fascinating for him to recognise the level of expertise and knowledge that was growing up in the House, and that it would be a useful resource for Ofcom in the future. He was very impressed by the way in which everyone was engaging and not getting stuck in the niceties of the legislation, which he admitted he was experiencing himself. I say that softly; I do not want to embarrass him in any way because he is an honourable man. However, the point he makes is really important.

I say to the Minister that I do not think we are very far apart on this. He knows that, because we have discussed it at some length over the last six to eight weeks. What I think he should take away from this debate is that this is a point where a decision has to be taken about whether the Government are going to go with the consensus view being expressed here and put deliberately into the Bill a repetitive statement, but one that is clear and unambiguous, about the intention behind the Government’s reason for bringing forward the Bill and for us, the Opposition and other Members of this House, supporting it, which is that we want a safe internet for our children. The way we are going to do that is by having in place, up front and clearly in one place, the things that matter when the regulatory structure sits in place and has to deal with the world as it is, of companies with business plans and business models that are at variance with what we think should be happening and that we know are destroying the lives of people we love and the future of our country—our children—in a way that is quite unacceptable when you analyse it down to its last detail.

It is not a question of saying back to us across the Dispatch Box—I know he wants to but I hope he will not—“Everything that you have said is in the Bill; we don’t need to go down this route, we don’t need another piece of writing that says it all”. I want him to forget that and say that actually it will be worth it, because we will have written something very special for the world to look at and admire. It is probably not in its perfect form yet, but that is what the Government can do: take a rough and ready potential diamond, polish it, chamfer it, and bring it back and set it in a diadem we would all be proud to wear—Coronations excepted—so that we can say, “Look, we have done the dirty work here. We’ve been right down to the bottom and thought about it. We’ve looked at stuff that we never thought in our lives we would ever want to see and survived”.

I shake at some of the material we were shown that Molly Russell was looking at. But I never want to be in a situation where I will have to say to my children and grandchildren, “We had the chance to get this right and we relied on a wonderful piece of work called the Online Safety Act 2023; you will find it in there, but it is going to take you several weeks and a lot of mental harm and difficulty to understand what it means”.

So, let us make it right. Let us not just say “It’ll be alright on the night”. Let us have it there. It is almost right but, as my noble friend Lord Knight said, it needs to be patched back into what is already in the Bill. Somebody needs to look at it and say, “What, out of that, will work as a statement to the world that we care about our kids in a way that will really make a difference?” I warn the Minister that, although I said at Second Reading that I wanted to see this Bill on the statute book as quickly as possible, I will not accept a situation where we do not have more on this issue.

Lord Parkinson of Whitley Bay (Con)

I am grateful to all noble Lords who have spoken on this group and for the clarity with which the noble Lord, Lord Stevenson, has concluded his remarks.

Amendments 20, 74, 93 and 123, tabled by the noble Baroness, Lady Kidron, would mean a significant revision of the Bill’s approach to content that is harmful to children. They would set a new schedule of harmful content and risk to children—the four Cs—on the face of the Bill and revise the criteria for user-to-user and search services carrying out child safety risk assessments.

I start by thanking the noble Baroness publicly—I have done so privately in our discussions—for her extensive engagement with the Government on these issues over recent weeks, along with my noble friends Lord Bethell and Lady Harding of Winscombe. I apologise that it has involved the noble Baroness, Lady Harding, missing her stop on the train. A previous discussion we had also very nearly delayed her mounting a horse, so I can tell your Lordships how she has devoted hours to this—as they all have over recent weeks. I would like to acknowledge their campaigning and the work of all organisations that the noble Baroness, Lady Kidron, listed at the start of her speech, as well as the families of people such as Olly Stephens and the many others that the right reverend Prelate the Bishop of Oxford mentioned.

I also reassure your Lordships that, in developing this legislation, the Government carried out extensive research and engagement with a wide range of interested parties. That included reviewing international best practice. We want this to be world-leading legislation, including the four Cs framework on the online risks of harm to children. The Government share the objectives that all noble Lords have echoed in making sure that children are protected from harm online. I was grateful to the noble Baroness, Lady Benjamin, for echoing the remarks I made earlier in Committee on this. I am glad we are on the same page, even if we are still looking at points of detail, as we should be.

As the noble Baroness, Lady Kidron, knows, it is the Government’s considered opinion that the Bill’s provisions already deliver these objectives. I know that she remains to be convinced, but I am grateful to her for our continuing discussions on that point, and for continuing to kick the tyres on this to make sure that this is indeed legislation of which we can be proud.

It is also clear that there is broad agreement across the House that the Bill should tackle harmful content to children such as content that promotes eating disorders, illegal behaviour such as grooming and risk factors for harm such as the method by which content is disseminated, and the frequency of alerts. I am pleased to be able to put on record that the Bill as drafted already does this in the Government’s opinion, and reflects the principles of the four Cs framework, covering each of those: content, conduct, contact and commercial or contract risks to children.

First, it is important to understand how the Bill defines content, because that question of definition has been a confusing factor in some of the discussions hitherto. When we talk in general terms about content, we mean the substance of a message. This has been the source of some confusion. The Bill defines “content”, for the purposes of this legislation, in Clause 207 extremely broadly as

“anything communicated by means of an internet service”.

Under this definition, in essence, all user communication and activity, including recommendations by an algorithm, interactions in the metaverse, live streams, and so on, is facilitated by “content”. So, for example, unwanted and inappropriate contact from an adult to a child would be treated by the Bill as content harm. The distinctions that the four Cs make between content, conduct and contact risks are therefore not necessary. For the purposes of the Bill, they are all content risks.

Secondly, I know that there have been concerns about whether the specific risks highlighted in the new schedule will be addressed by the Bill.

Baroness Harding of Winscombe (Con)

Where are the commercial harms? I cannot totally get my head around my noble friend’s definition of content. I can sort of understand how it extends to conduct and contact, but it does not sound as though it could extend to the algorithm itself that is driving the addictive behaviour that most of us are most worried about.

Lord Knight of Weymouth (Lab)

In that vein, will the noble Lord clarify whether that definition of content does not include paid-for content?

Lord Parkinson of Whitley Bay (Con)

I was about to list the four Cs briefly in order, which will bring me on to commercial or contract risk. Perhaps I may do that and return to those points.

I know that there have been concerns about whether the specific risks highlighted in the new schedule will be addressed by the Bill. In terms of the four Cs category of content risks, there are specific duties for providers to protect children from illegal content, such as content that intentionally assists suicide, as well as content that is harmful to children, such as pornography. Regarding conduct risks, the child safety duties cover harmful conduct or activity such as online bullying or abuse and, under the illegal content safety duties, offences relating to harassment, stalking and inciting violence.

With regard to commercial or contract risks, providers specifically have to assess the risks to children from the design and operation of their service, including their business model and governance under the illegal content and child safety duties. In relation to contact risks, as part of the child safety risk assessment, providers will need specifically to assess contact risks of functionalities that enable adults to search for and contact other users, including children, in a way that was set out by my noble friend Lord Bethell. This will protect children from harms such as harassment and abuse, and, under the illegal content safety duties, all forms of child sexual exploitation and abuse, including grooming.

Baroness Kidron (CB)

I agree that content, although unfathomable to the outside world, is defined as the Minister says. However, does that mean that when we see that

“primary priority content harmful to children”

will be put in regulations by the Secretary of State under Clause 54(2)—ditto Clause 54(3) and (4)—we will see those contact risks, conduct risks and commercial risks listed as primary priority, priority and non-designated harms?

I do not want to make my speech twice, but in my final sentence I said that my challenge to the Government is to have a very simple way forward by other means, if those things were articulated, but my understanding is that they are to bring forward content harms that describe only content as we normally believe it.

Lord Parkinson of Whitley Bay (Con)

I have tried to outline the Bill’s definition of content, which I think will give some reassurance that other concerns that noble Lords have raised are covered. I will turn in a moment to address priority and primary priority content, if the noble Baroness will allow me to do that, and then perhaps intervene again if I have not done so to her satisfaction. I want to set that out and try to keep track of all the questions which have been posed as I do so.

For now, I know there have been concerns from some noble Lords that if functionalities are not labelled as harm in the legislation they would not be addressed by providers, and I reassure your Lordships’ House that this is not the case. There is an important distinction between content and other risk factors such as, for instance, an algorithm, which without content cannot risk causing harm to a child. That is why functionalities are not covered by the categories of primary priority and priority content which is harmful to children. The Bill sets out a comprehensive risk assessment process which will cover content or activity that poses a risk of harm to children and other factors, such as functionality, which may increase the risk of harm. As such, the existing children’s risk assessment criteria already cover many of the changes proposed in this amendment. For example, the duties already require service providers to assess the risk of harm to children from their business model and governance. They also require providers to consider how a comprehensive range of functionalities affect risk, how the service is used and how the use of algorithms could increase the risks to children.

Turning to the examples of harmful content set out in the proposed new schedule, I am happy to reassure the noble Baroness and other noble Lords that the Government’s proposed list of primary priority and priority content covers a significant amount of this content. In her opening speech she asked about cumulative harm—that is, content sent many times or content which is harmful due to the manner of its dissemination. We will look at that in detail on the next group as well, but I will respond to the points she made earlier now. The definition of harm in the Bill under Clause 205 makes it clear that physical or psychological harm may arise from the fact or manner of dissemination of the content, not just the nature of the content—content which is not harmful per se, but which if sent to a child many times, for example by an algorithm, would meet the Bill’s threshold for content that is harmful to children. Companies will have to consider this as a fundamental part of their risk assessment, including, for example, how the dissemination of content via algorithmic recommendations may increase the risk of harm, and they will need to put in place proportionate and age-appropriate measures to manage and mitigate the risks they identify. I followed the exchanges between the noble Baronesses, Lady Kidron and Lady Fox, and I make it clear that the approach set out by the Bill will mean that companies cannot avoid tackling the kind of awful content which Molly Russell saw and the harmful algorithms which pushed that content relentlessly at her.

This point on cumulative harm was picked up by my noble friend Lord Bethell. The Bill will address cumulative risk where it is the result of a combination of high-risk functionality, such as live streaming, or rewards in service by way of payment or non-financial reward. This will initially be identified through Ofcom’s sector risk assessments, and Ofcom’s risk profiles and risk assessment guidance will reflect where a combination of risk in functionalities such as these can drive up the risk of harm to children. Service providers will have to take Ofcom’s risk profiles into account in their own risk assessments for content which is illegal or harmful to children. The actions that companies will be required to take under their risk assessment duties in the Bill and the safety measures they will be required to put in place to manage the service’s risk will consider this bigger-picture risk profile.

The amendments of the noble Baroness, Lady Kidron, would remove references to primary priority and priority harmful content to children from the child risk assessment duties, which we fear would undermine the effectiveness of the child safety duties as currently drafted. That includes the duty for user-to-user providers to prevent children encountering primary priority harms, such as pornography and content that promotes self-harm or suicide, as well as the duty to put in place age-appropriate measures to protect children from other harmful content and activity. As a result, we fear these amendments could remove the requirement for an age-appropriate approach to protecting children online and make the requirement to prevent children accessing primary priority content less clear.

The noble Baroness, Lady Kidron, asked in her opening remarks about emerging harms, which she was right to do. As noble Lords know, the Bill has been designed to respond as rapidly as possible to new and emerging harms. First, the primary priority and priority list of content can be updated by the Secretary of State. Secondly, it is important to remember the function of non-designated content that is harmful to children in the Bill—that is content that meets the threshold of harmful content to children but is not on the lists designated by the Government. Companies are required to understand and identify this kind of content and, crucially, report it to Ofcom. Thirdly, this will inform the actions of Ofcom itself in its review and report duties under Clause 56, where it is required to review the incidence of harmful content and the severity of harm experienced by children as a result of it. This is not limited to content that the Government have listed as being harmful, as it is intended to capture new and emerging harms. Ofcom will be required to report back to the Government with recommendations on changes to the primary priority and priority content lists.

I turn to the points that the noble Lord, Lord Knight of Weymouth, helpfully raised earlier about things that are in the amendments but not explicitly mentioned in the Bill. As he knows, the Bill has been designed to be tech-neutral, so that it is future-proof. That is why there is no explicit reference to the metaverse or virtual or augmented reality. However, the Bill will apply to service providers that enable users to share content online or interact with each other, as well as search services. That includes a broad range of services such as websites, applications, social media sites, video games and virtual reality spaces such as the metaverse; those are all captured. Any service that allows users to interact, as the metaverse does, will need to conduct a children’s access assessment and comply with the child safety duties if it is likely to be accessed by children.

Amendment 123 from the noble Baroness, Lady Kidron, seeks to amend Clause 48 to require Ofcom to create guidance for Part 3 service providers on this new schedule. For the reasons I have just set out, we do not think it would be workable to require Ofcom to produce guidance on this proposed schedule. For example, the duty requires Ofcom to provide guidance on the content, whereas the proposed schedule includes examples of risky functionality, such as the frequency and volume of recommendations.

I stress again that we are sympathetic to the aim of all these amendments. As I have set out, though, our analysis leads us to believe that the four Cs framework is simply not compatible with the existing architecture of the Bill. Fundamental concepts such as risk, harm and content would need to be reconsidered in the light of it, and that would inevitably have a knock-on effect for a large number of clauses and timing. The Bill has benefited from considerable scrutiny—pre-legislative and in many discussions over many years. The noble Baroness, Lady Kidron, has been a key part of that and of improving the Bill. The task is simply unfeasible at this stage in the progress of the Bill through Parliament and risks delaying it, as well as significantly slowing down Ofcom’s implementation of the child safety duties. We do not think that this slowing down is a risk worth taking, because we believe the Bill already achieves what is sought by these amendments.

Even so, I say to the Committee that we have listened to the noble Baroness, Lady Kidron, and others and have worked to identify changes which would further address these concerns. My noble friend Lady Harding posed a clear question: if not this, what would the Government do instead? I am pleased to say that, as a result of the discussions we have had, the Government have decided to make a significant change to the Bill. We will now place the categories of primary priority and priority content which is harmful to children on the face of the Bill, rather than leaving them to be designated in secondary legislation, so Parliament will have its say on them.

We hope that this change will reassure your Lordships that protecting children from the most harmful content is indeed the priority for the Bill. That change will be made on Report. We will continue to work closely with the noble Baroness, Lady Kidron, my noble friends and others, but I am not able to accept the amendments in the group before us today. With that, I hope that she will be willing to withdraw.

Baroness Kidron (CB)

I thank all the speakers. There were some magnificent speeches and I do not really want to pick out any particular ones, but I cannot help but say that the right reverend Prelate described the world without the four Cs. For me, that is what everybody in the Box and on the Front Bench should go and listen to.

I am grateful and pleased that the Minister has said that the Government are moving in this direction. I am very grateful for that but there are a couple of things that I have to come back on. First, I have swiftly read Amendment 205’s definition of harm and I do not think it says that you do not have to reach a barrier of harm; dissemination is quite enough. There is always the problem of what the end result of the harm is. The thing that the Government are not listening to is the relationship between the risk assessment and the harm. It is about making sure that we are clear that it is the functionality that can cause harm. I think we will come back to this at another point, but that is what I beg them to listen to. Secondly, I am not entirely sure that it is correct to say that the four Cs mean that you cannot have primary priority, priority and so on. That could be within the schedule of content, so those two things are not actually mutually exclusive. I would be very happy to have a think about that.

What was not addressed in the Minister’s answer was the point made by the noble Lord, Lord Allan of Hallam, in supporting the proposal that we should have in the schedule: “This is what you’ve got to do; this is what you’ve got to look at; this is what we’re expecting of you; and this is what Parliament has delivered”. That is immensely important, and I was so grateful to the noble Lord, Lord Stevenson, for putting his marker down on this set of amendments. I am absolutely committed to working alongside him and to finding ways around this, but we need to find a way of stating it.

Ironically, that is my answer to both the noble Baronesses, Lady Ritchie and Lady Fox: we should have our arguments here and now, in this Chamber. I do not wish to leave it to the Secretary of State, whom I have great regard for, as it happens, but who knows: I have seen a lot of Secretaries of State. I do not even want to leave it to the Minister, because I have seen a lot of Ministers too—ditto Ofcom, and definitely not the tech sector. So here is the place, and we are the people, to work out the edges of this thing.

Not for the first time, my friend, the noble Baroness, Lady Harding, read out what would have been my answer to the noble Baroness, Lady Ritchie. I have gone round and round, and it is like a Marx Brothers movie: in the end, harm is defined by subsection (4)(c), but that says that harm will be defined by the Secretary of State. It goes around like that through the Bill.

--- Later in debate ---
Moved by
21A: Clause 10, page 10, line 1, after “19(2)” insert “and (8A)”
Member’s explanatory statement
This amendment inserts a signpost to the new duty in clause 19 about supplying records of risk assessments to OFCOM.
--- Later in debate ---
Moved by
22A: Clause 11, page 10, line 6, at end insert “(as indicated by the headings).”
Member’s explanatory statement
This amendment provides clarification because the new duty to summarise children’s risk assessments in the terms of service (see the amendment in the Minister’s name inserting new subsection (10A) below) is imposed only on providers of Category 1 services.

Online Safety Bill

The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, first, I will address Amendments 12BA, 183A and 183B, tabled by the noble Baroness, Lady Ritchie of Downpatrick, with whom I was grateful to discuss them earlier today, and the noble Lord, Lord Morrow; I am grateful to his noble friend, the noble Lord, Lord Browne of Belmont, for speaking to them on his behalf.

These amendments seek to apply the duties in Part 5 of the Bill, which are focused on published pornographic content, to user-generated pornography. Amendments 183A and 183B are focused particularly on making sure that children are protected from user-to-user pornography in the same way as from published pornography, including through the use of age verification. I reassure the noble Baroness and the noble Lord that the Government share their concerns; there is clear evidence about the impact of pornography on young people and the need to protect children from it.

This is where I come to the questions posed earlier by the noble Lord, Lord McCrea of Magherafelt and Cookstown. The research we commissioned from the British Board of Film Classification assessed the functionality of and traffic to the UK’s top 200 most visited pornographic websites. The findings indicated that 128 of the top 200 most visited pornographic websites—that is just under two-thirds, or 64%—would have been captured by the proposed scope of the Bill at the time of the Government’s initial response to the online harms White Paper, and that represents 85% of the traffic to those 200 websites.

Since then, the Bill’s scope has been broadened to include search services and pornography publishers, meaning that children will be protected from pornography wherever it appears online. The Government expect companies to use age-verification technologies to prevent children accessing services which pose the highest risk to children, such as online pornography. Age-assurance technologies and other measures will be used to provide children with an age-appropriate experience on their service.

As noble Lords know, the Bill does not mandate that companies use specific approaches or technologies when keeping children safe online as it is important that the Bill is future-proofed: what is effective today might not be so effective in the future. Moreover, age verification may not always be the most appropriate or effective approach for user-to-user companies to comply with their duties under the Bill. For instance, if a user-to-user service, such as a social medium, does not allow pornography under its terms of service, measures such as strengthening content moderation and user reporting would be more appropriate and effective for protecting children than age verification. That would allow content to be better detected and removed, instead of restricting children from a service that is designed to be appropriate for their use—as my noble friend Lady Harding of Winscombe puts it, avoiding the situation where children are removed from these services altogether.

While I am sympathetic to the aims of these amendments, I assure noble Lords that the Bill already has robust, comprehensive protections in place to keep children safe from all pornographic content, wherever or however it appears online. This amendment is therefore unnecessary because it duplicates the existing provisions for user-to-user pornography in the child safety duties in Part 3.

It is important to be clear that, wherever they are regulated in the Bill, companies will need to ensure that children cannot access pornographic content online. This is made clear, for user-to-user content, in Clause 11(3); for search services, in Clause 25(3); and for published pornographic content in Clause 72(2). Moving the regulation of pornography from Part 3 to Part 5 would not be a workable or desirable option because the framework is effective only if it is designed to reflect the characteristics of the services in scope.

Part 3 has been designed to address the particular issues arising from the rapid growth in platforms that allow the sharing of user-generated content but are not the ones choosing to upload that content. The scale and speed of dissemination of user-generated content online demands a risk-based and proportionate approach, as Part 3 sets out.

It is also important that these companies understand the risks to children in the round, rather than focusing on one particular type of content. Risks to children will often be a consequence of the design of these services—for instance, through algorithms, which need to be tackled holistically.

I know that the noble Baroness is concerned about whether pornography will indeed be designated as primary priority content for the purposes of the child safety duties in Clauses 11(3) and 25(3). The Government fully intend this to be the case, which means that user-to-user services will need to have appropriate systems to prevent children accessing pornography, as defined in Clause 70(2).

The approach taken in Part 3 is very different from services captured under Part 5, which are publishing content directly, know exactly where it is located on their site and already face legal liability for the content. In this situation the service has full control over its content, so a risk-based approach is not appropriate. It is reasonable to expect that service to prevent children accessing pornography. We do not therefore consider it necessary or effective to apply the Part 5 duties to user-to-user pornographic content.

I also assure the noble Baroness and the noble Lord that, in a case where a provider of user-to-user services is directly publishing pornographic content on its own service, it will already be subject to the Part 5 duties in relation to that particular content. Those duties in relation to that published pornographic content will be separate from and in addition to their Part 3 duties in relation to user-generated pornographic content.

This means that, no matter where published pornographic content appears, the obligation to ensure that children are not normally able to encounter it will apply to all in-scope internet service providers that publish pornographic content. This is made clear in Clause 71(2) and is regardless of whether they also offer user-to-user or search services.

--- Later in debate ---
Lord Clement-Jones (LD)

I am sorry, but can the Minister just clarify that? Is he saying that it is not possible to be covered by both Part 3 and Part 5, so that where a Part 5 service has user-generated content it is also covered by Part 3? Can he clarify that you cannot just escape Part 5 by adding user-generated content?

Lord Parkinson of Whitley Bay (Con)

Yes, that is correct. I was trying to address the points raised by the noble Baroness, but the noble Lord is right. The point on whether people might try to be treated differently by allowing comments or reviews on their content is that they would be treated the same way. That is the motivation behind the noble Baroness’s amendment trying to narrow the definition. There is no risk that a publisher of pornographic content could evade their Part 5 duties by enabling comments or reviews on their content. That would be the case whether or not those reviews contained words, non-verbal indications that a user liked something, emojis or any other form of user-generated content.

That is because the Bill has been designed to confer duties on different types of content. Any service with provider pornographic content will need to comply with the Part 5 duties to ensure that children cannot normally encounter such content. If they add user-generated functionality—

Lord Bethell (Con)

I am sorry to come back to the same point, but let us take the Twitter example. As a publisher of pornography, does Twitter then inherit Part 5 responsibilities in as much as it is publishing pornography?

Lord Parkinson of Whitley Bay (Con)

It is covered in the Bill as Twitter. I am not quite sure what my noble friend is asking me. The harms that he is worried about are covered in different ways. Twitter or another social medium that hosts such content would be hosting it, not publishing it, so would be covered by Part 3 in that instance.

Lord Bethell (Con)

Maybe my noble friend the Minister could write to me to clarify that point, because it is quite a significant one.

Lord Parkinson of Whitley Bay (Con)

Perhaps I will speak to the noble Lord afterwards and make sure I have his question right before I do so.

I hope that answers the questions from the noble Baroness, Lady Ritchie, and that on that basis, she will be happy to withdraw her amendment.

Baroness Ritchie of Downpatrick (Lab)

My Lords, this has been a very wide-ranging debate, concentrating not only on the definition of pornography but on the views of noble Lords in relation to how it should be regulated, and whether it should be regulated, as the noble Baroness, Lady Kidron, the noble Lords, Lord Bethell and Lord Browne, and I myself believe, or whether it should be a graduated response, which seems to be the view of the noble Lords, Lord Allan and Lord Clement-Jones.

I believe that all pornography should be treated the same. There is no graduated response. It is something that is pernicious and leads to unintended consequences for many young people, so therefore it needs to be regulated in all its forms. I think that is the point that the noble Lord, Lord Bethell, was making. I believe that these amendments should have been debated along with those of the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, because then we could have an ever wider-ranging debate, and I look forward to that in the further groups in the days to come. The focus should be on the content, not on the platform, and the content is about pornography.

I agree with the noble Baroness, Lady Kidron, that porn is not the only harm, and I will be supporting her amendments. I believe that they should be in the Bill because if we are serious about dealing with these issues, they have to be in there.

I do not think my amendments are suggesting that children will be removed from social media. I agree that it is a choice to remove pornography or to age-gate. Twitter is moving to subscriber content anyway, so it can do it; the technology is already available to do that. I believe you just age-gate the porn content, not the whole site. I agree with the noble Lord, Lord Clement-Jones, as I said. These amendments should have been debated in conjunction with those of the noble Lord, Lord Bethell, and the noble Baroness, Lady Kidron, as I believe that the amendments in this group are complementary to those, and I think I already said that in my original submission.

I found the Minister’s response interesting. Obviously, I would like time to read Hansard. I think certain undertakings were given, but I want to see clearly spelled out where they are and to discuss with colleagues across the House where we take these issues and what we come back with on Report.

I believe that these issues will be debated further in Committee when the amendments from the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, are debated. I hope that in the intervening period the Minister will have time to reflect on the issues raised today about Parts 3 and 5 and the issue of pornography, and that he will be able to help us in further sessions in assuaging the concerns that we have raised about pornography. There is no doubt that these issues will come back. The only way that these issues can be dealt with, that pornography can be addressed and that all our children throughout the UK can be protected is through proper regulation.

I think we all need further reflection. I will see, along with colleagues, whether it is possible to come back on Report. In the meantime, I beg leave to withdraw the amendment.

--- Later in debate ---
Moved by
12C: Clause 6, page 5, line 23, at end insert “(2) to (10)”
Member’s explanatory statement
This amendment is consequential on the amendments in the Minister’s name to clause 11 below (because the new duty to summarise children’s risk assessments in the terms of service is only imposed on providers of Category 1 services).

Online Safety Bill

Finally, let me say this in anticipation of the Minister perhaps suggesting that, while this might be a good idea, we are far down the road with the Bill, Ofcom is ready to go and we want to get on with implementing it, so perhaps we should not do this now but in another piece of legislation. Personally, I am interested in having a conversation about the sequence of implementation. It might be that we can implement the regime that Ofcom is good to go on but with the powers there in the Bill for it to cover app stores and some other wider internet services, according to a road map that it sets out and that we in Parliament can scrutinise. However, my general message is, as the noble Baroness, Lady Kidron, said, that we should get this right in this legislation and grab the opportunity, particularly with app stores, to bring other internet services in—given that we consume so much through applications—and to provide a safer environment for our children.
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, I share noble Lords’ determination to deliver the strongest protections for children and to develop a robust and future-proofed regulatory regime. However, it will not be possible to solve every problem on the internet through this Bill, nor through any piece of legislation, flagship or otherwise. The Bill has been designed to confer duties on the services that pose the greatest risk of harm—user-to-user services and search services—and where there are proportionate measures that companies can take to protect their users.

As the noble Baroness, Lady Kidron, and others anticipated, I must say that these services act as a gateway for users to discover and access other online content through search results and links shared on social media. Conferring duties on these services will therefore significantly reduce the risk of users going on to access illegal or harmful content on non-regulated services, while keeping the scope of the Bill manageable and enforceable.

As noble Lords anticipated, there is also a practical consideration for Ofcom in all this. I know that many noble Lords are extremely keen to see this Bill implemented as swiftly as possible; so am I. However, as the noble Lord, Lord Allan, rightly pointed out, making major changes to the Bill’s scope at this stage would have significant implications for Ofcom’s implementation timelines. I say this at the outset because I want to make sure that noble Lords are aware of those implications as we look at these issues.

I turn first to Amendments 2, 3, 5, 92 and 193, tabled by the noble Baroness, Lady Kidron. These aim to expand the number of services covered by the Bill to incorporate a broader range of services accessed by children and a broader range of harms. I will cover the broader range of harms more fully in a separate debate when we come to Amendment 93, but I am very grateful to the noble Baroness for her constructive and detailed discussions on these issues over the past few weeks and months.

These amendments would bring new services into scope of the duties beyond user-to-user and search services. This could include services which enable or promote commercial harms, including consumer businesses such as online retailers. As I have just mentioned in relation to the previous amendments, bringing many more services into scope would delay the implementation of Ofcom’s priorities and risk detracting from its work overseeing existing regulated services where the greatest risk of harm exists—we are talking here about the services run by about 2.5 million businesses in the UK alone. I hope noble Lords will appreciate from the recent communications from Ofcom how challenging the implementation timelines already are, without adding further complication.

Amendment 92 seeks to change the child-user condition in the children’s access assessment to the test in the age-appropriate design code. The test in the Bill is already aligned with the test in that code, which determines whether a service is likely to be accessed by children, in order to ensure consistency for providers. The current child-user condition determines that a service is likely to be accessed by children where it has a significant number or proportion of child users, or where it is of a kind likely to attract a significant number or proportion of child users. This will already bring into scope services of the kind set out in this amendment, such as those which are designed or intended for use by children, or where children form a—

Baroness Kidron (CB)

I am sorry to interrupt. Will the Minister take the opportunity to say what “significant” means, because that is not aligned with the ICO code, which has different criteria?

Lord Parkinson of Whitley Bay (Con)

If I can finish my point, this will bring into scope services of the kind set out in the amendments, such as those designed or intended for use by children, or where children form a substantial and identifiable user group. The current condition also considers the nature and content of the service and whether it has a particular appeal for children. Ofcom will be required to consult the Information Commissioner’s Office on its guidance to providers on fulfilling this test, which will further support alignment between the Bill and the age-appropriate design code.

On the meaning of “significant”, a significant number of children means a significant number in itself or a significant proportion of the total number of UK-based users on the service. In the Bill, “significant” has its ordinary meaning, and there are many precedents for it in legislation. Ofcom will be required to produce and publish guidance for providers on how to make the children’s access assessment. Crucially, the test in the Bill provides more legal certainty and clarity for providers than the test outlined in the code. “Substantive” and “identifiable”, as suggested in this amendment, do not have such a clear legal meaning, so this amendment would give rise to the risk that the condition is more open to challenge from providers and more difficult to enforce. On the other hand, as I said, “significant” has an established precedent in legislation, making it easier for Ofcom, providers and the courts to interpret.

The noble Lord, Lord Knight, talked about the importance of future-proofing the Bill and emerging technologies. As he knows, the Bill has been designed to be technology neutral and future-proofed, to ensure that it keeps pace with emerging technologies. It will apply to companies which enable users to share content online or to interact with each other, as well as to search services. Search services using AI-powered features will be in scope of the search duties. The Bill is also clear that content generated by AI bots is in scope where it interacts with user-generated content, such as bots on Twitter. The metaverse is also in scope of the Bill. Any service which enables users to interact as the metaverse does will have to conduct a child access test and comply with the child safety duties if it is likely to be accessed by children.

Lord Knight of Weymouth (Lab)

I know it has been said that the large language models, such as that used by ChatGPT, will be in scope when they are embedded in search, but are they in scope generally?

Lord Parkinson of Whitley Bay (Con)

They are when they apply to companies enabling users to share content online and interact with each other or in terms of search. They apply in the context of the other duties set out in the Bill.

Amendments 19, 22, 298 and 299, tabled by my noble friend Lady Harding of Winscombe, seek to impose child safety duties on application stores. I am grateful to my noble friend and others for the collaborative approach that they have shown and for the time that they have dedicated to discussing this issue since Second Reading. I appreciate that she has tabled these amendments in the spirit of facilitating a conversation, which I am willing to continue to have as the Bill progresses.

As my noble friend knows from our discussions, there are challenges with bringing application stores—or “app stores” as they are popularly called—into the scope of the Bill. Introducing new duties on such stores at this stage risks slowing the implementation of the existing child safety duties, in the way that I have just outlined. App stores operate differently from user-to-user and search services; they pose different levels of risk and play a different role in users’ experiences online. Ofcom would therefore need to recruit different people, or bring in new expertise, to supervise effectively a substantially different regime. That would take time and resources away from its existing priorities.

We do not think that that would be a worthwhile new route for Ofcom, given that placing child safety duties on app stores is unlikely to deliver any additional protections for children using services that are already in the scope of the Bill. Those services must already comply with their duties to keep children safe or will face enforcement action if they do not. If companies do not comply, Ofcom can rely on its existing enforcement powers to require app stores to remove applications that are harmful to children. I am happy to continue to discuss this matter with my noble friend and the noble Lord, Lord Knight, in the context of the differing implementation timelines, as he has asked.

Lord Stevenson of Balmacara (Lab)

The Minister just said something that was material to this debate. He said that Ofcom has existing powers to prevent app stores from providing material that would have caused problems for the services to which they allow access. Can he confirm that?

Lord Parkinson of Whitley Bay (Con)

Perhaps the noble Lord could clarify his question; I was too busy finishing my answer to the noble Lord, Lord Knight.

Lord Stevenson of Balmacara (Lab)

It is a continuation of the point raised by the noble Baroness, Lady Harding, and it seems that it will go part of the way towards resolving the differences that remain between the Minister and the noble Baroness, which I hope can be bridged. Let me put it this way: is it the case that Ofcom either now has powers or will have powers, as a result of the Bill, to require app stores to stop supplying children with material that is deemed in breach of the law? That may be the basis for understanding how you can get through this. Is that right?

Lord Parkinson of Whitley Bay (Con)

Services already have to comply with their duties to keep children safe. If they do not comply, Ofcom has powers of enforcement set out, which require app stores to remove applications that are harmful to children. We think this already addresses the point, but I am happy to continue discussing it offline with the noble Lord, my noble friend and others who want to explore how. As I say, we think this is already covered. A more general duty here would risk distracting from Ofcom’s existing priorities.

Lord Allan of Hallam (LD)

My Lords, on that point, my reading of Clauses 131 to 135, where the Bill sets out the business disruption measures, is that they could be used precisely in that way. It would be helpful for the Minister responding later to clarify that Ofcom would use those business disruption measures, as the Government explicitly anticipate, were an app store, in a rogue way, to continue to list a service that Ofcom has said should not be made available to people in the United Kingdom.

Lord Parkinson of Whitley Bay (Con)

I will be very happy to set that out in more detail.

Amendments 33A and 217A in the name of the noble Lord, Lord Storey, would place a new duty on user-to-user services that predominantly enable online gaming. Specifically, they would require them to have a classification certificate stating the age group for which they are suitable. We do not think that is necessary, given that there is already widespread, voluntary uptake of approval classification systems in online gaming.

--- Later in debate ---
Baroness Merron (Lab)

My Lords, it has certainly been an interesting debate, and I am grateful to noble Lords on all sides of the Committee for their contributions and considerations. I particularly thank the noble Lords who tabled the amendments which have shaped the debate today.

In general, on these Benches, we believe that the Bill offers a proportionate approach to tackling online harms. We feel that granting some of the exemptions proposed in this group would be unintentionally counterproductive and would raise some unforeseen difficulties. The key here—and it has been raised by a number of noble Lords, including the noble Baronesses, Lady Harding and Lady Kidron, and, just now, the noble Lord, Lord Clement-Jones, who talked about the wider considerations of the Joint Committee and factors that should be taken into account—is that we endorse a risk-based approach. In this debate, it is very important that we take ourselves back to that, because that is the key.

My view is that using other factors, such as funding sources or volunteer engagement in moderation, cuts right across this risk-based approach. To refer to Amendment 4, it is absolutely the case that platforms with fewer than 1 million UK monthly users have scope to create considerable harm. Indeed, noble Lords will have seen that later amendments call for certain small platforms to be categorised on the basis of the risk—and that is the important word—that they engender, rather than the size of the platform, which, unfortunately, is something of a crude measure. The point that I want to make to the noble Baroness, Lady Fox, is that it is not about the size of the businesses and how they are categorised but what they actually do. The noble Baroness, Lady Kidron, rightly said that small is not safe, for all the reasons that were explained, including by the noble Baroness, Lady Harding.

Amendment 9 would exempt small and medium-sized enterprises and certain other organisations from most of the Bill’s provisions. I am in no doubt about the well-meaning nature of this amendment, tabled by the noble Lord, Lord Moylan, and supported by the noble Lord, Lord Vaizey. Indeed, there may well be an issue about how start-ups and entrepreneur unicorns cope with the regulatory framework. We should attend to that, and I am sure that the Minister will have something to say about it. But I also expect that the Minister will outline why this would actually be unhelpful in combating many of the issues that this Bill is fundamentally designed to deal with if we were to go down the road of these exclusions.

In particular, granting exemptions simply on the basis of a service’s size could lead to a situation where user numbers are capped or perhaps even where platforms are deliberately broken up to avoid regulation. This would have an effect that none of us in this Chamber would want to see because it would embed harmful content and behaviour rather than helping to reduce them.

Referring back to the comments of the noble Lord, Lord Moylan, I agree with the noble Lord, Lord Vaizey, in his reflection. I, too, have not experienced the two sides of the Chamber that the noble Lord, Lord Moylan, described. I feel that the Chamber has always been united on the matter of child safety and in understanding the ramifications for business. It is the case that good legislation must always seek a balance, but, to go back to the point about excluding small and medium-sized enterprises, to call them a major part of the British economy is a bit of an understatement when they account for 99.9% of the business population. In respect of the exclusion of community-based services, including Wikipedia—and we will return to this in the next group—there is nothing for platforms to fear if they have appropriate systems in place. Indeed, there are many gains to be had for community-based services such as Wikipedia from being inside the system. I look forward to the further debate that we will have on that.

I turn to Amendment 9A in the name of my noble friend Lord Knight of Weymouth, who is unable to participate in this section of the debate. It probes how the Bill’s measures would apply to specialised search services. Metasearch engines such as Skyscanner have expressed concern that the legislation might impose unnecessary burdens on services that pose little risk of hosting the illegal content targeted by the Bill. Perhaps the Minister, in his response, could confirm whether or not such search engines are in scope. That would perhaps be helpful to our deliberations today.

While we on these Benches are not generally supportive of exemptions, the reality is that there are a number of online search services that return content that would not ordinarily be considered harmful. Sites such as Skyscanner and Expedia, as we all know, allow people to search for and book flights and other travel services such as car hire. Obviously, as long as appropriate due diligence is carried out on partners and travel agents, the scope for users to encounter illegal or harmful material appears to be minimal and returns us to the point of having a risk-based approach. We are not necessarily advocating for a carve-out from the Bill, but it would perhaps be helpful to our deliberations if the Minister could outline how such platforms will be expected to interact with the Ofcom-run online safety regime.

Lord Parkinson of Whitley Bay (Con)

My Lords, I am sympathetic to arguments that we must avoid imposing disproportionate burdens on regulated services, but I cannot accept the amendments tabled by the noble Baroness, Lady Fox, and others. Doing so would greatly reduce the strong protections that the Bill offers to internet users, particularly to children. I agree with the noble Baroness, Lady Merron, that that has long been the shared focus across your Lordships’ House as we seek to strike the right balance through the Bill. I hope to reassure noble Lords about the justification for the existing balance and scope, and the safeguards built in to prevent undue burdens to business.

I will start with the amendments tabled by the noble Baroness, Lady Fox of Buckley—Amendments 4, 6 to 8, 12, 288 and 305—which would significantly narrow the definition of services in scope of regulation. The current scope of the Bill reflects evidence of where harm is manifested online. There is clear evidence that smaller services can pose a significant risk of harm from illegal content, as well as to children, as the noble Baroness, Lady Kidron, rightly echoed. Moreover, harmful content and activity often range across a number of services. While illegal content or activity may originate on larger platforms, offenders often seek to move to smaller platforms with less effective systems for tackling criminal activity in order to circumvent those protections. Exempting smaller services from regulation would likely accelerate that process, resulting in illegal content being displaced on to smaller services, putting users at risk.

These amendments would create significant new loopholes in regulation. Rather than relying on platforms and search services to identify and manage risk proactively, they would require Ofcom to monitor smaller harmful services, which would further annoy my noble friend Lord Moylan. Let me reassure the noble Baroness, however, that the Bill has been designed to avoid disproportionate or unnecessary burdens on smaller services. All duties on services are proportionate to the risk of harm and the capacity of companies. This means that small, low-risk services will have minimal duties imposed on them. Ofcom’s guidance and codes of practice will set out how they can comply with their duties, in a way that I hope is even clearer than the Explanatory Notes to the Bill, but certainly allowing for companies to have a conversation and ask for areas of clarification, if that is still needed. They will ensure that low-risk services do not have to undertake unnecessary measures if they do not pose a risk of harm to their users.

--- Later in debate ---
Lord Moylan (Con)

My Lords, while my noble friend is talking about the possibility of excessive and disproportionate burden on businesses, can I just ask him about the possibility of excessive and disproportionate burden on the regulator? He seems to be saying that Ofcom is going to have to maintain, and keep up to date regularly, 25,000 risk assessments—this is on the Government’s own assessment, produced 15 months ago, of the state of the market then—even if those assessments carried out by Ofcom result in very little consequence for the regulated entity.

We know from regulation in this country that regulators already cannot cope with the burdens placed on them. They become inefficient, sclerotic and unresponsive; they have difficulty in recruiting staff of the same level and skills as the entities that they regulate. We have a Financial Services and Markets Bill going through at the moment, and the FCA is a very good example of that. Do we really think that this is a sensible burden to place on a regulator that is actually able to discharge it?

Lord Parkinson of Whitley Bay (Con)

The Bill creates a substantial new role for Ofcom, but it has already substantially recruited and prepared for the effective carrying out of that new duty. I do not know whether my noble friend was in some of the briefings with officials from Ofcom, but it is very happy to set out directly the ways in which it is already discharging, or preparing to discharge, those duties. The Government have provided it with further resource to enable it to do so. It may be helpful for my noble friend to have some of those discussions directly with the regulator, but we are confident that it is ready to discharge its duties, as set out in the Bill.

I was about to say that we have already had a bit of discussion on Wikipedia. I am conscious that we are going to touch on it again in the debate on the next group of amendments so, at the risk of being marked down for repetition, which is a black mark on that platform, I shall not pre-empt what I will say shortly. But I emphasise that the Bill does not impose prescriptive, one-size-fits-all duties on services. The codes of practice from Ofcom will set out a range of measures that are appropriate for different types of services in scope. Companies can follow their own routes to compliance, so long as they are confident that they are effectively managing risks associated with legal content and, where relevant, harm to children. That will ensure that services that already use community moderation effectively can continue to do so—such as Wikipedia, which successfully uses that to moderate content. As I say, we will touch on that more in the debate on the next group.

Amendment 9, in the name of my noble friend Lord Moylan, is designed to exempt small and medium-sized enterprises working to benefit the public from the scope of the Bill. Again, I am sympathetic to the objective of ensuring that the Bill does not impose undue burdens on small businesses, and particularly that it should not inhibit services from providing valuable content of public benefit, but I do not think it would be feasible to exempt service providers deemed to be “working to benefit the public”.

I appreciate that this is a probing amendment, but the wording that my noble friend has alighted on highlights the difficulties of finding something suitably precise and not contestable. It would be challenging to identify which services should qualify for such an exemption.

Taking small services out of scope would significantly undermine the framework established by the Bill, as we know that many smaller services host illegal content and pose a threat to children. Again, let me reassure noble Lords that the Bill has been designed to avoid disproportionate or unnecessary regulatory burdens on small and low-risk services. It will not impose a disproportionate burden on services or impede users’ access to valuable content on smaller services.

Amendment 9A in the name of the noble Lord, Lord Knight of Weymouth, is designed to exempt “sector specific search services” from the scope of the Bill, as the noble Baroness, Lady Merron, explained. Again, I am sympathetic to the intention here of ensuring that the Bill does not impose a disproportionate burden on services, but this is another amendment that is not needed, as it would exempt search services that may pose a significant risk of harm to children or host illegal content. The amendment aims to exempt specialised search services—that is, those that allow users to

“search for … products or services … in a particular sector”.

It would exempt specialised search services that could cause harm to children or host illegal content—for example, pornographic search services or commercial search services that could facilitate online fraud. I know the noble Lord would not want to see that.

The regulatory duties apply only where there is a significant risk of harm and the scope has been designed to exclude low-risk search services. The duties therefore do not apply to search engines that search a single database or website, for example those of many retailers or other commercial websites. Even where a search service is in scope, the duties on services are proportionate to the risk of harm that they pose to users, as well as to a company’s size and capacity. Low-risk services, for example, will have minimal duties. Ofcom will ensure that these services can quickly and easily comply by publishing risk profiles for low-risk services, enabling them easily to understand their risk levels and, if necessary, take steps to mitigate them.

The noble Lord, Lord McCrea, asked some questions about the 200 most popular pornographic websites. If I may, I will respond to the questions he posed, along with others that I am sure will come in the debate on the fifth group, when we debate the amendments in the names of the noble Lord, Lord Morrow, and the noble Baroness, Lady Ritchie of Downpatrick, because that will take us on to the same territory.

I hope that provides some assurance to my noble friend Lord Moylan, the noble Baroness, Lady Fox, and others, and that they will be willing not to press their amendments in this group.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- View Speech - Hansard - - - Excerpts

My Lords, I thank people for such a wide-ranging and interesting set of contributions. I take comfort from the fact that so many people understood what the amendments were trying to do, even if they did not fully succeed in that. I thought it was quite interesting that in the first debate the noble Lord, Lord Allan of Hallam, said that he might be a bit isolated on the apps, but I actually agreed with him—which might not do his reputation any good. However, when he said that, I thought, “Welcome to my world”, so I am quite pleased that this has not all been shot down in flames before we started. My amendment really was a serious attempt to tackle something that is a real problem.

The Minister says that the Bill is designed to avoid disproportionate burdens on services. All I can say is, “Sack the designer”. It is absolutely going to have a disproportionate burden on a wide range of small services, which will not be able to cope, and that is why so many of them are worried about it. Some 80% of the companies that will be caught up in this red tape are small and micro-businesses. I will come to the small business point in a moment.

The noble Baroness, Lady Harding, warned us that small tech businesses become big tech businesses. As far as I am concerned, that is a success story—it is what I want; is it not what we all want? Personally, I think economic development and growth is a positive thing—I do not want them to fail. However, I do not think it will ever happen; I do not think that small tech businesses will ever grow into big tech businesses if they face a disproportionate burden in the regulatory sense, as I have tried to describe. That is what I am worried about, and it is not a positive thing to be celebrated.

I stress that it is not small tech and big tech. There are also community sites, based on collective moderation. Wikipedia has had a lot of discussion here. For a Bill that stresses that it wants to empower users, we should think about what it means when these user-moderated community sites are telling us that they will not be able to carry on and get through. That is what they are saying. It was interesting that the noble Lord, Lord Clement-Jones, said that he relies on Wikipedia—many of us do, although please do not believe what it says about me. There are all of these things, but then there was a feeling that, well, Reddit is a bit dodgy. The Bill is not meant to be deciding which ones to trust in quite that way, or people’s tastes.

I was struck that the noble Baroness, Lady Kidron, said that small is not safe, and used the incel example. I am not emphasising that small is safe; I am saying that the small entities will not survive this process. That is my fear. I do not mean that the big ones are nasty and dangerous and the small ones are cosy, lovely and Wikipedia-like. I am suggesting that smaller entities will not be able to survive the regulatory onslaught. That is the main reason I raised this.

The noble Baroness, Lady Merron, said that these entities can cause great harm. I am worried about a culture of fear, in which we demonise tens of thousands of innocent tech businesses and communities and end up destroying them when we do not intend to. I tried to put in the amendment an ability for Ofcom, if there are problematic sites that are risky, to deal with them. As the Minister kept saying, low-risk search engines have been exempted. I am suggesting that low-risk small and micro-businesses are exempted, which is the majority of them. That is what I am suggesting, rather than that we assume they are all guilty and then they have to get exempted.

Interestingly, the noble Lord, Lord McCrea, asked how many pornography sites are in scope and which pornographic websites have a million or fewer users. I am glad I do not know the answer to that, otherwise people might wonder why I did. The point is that there are always going to be sites that are threatening or a risk to children, as we are discussing. But we must always bear in mind—this was the important point that the noble Lord, Lord Moylan, made—that in our absolute determination to protect children via this Bill we do not unintentionally damage society as a whole. Adult access to free speech, for example, is one of my concerns, as are businesses and so on. We should not have that as an outcome.

--- Later in debate ---
Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- View Speech - Hansard - - - Excerpts

Like others, I had prepared quite extensive notes to respond to what I thought the noble Lord was going to say about his amendments in this group, and I have not been able to find anything left that I can use, so I am going to have to extemporise slightly. I think it is very helpful to have a little non-focused discussion about what we are about to talk about in terms of age, because there is a snare and a delusion in quite a lot of it. I was put in mind of that in the discussions on the Digital Economy Act, which of course precedes the Minister but is certainly still alive in our thinking: in fact, we were talking about it earlier today.

The problem I see is that we have to find a way of squaring two quite different approaches. One is to prevent those who should not be able to see material, because it is illegal for them to see it. The other is to find a way of ensuring that we do not end up with an age-gated internet, which I am grateful to find that we are all, I think, agreed about: that is very good to know.

Age is very tricky, as we have heard, and it is not the only consideration we have to bear in mind in wondering whether people should be able to gain access to areas of the internet which we know will be bad and difficult for them. That leads us, of course, to the question about legal but harmful, now resolved—or is it? We are going to have this debate about age assurance and what it is. What is age verification? How do they differ? How does it matter? Is 18 a fixed and final point at which we are going to say that childhood ends and adulthood begins, and therefore one is open for everything? It is exactly the point made earlier about how to care for those who should not be exposed to material which, although legal for them by a number called age, is not appropriate for them in any of the circumstances which, clinically, we might want to bring to bear.

I do not think we are going to resolve these issues today—I hope not. We are going to talk about them for ever, but at this stage I think we still need a bit of thinking outside a box which says that age is the answer to a lot of the problems we have. I do not think it is, but whether the Bill is going to carry that forward I have my doubts. How we get that to the next stage, I do not know, but I am looking forward to hearing the Minister’s comments on it.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

My Lords, I agree that this has been a rather unfortunate grouping and has led to a slightly strange debate. I apologise if it is the result of advice given to my noble friend. I know there has been some degrouping as well, which has led to slightly odd combinations today. However, as promised, I shall say a bit more about Wikipedia in relation to my noble friend’s Amendments 10 and 11.

The effect of these amendments would be that moderation actions carried out by users—in other words, community moderation of user-to-user and search services—would not be in scope of the Bill. The Government support the use of effective user or community moderation by services where this is appropriate for the service in question. As I said on the previous group, as demonstrated by services such as Wikipedia, this can be a valuable and effective means of moderating content and sharing information. That is why the Bill does not impose a one-size-fits-all requirement on services, but instead allows services to adopt their own approaches to compliance, so long as these are effective. The noble Lord, Lord Allan of Hallam, dwelt on this. I should be clear that duties will not be imposed on individual community moderators; the duties are on platforms to tackle illegal content and protect children. Platforms can achieve this through, among other things, centralised or community moderation. Ultimately, however, it is they who are responsible for ensuring compliance, and it is platforms, not community moderators, who will face enforcement action if they fail to do so.

--- Later in debate ---
Moved by
12A: Clause 6, page 5, line 11, at end insert “(2) to (8)”
Member’s explanatory statement
This amendment is consequential on the amendments in the Minister’s name to clause 9 below (because the new duty to summarise illegal content risk assessments in the terms of service is only imposed on providers of Category 1 services).
--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - -

My Lords, this group of government amendments relates to risk assessments; it may be helpful if I speak to them now as the final group before the dinner break.

Risk management is at the heart of the Bill’s regulatory framework. Ofcom and services’ risk assessments will form the foundation for protecting users from illegal content and content which is harmful to children. They will ensure that providers thoroughly identify the risks on their own websites, enabling them to manage and mitigate the potential harms arising from them. Ofcom will set out the risks across the sector and issue guidance to companies on how to conduct their assessments effectively. All providers will be required to carry out risk assessments, keep them up to date and update them before making a significant change to the design or operation of their service which could put their users at risk. Providers will then need to put in place measures to manage and mitigate the risks they identify in their risk assessments, including any emerging risks.

Given how crucial the risk assessments are to this framework, it is essential that we enable them to be properly scrutinised by the public. The government amendments in this group will place new duties on providers of the largest services—that is, category 1 and 2A services—to publish summaries of their illegal and child safety risk assessments. Through these amendments, providers of these services will also have a new duty to send full records of their risk assessments to Ofcom. This will increase transparency about the risk of harm on the largest platforms, clearly showing how risk is affected by factors such as the design, user base or functionality of their services. These amendments will further ensure that the risk assessments can be properly assessed by internet users, including by children and their parents and guardians, by ensuring that summaries of the assessments are publicly available. This will empower users to make informed decisions when choosing whether and how to use these services.

It is also important that Ofcom is fully appraised of the risks identified by service providers. That is why these amendments introduce duties for both category 1 and 2A services to send their records of these risk assessments, in full, to Ofcom. This will make it easier for Ofcom to supervise compliance with the risk assessment duties, as well as other duties linked to the findings of the risk assessments, rather than having to request the assessments from companies under its information-gathering powers.

These amendments also clarify that companies must keep a record of all aspects of their risk assessments, which strengthens the existing record-keeping duties on services. I hope that noble Lords will welcome these amendments. I beg to move.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

My Lords, it is risky to stand between people and their dinner, but I rise very briefly to welcome these amendments. We should celebrate the good stuff that happens in Committee as well as the challenging stuff. The risk assessments are, I think, the single most positive part of this legislation. Online platforms already do a lot of work trying to understand what risks are taking place on their platforms, which never sees the light of day except when it is leaked by a whistleblower and we then have a very imperfect debate around it.

The fact that platforms will have to do a formal risk assessment and share it with a third-party regulator is huge progress; it will create a very positive dynamic. The fact that the public will be able to see those risk assessments and make their own judgments about which services to use—according to how well they have done them—is, again, a massive public benefit. We should welcome the fact that risk assessments are there and the improvements that this group of amendments makes to them. I hope that was short enough.

--- Later in debate ---
Baroness Merron Portrait Baroness Merron (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, I am grateful to the Minister for introducing this group, and we certainly welcome this tranche of government amendments. We know that there are more to come both in Committee and as we proceed to Report, and we look forward to seeing them.

The amendments in this group, as other noble Lords have said, amount to a very sensible series of changes to services’ risk-assessment duties. This perhaps begs the question of why they were not included in earlier drafts of the Bill, but we are glad to see them now.

There is, of course, the issue of precisely where some of the information will appear, as well as the wider status of terms of service. I am sure those issues will be discussed in later debates. It is certainly welcome that the department is introducing stronger requirements around the information that must be made available to users; it will all help to make this a stronger and more practical Bill.

We all know that users need to be able to make informed decisions, and it will not be possible if they are required to view multiple statements and various documents. It seems that the requirements for information to be provided to Ofcom go to the very heart of the Bill, and I suggest that the proposed system will work best if there is trust and transparency between the regulator and those who are regulated. I am sure that there will be further debate on the scope of risk assessments, particularly on issues that were dropped from previous iterations of the Bill, and certainly this is a reasonable starting point today.

I will try to be as swift as possible as I raise a few key issues. One is about avoiding warnings that are at such a high level of generality that they get put on to everything. Perhaps the Minister could indicate how Ofcom will ensure that the summaries are useful and accessible to the reader. The test, of course, should be that a summary is suitable and sufficient for a prospective user to form an assessment of the likely risk they would encounter when using the service, taking into account any special vulnerabilities that they might have. That needs to be the test; perhaps the Minister could confirm that.

Is the terms of service section the correct place to put a summary of the illegal content risk assessment? Research suggests, unsurprisingly, that only 3% of people read terms before signing up—although I recall that, in an earlier debate, the Minister confessed that he had read all the terms and conditions of his mobile phone contract, so he may be one of the 3%. It is without doubt that any individual should be supported in their ability to make choices, and the duty should perhaps instead be to display a summary of the risks with due prominence, to ensure that anyone who is considering signing up to a service is really able to read it.

I also ask the Minister to confirm that, despite the changes to Clause 19 in Amendment 16B, the duty to keep records of risk assessments will continue to apply to all companies, but with an enhanced responsibility for category 1 companies.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - -

I am grateful to noble Lords for their questions on this, and particularly grateful to the noble Lord, Lord Allan, and the noble Baroness, Lady Kidron, for their chorus of welcome. Where we are able to make changes, we will of course bring them forward, and I am glad to be able to bring forward this tranche now.

As the noble Lord, Lord Allan, said, ensuring the transparency of services’ risk assessments will further ensure that the framework of the Bill delivers its core objectives relating to effective risk management and increased accountability regarding regulated services. As we have discussed, it is imperative that these providers take a thorough approach to identifying risks, including emerging risks. The Government believe that it is of the utmost importance that the public are able effectively to scrutinise the risk assessments of the largest in-scope services, so that users can be empowered to make informed decisions about whether and how to use their services.

On the questions from the noble Baroness, Lady Kidron, and the noble Lord, Lord Clement-Jones, about why it is just category 1 and category 2A services, we estimate that there will be around 25,000 UK service providers in scope of the Bill’s illegal and child safety duties. Requiring all these companies to publish full risk assessments and proactively to send them to Ofcom could undermine the Bill’s risk-based and proportionate approach, as we have discussed in previous groups on the burdens to business. A large number of these companies are likely to be low risk and it is unlikely that many people will seek out their risk assessments, so requiring all companies to publish them would be an excessive regulatory burden.

There would also be an expectation that Ofcom would proactively monitor a whole range of services, even ones that posed a minimal risk to users. That in turn could distract Ofcom from taking a risk-based approach in its regulation by overwhelming it with paperwork from thousands of low-risk services. If Ofcom wants to see records of the risk assessments of providers that are not category 1 or category 2A services, it has extensive information-gathering powers that it can use to require a provider to send it such records.

The noble Baroness, Lady Merron, was right to say that I read the terms of my broadband supply—I plead guilty to the nerdiness of doing that—but I have not read all the terms and conditions of every application and social medium I have downloaded, and I agree that many people do skim through them. They say the most commonly told lie on the planet at the moment is “I agree to the terms and conditions”, and the noble Baroness is right to point to the need for these to be intelligible, easily accessible and transparent—which of course we want to see.

In answer to her other question, the record-keeping duty will apply to all companies, but the requirement to publish is only for category 1 and category 2A companies.

The noble Baroness, Lady Kidron, asked me about Amendment 27A. If she will permit me, I will write to her with the best and fullest answer to that question.

I am grateful to noble Lords for their questions on this group of amendments.

Amendment 12A agreed.
Moved by
12B: Clause 6, page 5, line 16, at end insert “(2) to (6)”
Member’s explanatory statement
This amendment is consequential on the amendments in the Minister’s name to clause 19 below (because the new duty to supply records of risk assessments to OFCOM is only imposed on providers of Category 1 services).
Lord Parkinson of Whitley Bay Portrait The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)
- View Speech - Hansard - -

My Lords, let me start by saying how pleased I, too, am that we are now in Committee. I thank all noble Lords for giving up their time to attend the technical briefings that officials in my department and I have held since Second Reading and for the collaborative and constructive nature of their contributions in those discussions.

In particular, not least because today is his birthday, I pay tribute to the noble Lord, Lord Stevenson of Balmacara, for his tireless work on the Bill—from his involvement in its pre-legislative scrutiny to his recall to the Front Bench in order to see the job through. We are grateful for his diligence and, if I may say so, the constructive and collaborative way in which he has gone about it. He was right to pay tribute both to my noble friend Lord Gilbert of Panteg, who chaired the Joint Committee, and to the committee’s other members, including all the other signatories to this amendment. The Bill is a better one for their work, and I repeat my thanks to them for it. In that spirit, I am grateful to the noble Lord for bringing forward this philosophical opening amendment. As noble Lords have said, it is a helpful place for us to start and refocus our thoughts as we begin our line-by-line scrutiny of this Bill.

Although I agree with the noble Lord’s broad description of his amendment’s objectives, I am happy to respond to the challenge that lies behind it and put the objectives of this important legislation clearly on the record at the outset of our scrutiny. The Online Safety Bill seeks to bring about a significant change in online safety. The main purposes of the Bill are: to give the highest levels of protection to children; to protect users of all ages from being exposed to illegal content; to ensure that companies’ approach focuses on proactive risk management and safety by design; to protect people who face disproportionate harm online including, for instance, because of their sex or their ethnicity or because they are disabled; to maintain robust protections for freedom of expression and privacy; and to ensure that services are transparent and accountable.

The Bill will require companies to take stringent measures to tackle illegal content and protect children, with the highest protections in the Bill devoted to protecting children; as the noble Baroness, Lady Benjamin, my noble friend Lord Cormack and others have again reminded us today, that is paramount. Children’s safety is prioritised throughout this Bill. Not only will children be protected from illegal content through its illegal content duties but its child safety duties add an additional layer of protection so that children are protected from harmful or inappropriate content such as grooming, pornography and bullying. I look forward to contributions from the noble Baroness, Lady Kidron, and others who will, I know, make sure that our debates are properly focused on that.

Through their duties of care, all platforms will be required proactively to identify and manage risk factors associated with their services in order to ensure both that users do not encounter illegal content and that children are protected from harmful content. To achieve this, they will need to design their services to reduce the risk of harmful content or activity occurring and take swift action if it does.

Regulated services will need to prioritise responding to online content and activity that present the highest risk of harm to users, including where this is linked to something classified as a protected characteristic under the terms of the Equality Act 2010. This will ensure that platforms protect users who are disproportionately affected by online abuse—for example, women and girls. When undertaking child safety and illegal content risk assessments, providers must consider whether certain people face a greater risk of harm online and ensure that those risks are addressed and mitigated.

The Bill will place duties relating to freedom of expression and privacy on both Ofcom and all in-scope companies. Those companies will have to consider and implement safeguards for freedom of expression when fulfilling their duties. Ofcom will need to carry out its new duties in a way that protects freedom of expression. The largest services will also have specific duties to protect democratic and journalistic content.

Ensuring that services are transparent about the risks on their services and the actions they are taking to address them is integral to this Bill. User-to-user services must set out in their terms of service how they are complying with their illegal and child safety duties. Search services must do the same in public statements. In addition, government amendments that we tabled yesterday will require the biggest platforms to publish summaries of their illegal and their child safety risk assessments, increasing transparency and accountability, and Ofcom will have a power to require information from companies to assess their compliance with providers’ duties.

Finally, the Bill will also increase transparency and accountability relating to platforms with the greatest influence over public discourse. They will be required to ensure that their terms of service are clear and properly enforced. Users will be able to hold platforms accountable if they fail to enforce those terms.

The noble Baroness, Lady Kidron, asked me to say which of the proposed new paragraphs (a) to (g), to be inserted by Amendment 1, are not the objectives of this Bill. Paragraph (a) sets out that the Bill must ensure that services

“do not endanger public health or national security”.

The Bill will certainly have a positive impact on national security, and a core objective of the Bill is to ensure that platforms are not used to facilitate terrorism. Ofcom will issue a stand-alone code on terrorism, setting out how companies can reduce the risk of their services being used to facilitate terrorist offences, and remove such content swiftly if it appears. Companies will also need to tackle the new foreign interference offence as a priority offence. This will ensure that the Bill captures state-sponsored disinformation, which is of most concern—that is, attempts by foreign state actors to manipulate information to interfere in our society and undermine our democratic, political and legal processes.

The Bill will also have a positive impact on public health but I must respectfully say that that is not a primary objective of the legislation. In circumstances where there is a significant threat to public health, the Bill already provides powers for the Secretary of State both to require Ofcom to prioritise specified objectives when carrying out its media literacy activity and to require companies to report on the action they are taking to address the threat. Although the Bill may lead to additional improvements—I am sure that we all want to see them—for instance, by increasing transparency about platforms’ terms of service relating to public health issues, making this a primary objective on a par with the others mentioned in the noble Lord’s amendment risks making the Bill much broader and more unmanageable. It is also extremely challenging to prohibit such content, where it is viewed by adults, without inadvertently capturing useful health advice or legitimate debate and undermining the fundamental objective of protecting freedom of expression online—a point to which I am sure we will return.

The noble Lord’s amendment therefore reiterates many objectives that are interwoven throughout the legislation. I am happy to say again on the record that I agree with the general aims it proposes, but I must say that accepting it would be more difficult than the noble Lord and others who have spoken to it have set out. Accepting this amendment, or one like it, would create legal uncertainty. I have discussed with the officials sitting in the Box—the noble Baroness, Lady Chakrabarti, rightly paid tribute to them—the ways in which such a purposive statement, as the noble Lord suggests, could be made; we discussed it between Second Reading and now.

I appreciate the care and thought with which the noble Lord has gone about this—mindful of international good practice in legislation and through discussion with the Public Bill Office and others, to whom he rightly paid tribute—but any deviation from the substantive provisions of the Bill and the injection of new terminology risk creating uncertainty about the proper interpretation and application of those provisions. We have heard that again today; for example, the noble Baroness, Lady Fox, said that she was not clear what the meaning of certain words may be while my noble friend Lady Stowell made a plea for simplicity in legislation. The noble Lord, Lord Griffiths, also gave an eloquent exposition of the lexicographical befuddlement that can ensue when new words are added. All pointed to some confusion; indeed, there have been areas of disagreement even in what I am sure the noble Lord, Lord Stevenson, thinks was a very consensual summary of the purposes of the Bill.

That legal uncertainty could provide the basis for an increased number of judicial reviews or challenges to the decisions taken under the Bill and its framework, creating significant obstacles to the swift and effective implementation of the new regulatory framework, which I know is not something that he or other noble Lords would want. As noble Lords have noted, this is a complicated Bill, but adding further statements and new terminology to it, for however laudable a reason, risks adding to that complication, which can only benefit those with, as the noble Baroness, Lady Kidron, put it, the deepest pockets.

However, lest he think that I and the Government have not listened to his pleas or those of the Joint Committee, I highlight, as my noble friend Lady Stowell did, that the Joint Committee’s original recommendation was that these objectives

“should be for Ofcom”.

The Government took that up in Schedule 4 to the Bill, and in Clause 82(4), which set out objectives for the codes and for Ofcom respectively. At Clause 82(4) the noble Lord will see the reference to

“the risk of harm to citizens presented by content on regulated services”

and

“the need for a higher level of protection for children than for adults”.

I agree with the noble Baroness, Lady Chakrabarti, that it is not impossible to add purposive statements to Bills and nor is it unprecedented. I echo her tribute to the officials and lawyers in government who have worked on this Bill and given considerable thought to it. She has had the benefit of sharing their experience and the difficulties of writing tightly worded legislation. In different moments of her career, she has also had the benefit of picking at the loose threads in legislation and poking at the holes in it. That is the purpose of lawyers who question the thoroughness with which we have all done our work. I will not call them “pesky lawyers”, as she did—but I did hear her say it. I understand the point that she was making in anticipation but reassure her that she has not pre-empted the points that I was going to make.

To the layperson, legislation is difficult to understand, which is why we publish Explanatory Notes, on which the noble Baroness and others may have had experience of working before. I encourage noble Lords, not just today but as we go through our deliberations, to consult those as well. I hope that noble Lords will agree that they are more easily understood, but if they do not do what they say and provide explanation, I will be very willing to listen to their thoughts on it.

So, while I am not going to give the noble Lord, Lord Stevenson, the birthday present of accepting his amendment, I hope that the clear statement that I gave at the outset from this Dispatch Box, which is purposive as well, about the objectives of the Bill, and my outline of how it tries to achieve them, is a sufficient public statement of our intent, and that it achieves what I hope he was intending to get on the record today. I invite him to withdraw his amendment.

Lord Stevenson of Balmacara (Lab)

Well, my Lords, it has been a very good debate, and we should be grateful for that. In some senses, I should bank that; we have got ourselves off to a good start for the subsequent debates and discussions that we will have on the nearly 310 amendments that we must get through before the end of the process that we have set out on.

However, let us pause for a second. I very much appreciated the response, not least because it was very sharp and very focused on the amendment. It would have been tempting to go wider and wider, and I am sure that the Minister had that in mind at some point, but he has not done that. The first substantial point that he made seemed to be a one-pager about what this Bill is about. Suitably edited and brought down to manageable size, it would fit quite well into the Bill. I am therefore a bit puzzled as to why he cannot make the jump, intellectually or otherwise, from having that written for him and presumably working on it late at night with candles so that it was perfect—because it was pretty good; I will read it very carefully in Hansard, but it seemed to say everything that I wanted to say and covered most of the points that everybody else thought of to say, in a way that would provide clarity for those seeking it.

The issue we are left with was touched on by the noble Baroness, Lady Stowell, in her very perceptive remarks. Have we got this pointing in the right direction? We should think about it as a way for the Government to get out of this slightly ridiculous shorthand of the safest place to be online, to a statement to themselves about what they are trying to do, rather than an instruction to Ofcom—because that is where it gets difficult and causes problems with the later stages. This is really Parliament and government agreeing to say this, in print, rather than just through reading Hansard. That then reaches back to where my noble friend Lady Chakrabarti is, and it helps the noble Baroness, Lady Harding, with her very good point, that this will not work if people do not even bother to get through the first page.

Young Female Racing Drivers

Tuesday 18th April 2023


Lords Chamber
Lord Strathcarron

To ask His Majesty’s Government what steps they are taking to support young female racing drivers to ensure that they enjoy the same opportunities as their male counterparts.

The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, the Government are committed to supporting women’s sport at every opportunity, pushing for greater participation, employment, commercial opportunities and visibility in the media. We warmly welcome the creation of the F1 Academy in providing opportunities for young female drivers to progress to higher levels of competition in motorsport and we support its focus on uncovering the next generation of young female drivers.

Lord Strathcarron (Con)

I thank my noble friend the Minister for his reply. While the UK remains the £9 billion-a-year epicentre of world motorsport and of Formula 1 itself, with seven out of 10 Grand Prix teams based in the UK, three out of five of the fastest male drivers in the world British and the fastest female driver in the world British, last year the women’s W Series championship was curtailed through a lack of funding. Should the same fate befall the F1 Academy series, which my noble friend the Minister mentioned, would he use his legendary powers of persuasion to convince the motorsport hierarchy that investing in women’s motorsport is a good idea, not just in itself but to maintain the UK’s position as the leader in world motorsport?

Lord Parkinson of Whitley Bay (Con)

I certainly agree with my noble friend that it is definitely a worthwhile investment. As recent achievements in football, rugby and tennis have shown, women’s successes in sport not only bring delight to the viewing public but inspire women and girls to take part and to get more active. As a Formula 1 fan myself, I warmly welcome the creation of the F1 Academy and look forward to its first race in Austria later this month. I am also pleased by the news that its races will align next season with Formula 1 race weekends. It is run by Susie Wolff, who is an inspiring role model. At the British Grand Prix in 2014, she became the first woman to take part in a Formula 1 race weekend in 22 years. With a British team taking part, and with British drivers including Chloe Grant, Abbi Pulling and Jessica Edgar hoping to follow hot on the heels of the three-time W Series winner Jamie Chadwick, it is clear that there are many reasons for British fans to be especially excited.

Lord Hain (Lab)

My Lords, is not the problem that initiatives around women’s participation in motorsport begin far too late, when all the best racing drivers start in karting at six or seven years old? Likewise, Ministers need to start promoting more women engineers, beginning with schoolgirls. Could the Government be much more positive towards motorsport, in which, as the noble Lord, Lord Strathcarron, said, the UK is a world leader? As such, the sport is a great ambassador for British high-performance engineering and talent, including championing sustainable fuels which are carbon neutral.

Lord Parkinson of Whitley Bay (Con)

The noble Lord is right to point to the many ways that women can get involved in motorsports, not just as drivers but as team principals, nutritionists, psychologists, talent scouts and in many other roles. Lots of people have obviously been inspired by the recent Netflix series, “Drive to Survive”, which perhaps did not give enough screen time to all the women who take part. There is definitely a role for the sport itself, as well as for government and parliamentarians in exchanges such as this, to draw attention to that and to inspire people to get involved at every level.

Lord Addington (LD)

My Lords, can the Minister assure us that there will be a slightly more open and coherent attitude towards the full participation of women across non-traditional groups? At the moment, the Government seem to be following behind the sports themselves as opposed to leading. Will they tell us where that guidance will come from and who will be leading it?

Lord Parkinson of Whitley Bay (Con)

My department is in the process of finalising a new government sports strategy. Central to that is tackling the inequalities that exist in activity rates and making all sports more inclusive. We want to see people getting involved. I have pointed to recent successes of the Lionesses and the achievements of the Red Roses and the Great Britain team in tennis. Those British heroes are inspiring women and girls to get involved and we are keen to amplify their successes to inspire others.

Baroness Sugg (Con)

My Lords, is my noble friend the Minister aware of Extreme E, the off-road electric series that requires teams to put up one male and one female driver? They use the same equipment, they race the same track, and they both make an equal contribution to the team’s performance. Will the Minister join me in welcoming that as creating new opportunities for female motorsport drivers?

Lord Parkinson of Whitley Bay (Con)

I certainly do, and I know that Extreme E was important to Jamie Chadwick’s career progression before the W Series. I had the pleasure of taking part in the Lords versus Commons full-bore rifle match alongside my noble friend Lady Sugg, which is another sport in which men and women compete alongside each other on equal terms. In some settings, that is of course possible and to be encouraged.

Baroness Merron (Lab)

My Lords, as the Minister referred to, while the “Drive to Survive” series has been hugely successful, Females in Motorsport found that women spoke only for some six minutes and seven seconds of the six and a half hours of the series. They did that as fans or as workers providing food or applying make-up to drivers, which reflected that women are, to make an understatement, very much in the background of the industry. What discussions has the department had with key motorsport stakeholders about addressing the presence of women across the industry? Could the Department for Education perhaps be prevailed upon to do more to ensure that relevant apprenticeships and vocational courses are signposted to everyone, irrespective of their gender?

Lord Parkinson of Whitley Bay (Con)

I certainly agree with the noble Baroness: we want to hear more from the women who are involved at the highest levels in motorsport, inspiring women such as Susie Wolff, and to remind people of the trailblazing women who have paved the way, such as Lella Lombardi and Desiré Wilson—who has a grandstand named after her at Brands Hatch. Officials at the department have spoken to Formula 1 about the creation of the F1 Academy. As I say, we warmly welcome that as a way of inspiring more people, and are working on the cross-government sports strategy, which, of course, involves liaising with the Department for Education to make sure that in schools we are enabling people to get involved, try new sports and go as far as their talent and ambitions take them.

Lord Kirkhope of Harrogate (Con)

My Lords, I declare an interest as a driver of a fast car, but I suggest that the game is up as far as your Lordships are concerned. The truth of the matter is that statistic after statistic says clearly that women are better drivers than men. Indeed, four times as many reckless driving cases are brought to our courts in relation to men as in relation to women. Does my noble friend agree that the time is now ripe for us to return to the issue of insurance premiums and to stop women being discriminated against with regard to them, reflecting their better driving?

Lord Parkinson of Whitley Bay (Con)

My noble friend’s point is a matter for colleagues in the Department for Transport, but I shall certainly pass it on. I agree with him. Motor sports are ones in which women and men can compete on equal terms; they have done in the past and we would like to see more of that in future. We welcome initiatives to ensure that all women who want to get involved are able to do so.

Baroness Deech (CB)

My Lords, is the Minister confident that the category of women drivers will be confined to those who are born women?

Lord Parkinson of Whitley Bay (Con)

Transgender participation in sport has been looked at by the UK sports councils, which have produced well-researched and well-considered guidance. As the sports councils concluded in that guidance, balancing inclusion, safety and fairness at all times is not possible in every sport setting. When it comes to competitive sport, the Government believe that fairness has to be the primary consideration.

Baroness Bray of Coln (Con)

Can my noble friend assure me that the Government will do all that they can, in the light of the fact that women are increasingly successful in the world of racing, to encourage young girls to start practising their driving skills early on go-karts—or girl karts, as I am told they are now known?

Lord Parkinson of Whitley Bay (Con)

The age of Formula 1 drivers shows that this is a young sport, and the track record of those who have been successful in it shows that they start at a very young age. That is why we want to make sure that we break down all possible barriers to participation, one of which is visibility. It is why it is so important to have prominent competitions in which women and girls can participate and inspire others.

Lord Dobbs (Con)

Can I encourage my noble friend not to get too involved in trying to run Formula 1 but instead to concentrate on drivers in London—ordinary Londoners who want to drive their kids to school in the morning, who want to drive their teenage sons and daughters to sports fields in the evening and who perhaps want to drive their elderly parents to the doctor or a hospital—by knocking on the head the bonkers plan of the Mayor of London to penalise everybody who wants to drive on any street in London?

Lord Parkinson of Whitley Bay (Con)

My noble friend’s point will, I am sure, have been heard on the Benches opposite, and I am sure that they will pass on to the Mayor of London the strong views in this House and from drivers across the capital about his policies.

OFCOM (Duty regarding Prevention of Serious Self-harm and Suicide) Bill [HL]

The Parliamentary Under-Secretary of State, Department for Digital, Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, I am very grateful to the noble Baroness, Lady Finlay of Llandaff, for bringing forward her Bill, and to all noble Lords who have taken part in our debate, most particularly the noble Baroness, Lady Smith of Newnham, whose powerful, brave and personal words moved us all but also underlined the importance for so many families of the topic we are discussing today. The Government fully understand just how devastating these harms are, both to children and to adults, and the effect that those harms have on their families and friends, as well as the role that social media platforms and search engines can play in exacerbating them.

As the noble Baroness, Lady Finlay, outlined, her Bill was due to be read a second time the day after the death of Her late Majesty the Queen. That very sad reason for delay has meant that we are able to look at it alongside the Online Safety Bill, now before your Lordships’ House, which is helpful. I will endeavour to explain why the Government think that Bill deals with many of the issues raised, while keeping an open mind, as I said at its Second Reading on Wednesday, on suggestions for how it could do so more effectively.

I will first address the scope and intentions of the Online Safety Bill, particularly how it protects adults and children from horrific content such as this. As I outlined in our debate on Wednesday, the Online Safety Bill offers adult users a triple shield of protection, striking a balance between forcing platforms to be transparent about their actions and empowering adult users with tools to manage their experience online.

The first part of the shield requires all companies in scope of the Bill to tackle criminal activity online when it is flagged to them. They will have duties proactively to tackle priority illegal content and will need to prevent their services being used to facilitate the priority offences listed in the Bill, which include encouraging or assisting suicide.

The second part of the shield requires the largest user-to-user platforms, category 1 services under the Bill, to ensure that any terms of service they set are properly enforced. For instance, if a major social media platform says in its terms of service that it does not allow harmful suicide content, it must adhere to that. I will address this in greater detail in a moment, but Ofcom will have the power to hold platforms to their terms and conditions, which will help to create a safer, more transparent environment for all.

The third part of the shield requires category 1 services to provide adults with tools either to reduce the likelihood of encountering certain categories of content, if they so choose, or to alert them to the nature of that content. That includes content that encourages, promotes or provides instruction for suicide, self-harm or eating disorders. People will also have the ability to filter out content from unverified accounts, if they wish. That will give them the power to address the concern raised by my noble friend Lord Balfe about anonymous accounts. If anonymous accounts are pushing illegal content, the police already have powers through the Investigatory Powers Act to access communications data to bring the people behind that to book.

Through our triple shield, adult users will be empowered to make more informed choices about the services they use and have greater control over whom and what they engage with online.

As noble Lords know, child safety is a crucial component of the Online Safety Bill, and protecting children from harm remains our priority. As well as protecting children from illegal material, such as intentional encouragement of or assistance in suicide, all in-scope services likely to be accessed by children will be required to assess the risks to children on their service, and to provide safety measures to protect them from age-inappropriate and harmful content. This includes content promoting suicide, eating disorders and self-harm that does not meet a criminal threshold, as well as harmful behaviour such as cyberbullying.

Providers will also be required to consider, as part of their risk assessments, how functions such as algorithms could affect children’s exposure to illegal and other harmful content on their service. They must take steps to mitigate and manage any risks. Finally, providers may need to use age-assurance measures to identify the age of their users, to meet the child safety duties and to enforce age restrictions on their service.

A number of noble Lords talked about algorithms, so I will say a little more about that, repeating what I outlined on Wednesday. Under the Online Safety Bill, companies will need to take steps to mitigate the harm associated with their algorithms. That includes ensuring that algorithms do not promote illegal content, ensuring that predictive searches do not drive children towards harmful content and signposting children who search for harmful content towards resources and support.

Ofcom will also be given a range of powers to help it assess whether companies are fulfilling their duties in relation to algorithms. It will have powers to require information from companies about the operation of their algorithms, to interview employees, to require regulated service providers to undergo a skilled person’s report, and to require audits of companies’ systems and processes. It will also have the power to inspect premises and access data and equipment, so the Bill is indeed looking at the harmful effects of algorithms.

Moreover, I am pleased that many of the ambitions that lie behind the noble Baroness’s Bill will be achieved through a new communications offence that will capture the intentional encouragement and assistance of self-harm, as noble Lords have highlighted today. That new offence will apply to all victims, adults as well as children, and is an important step forward in tackling such abhorrent content. The Government are considering how that offence should be drafted. We are working with colleagues at the Ministry of Justice and taking into account views expressed by the Law Commission. As I said on Wednesday, our door remains open and I am keen to discuss this with noble Lords from all parties and none to ensure we get this right. We look forward to further conversations with noble Lords between now and Committee.

Finally, I want briefly to mention how in our view the aims of the noble Baroness’s Bill risk duplicating some of the work the Government are taking forward in these areas. The Bill proposes requiring Ofcom to establish a unit to advise the Secretary of State on the use of user-to-user platforms and search engines to encourage and assist serious self-harm and activities associated with the risk of suicide. The unit’s advice would focus on the extent of harmful content, the effectiveness of current regulation and potential changes in regulation to help prevent these harms. The noble Baroness is right to raise the issue, and I think her Bill is intended to complement the Online Safety Bill regime to ensure that it remains responsive to the way in which specific harms develop over time.

On Wednesday we heard from my noble friend Lord Sarfraz about some of the emerging threats, but I hope I have reassured the noble Baroness and other noble Lords that suicide and self-harm content will be robustly covered by the regime that the Online Safety Bill sets up. It is up to Ofcom to determine how best to employ its resources to combat these harms effectively and swiftly. For instance, under the Online Safety Bill, Ofcom is required to build and maintain an in-depth understanding of the risks posed by in-scope services, meaning that the regime the Bill brings forward will remain responsive to the ways in which harms manifest themselves both online and offline, such as in cases of cyberstalking or cyberbullying.

The Government believe that Ofcom as the regulator is best placed to hold providers accountable and to respond to any failings in adhering to their codes of practice. It has the expertise to regulate and enforce the Online Safety Bill’s provisions and to implement the findings of its own research. Its work as the regulator will also consider evidence from experts across the sector, such as Samaritans, which has rightly been named a number of times today and kindly wrote to me ahead of this debate and our debate on the Online Safety Bill. We therefore think that this work covers the same ground as the advisory function of the unit proposed in the noble Baroness’s Bill, and I hope this has reassured her that the area that she highlights through it is indeed being looked at in the Government’s Bill.

That is why the Government believe that the Online Safety Bill now before your Lordships’ House represents the strong action that we need to prevent the encouragement or assistance of self-harm, suicide and related acts online, and why we think it achieves the same objectives as the noble Baroness’s Bill. It is further strengthened, as I say, by the new stand-alone offence that we are bringing forward which addresses communications that intentionally encourage or assist self-harm, about which I am happy to speak to noble Lords.

I am glad we have had the opportunity today, between Second Reading and Committee of that Bill, to look at this issue in detail, and I know we will continue to do so, both inside and outside the Chamber. For the reasons I have given, though, we cannot support the noble Baroness’s Private Member’s Bill today.