Lord Knight of Weymouth (Lab)

I am grateful, as ever, to the noble Baroness, and I hope that has assisted the noble Lord, Lord Vaizey.

Finally—just about—I will speak to Amendment 32A, tabled in my name, about VPNs. I was grateful to the noble Baroness for her comments. In many ways, I wanted to give the Minister the opportunity to put something on the record. I understand, and he can confirm whether my understanding is correct, that the duties on the platforms to be safe apply regardless of whether a VPN has been used to access the systems and the content. One would suppose that the platforms, the publishers of content that are user-to-user businesses, will have to detect whether a VPN is being used in order to ensure that children are being protected and that a user is genuinely a child. Is that a correct interpretation of how the Bill works? If so, is it technically realistic for those platforms to detect whether someone is landing on their site via a VPN or otherwise? To my mind, the anecdote that the noble Baroness, Lady Harding, related, about Apple’s App Store algorithm pushing VPNs to those searching for porn, reinforces the need for app stores to be brought into scope, so that we can get some of that age filtering at that distribution point, rather than just relying on the platforms.

Substantially, this group is about platforms anticipating harms, rather than reviewing them and then fixing them despite their business model. If we can get the platforms themselves designing for children’s safety and then working out how to make their business models work, rather than the other way around, the internet will be a much better place for children.

The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, I join in the chorus of good wishes to the bungee-jumping birthday Baroness, Lady Kidron. I know she will not have thought twice about joining us today in Committee for scrutiny of the Bill, which is testament to her dedication to the cause of the Bill and, more broadly, to protecting children online. The noble Lord, Lord Clement-Jones, is right to note that we have already had a few birthdays along the way; I hope that we get only one birthday each before the Bill is finished.

Lord Clement-Jones (LD)

My birthday is in October, so I hope not.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

Very good—only one each, and hopefully fewer. I thank noble Lords for the points they raised in the debate on these amendments. I understand the concerns raised about how the design and operation of services can contribute to risk and harm online.

The noble Lord, Lord Russell, was right, when opening this debate, that companies are very successful indeed at devising and designing products and services that people want to use repeatedly, and I hope to reassure all noble Lords that the illegal and child safety duties in the Bill extend to how regulated services design and operate their services. Providers with services that are likely to be accessed by children will need to provide age-appropriate protections for children using their service. That includes protecting children from harmful content and activity on their service. It also includes reviewing children’s use of higher-risk features, such as live streaming or private messaging. Service providers are also specifically required to consider the design of functionalities, algorithms and other features when delivering the child safety duties imposed by the Bill.

I turn first to Amendments 23 and 76 in the name of the noble Lord, Lord Russell. These would require providers to eliminate the risk of harm to children identified in the service’s most recent children’s risk assessment, in addition to mitigating and managing those risks. The Bill will deliver robust and effective protections for children, but requiring providers to eliminate the risk of harm to children would place an unworkable duty on providers. As the noble Baroness, Lady Fox, my noble friend Lord Moylan and others have noted, it is not possible to eliminate all risk of harm to children online, just as it is not possible entirely to eliminate risk from, say, car travel, bungee jumping or playing sports. Such a duty could lead to service providers taking disproportionate measures to comply; for instance, as noble Lords raised, restricting children’s access to content that is entirely appropriate for them to see.

Lord Knight of Weymouth (Lab)

Does the Minister accept that that is not exactly what we were saying? We were not saying that they would have to eliminate all risk: they would have to design to eliminate risks, but we accept that other risks will apply.

Lord Parkinson of Whitley Bay (Con)

It is part of the philosophical ruminations that we have had, but the point here is that elimination is not possible through design, nor through any drafting of legislation. I will come on to say a bit more about how we seek to minimise, mitigate and manage risk, which is the focus.

Amendments 24, 31, 32, 77, 84, 85 and 295, from the noble Lord, Lord Russell, seek to ensure that providers do not focus just on content when fulfilling their duties to mitigate the impact of harm to children. The Bill already delivers on those objectives. As the noble Baroness, Lady Kidron, noted, it defines “content” very broadly in Clause 207 as

“anything communicated by means of an internet service”.

Under this definition, in essence, all communication and activity is facilitated by content.

Lord Clement-Jones (LD)

I hope that the Minister has in his brief a response to the noble Baroness’s point about Clause 11(14), which, I must admit, comes across extraordinarily in this context. She quoted it, saying:

“The duties set out … are to be taken to extend only to content that is harmful to children where the risk of harm is presented by the nature of the content (rather than the fact of its dissemination)”.

Is not that exception absolutely at the core of what we are talking about today? It is surely therefore very difficult for the Minister to say that this applies in a very broad way, rather than purely to content.

Lord Parkinson of Whitley Bay (Con)

I will come on to talk a bit about dissemination as well. If the noble Lord will allow me, he can intervene later on if I have not done that to his satisfaction.

I was about to talk about the child safety duties in Clause 11(5), which also specifies that they apply to the way that a service is designed, how it operates and how it is used, as well as to the content facilitated by it. The definition of content makes it clear that providers are responsible for mitigating harm in relation to all communications and activity on their service. Removing the reference to content would make service providers responsible for all risk of harm to children arising from the general operation of their service. That could, for instance, bring into scope external advertising campaigns, carried out by the service to promote its website, which could cause harm. This and other elements of a service’s operations are already regulated by other legislation.

Baroness Kidron (CB)

I apologise for interrupting. Is that the case, and could that not be dealt with by defining harm in the way that it is intended, rather than as harm from any source whatever? It feels like a big leap that, if you take out “content”, instead of it meaning the scope of the service in its functionality and content and all the things that we have talked about for the last hour and a half, the suggestion is that it is unworkable because harm suddenly means everything. I am not sure that that is the case. Even if it is, one could find a definition of harm that would make it not the case.

Lord Parkinson of Whitley Bay (Con)

Taking it out in the way that the amendment suggests throws up that risk. I am sure that it is not the intention of the noble Lord or the noble Baroness in putting it, but that is a risk of the drafting, which requires some further thought.

Clause 11(2), which is the focus of Amendments 32, 85 and 295, already means that platforms have to take robust action against content which is harmful because of the manner of its dissemination. However, it would not be feasible for providers to fulfil their duties in relation to content which is harmful only by the manner of its dissemination. Such content may not meet the definition of content which is harmful to children in isolation, but may become harmful when targeted at children in a particular way. One example could be content discussing a mental health condition such as depression, recommended repeatedly or in an amplified manner through the use of algorithms. The nature of that content may not be inherently harmful to every child who encounters it but, when aggregated, it may become harmful to a child who is sent it many times over. That, of course, must be addressed, and is covered by the Bill.

--- Later in debate ---
Lord Clement-Jones (LD)

Can the Minister assure us that he will take another look at this between Committee and Report? He has almost made the case for this wording to be taken out—he said that it is already covered by a whole number of different clauses in the Bill—but it is still here. There is still an exception which, if the Minister is correct, is highly misleading: it means that you have to go searching all over the Bill to find a way of attacking the algorithm, essentially, and the way that it amplifies, disseminates and so on. That is what we are trying to get to: how to address the very important issue not just of content but of the way that the algorithm operates in social media. This seems to be highly misleading, in the light of what the Minister said.

Lord Parkinson of Whitley Bay (Con)

I do not think so, but I will certainly look at it again, and I am very happy to speak to the noble Lord as I do. My point is that it would not be workable or proportionate for a provider to prevent or protect all children from encountering every single instance of the sort of content that I have just outlined, which would be the effect of these amendments. I will happily discuss that with the noble Lord and others between now and Report.

Amendment 27, by the noble Lord, Lord Stevenson, seeks to add a duty to prevent children encountering targeted paid-for advertising. As he knows, the Bill has been designed to tackle harm facilitated through user-generated content. Some advertising, including paid-for posts by influencers, will therefore fall under the scope of the Bill. Companies will need to ensure that systems for targeting such advertising content to children, such as the use of algorithms, protect them from harmful material. Fully addressing the challenges of paid-for advertising is a wider task than is possible through the Bill alone. The Bill is designed to reduce harm on services which host user-generated content, whereas online advertising poses a different set of problems, with different actors. The Government are taking forward work in this area through the online advertising programme, which will consider the full range of actors and sector-appropriate solutions to those problems.

Lord Stevenson of Balmacara (Lab)

I understand the Minister’s response, and I accept that there is a parallel stream of work that may well address this. However, we have been waiting for the report from the group that has been looking at that for some time. Rumours—which I never listen to—say that it has been ready for some time. Can the Minister give us a timescale?

Lord Parkinson of Whitley Bay (Con)

I cannot give a firm timescale today but I will seek what further information I can provide in writing. I have not seen it yet, but I know that the work continues.

Amendments 28 and 82, in the name of the noble Lord, Lord Russell, seek to remove the size and capacity of a service provider as a relevant factor when determining what is proportionate for services in meeting their child safety duties. This provision is important to ensure that the requirements in the child safety duties are appropriately tailored to the size of the provider. The Bill regulates a large number of service providers, which range from some of the biggest companies in the world to small voluntary organisations. This provision recognises that what it is proportionate to require of providers at either end of that scale will be different.

Removing this provision would risk setting a lowest common denominator. For instance, a large multinational company could argue that it is required only to take the same steps to comply as a smaller provider.

Amendment 32A from the noble Lord, Lord Knight of Weymouth, would require services to have regard to the potential use of virtual private networks and similar tools to circumvent age-restriction measures. He raised the use of VPNs earlier in this Committee when we considered privacy and encryption. As outlined then, service providers are already required to think about how safety measures could be circumvented and take steps to prevent that. This is set out clearly in the children’s risk assessment and safety duties. Under the duty at Clause 10(6)(f), all services must consider the different ways in which the service is used and the impact of such use on the level of risk. The use of VPNs is one factor that could affect risk levels. Service providers must ensure that they are effectively mitigating and managing risks that they identify, as set out in Clause 11(2). The noble Lord is correct in his interpretation of the Bill vis-à-vis VPNs.

Lord Knight of Weymouth (Lab)

Is this technically possible?

Lord Parkinson of Whitley Bay (Con)

Technical possibility is a matter for the sector—

Lord Knight of Weymouth (Lab)

I am grateful to the noble Lord for engaging in dialogue while I am in a sedentary position, but I had better stand up. It is relevant to this Committee whether it is technically possible for providers to fulfil the duties we are setting out for them in statute, in respect of people’s ability to use workarounds and evade the regulatory system. At some point, could he give us the department’s view on whether there are currently systems that could be used—we would not expect them to be prescribed—by platforms to fulfil the duties if people are using their services via a VPN?

Lord Parkinson of Whitley Bay (Con)

This is the trouble with looking at legislation that is technologically neutral and future-proofed and has to envisage risks and solutions changing in years to come. We want to impose duties that can technically be met, of course, but this is primarily a point for companies in the sector. We are happy to engage and provide further information, but it is inherently part of the challenge of identifying evolving risks.

The provision in Clause 11(16) addresses the noble Lord’s concerns about the use of VPNs to circumvent age-assurance or age-verification measures. For it to apply, providers would need to ensure that the measures they put in place are effective and that children cannot normally access their services. They would need to consider how the use of VPNs affects the efficacy of age-assurance and age-verification measures. If children were routinely using VPNs to access their service, providers would not be able to conclude that Clause 11(16) applies. I hope that sets out how this is covered in the Bill.

Amendments 65, 65ZA, 65AA, 89, 90, 90B, 96A, 106A, 106B, 107A, 114A, 122, 122ZA, 122ZB and 122ZC from the noble Lord, Lord Russell of Liverpool, seek to make the measures Ofcom sets out in codes of practice mandatory for all services. I should make it clear at the outset that companies must comply with the duties in the Bill. They are not optional and it is not a non-statutory regime; the duties are robust and binding. It is important that the binding legal duties on companies are decided by Parliament and set out in legislation, rather than delegated to a regulator.

Codes of practice provide clarity on how to comply with statutory duties, but should not supersede or replace them. This is true of codes in other areas, including the age-appropriate design code, which is not directly enforceable. Following up on the point from my noble friend Lady Harding of Winscombe, neither the age-appropriate design code nor the SEND code is directly enforceable. The Information Commissioner’s Office or bodies listed in the Children and Families Act must take the respective codes into account when considering whether a service has complied with its obligations as set out in law.

As with these codes, what will be directly enforceable in this Bill are the statutory duties by which all sites in scope of the legislation will need to abide. We have made it clear in the Bill that compliance with the codes will be taken as compliance with the duties. This will help small companies in particular. We must also recognise the diversity and innovative nature of this sector. Requiring compliance with prescriptive steps rather than outcomes may mean that companies do not use the most effective or efficient methods to protect children.

I reassure noble Lords that, if companies decide to take a different route to compliance, they will be required to document what their own measures are and how they amount to compliance. This will ensure that Ofcom has oversight of how companies comply with their duties. If the alternative steps that providers have taken are insufficient, they could face enforcement action. We expect Ofcom to take a particularly robust approach to companies which fail to protect their child users.

My noble friend Lord Vaizey touched on the age-appropriate design code in his remarks—

Baroness Harding of Winscombe (Con)

My noble friend the Minister did not address the concern I set out that the Bill’s approach will overburden Ofcom. If Ofcom has to review the suitability of each set of alternative measures, we will create an even bigger monster than we first thought.

Lord Parkinson of Whitley Bay (Con)

I do not think that it will. We have provided further resource for Ofcom to take on the work that this Bill will give it; it has been very happy to engage with noble Lords to talk through how it intends to go about that work and, I am sure, would be happy to follow up on that point with my noble friend to offer her some reassurance.

Responding to the point from my noble friend Lord Vaizey, the Bill is part of the UK’s overall digital regulatory landscape, which will deliver protections for children alongside the data protection requirements for children set out in the Information Commissioner’s age-appropriate design code. Ofcom has strong existing relationships with other bodies in the regulatory sphere, including through the Digital Regulation Co-operation Forum. The Information Commissioner has also been added to the Bill as a statutory consultee for Ofcom’s draft codes of practice and relevant pieces of guidance, formally providing for the ICO’s input in its areas of expertise, especially privacy.

Amendment 138 from the noble Lord, Lord Russell of Liverpool, would amend the criteria for non-designated content which is harmful to children to bring into scope content whose risk of harm derives from its potential financial impact. The Bill already requires platforms to take measures to protect all users, including children, from financial crime online. All companies in scope of the Bill will need to design and operate their services to reduce the risk of users encountering content amounting to a fraud offence, as set out in the list of priority offences in Schedule 7. This amendment would expand the scope of the Bill to include broader commercial harms. These are dealt with by a separate legal framework, including the Consumer Protection from Unfair Trading Regulations. This amendment therefore risks creating regulatory overlap, which would cause confusion for business while not providing additional protections to consumers and internet users.

Amendment 261 in the name of the right reverend Prelate the Bishop of Oxford seeks to modify the existing requirements for the Secretary of State’s review into the effectiveness of the regulatory framework. The purpose of the amendment is to ensure that all aspects of a regulated service are taken into account when considering the risk of harm to users and not just content.

As we have discussed already, the Bill defines “content” very broadly and companies must look at every aspect of how their service facilitates harm associated with the spread of content. Furthermore, the review clause makes explicit reference to the systems and processes which regulated services use, so the review can already cover harm associated with, for example, the design of services.

--- Later in debate ---
Lord Knight of Weymouth (Lab)

My Lords, we too support the spirit of these amendments very much and pay tribute to the noble Lord, Lord Russell, for tabling them.

In many ways, I do not need to say very much. I think the noble Baroness, Lady Kidron, made a really powerful case, alongside the way the group was introduced, about the importance of these things. We want the positivity that the noble Baroness, Lady Harding, talked about in respect of the potential and opportunity of technology for young people. We want them to have the right to freedom of expression, privacy and reliable information, and to be protected from exploitation by the media. Those happen to be direct quotes from the UN Convention on the Rights of the Child, describing some of the rights they would enjoy. Amendments 30 and 105, which the noble Lord, Lord Clement-Jones, tabled—I attached my name to Amendment 30—are very much in that spirit of trying to promote well-being and to say that there is something positive that we want to see here.

In particular, I would like to see that in respect of Ofcom. Amendment 187 is, in some ways, the more significant amendment and the one I most want the Minister to reflect on. It is the one that applies to Ofcom: that it should have reference to the UN Convention on the Rights of the Child. I think even the noble Lord, Lord Weir, could possibly agree. I understand his thoughtful comments on whether it is right to encumber business with adherence to the UN convention, but Ofcom is a public body in how it carries out its duties as a regulator. There are choices in regulation. Regulation can be just about minimum standards, but it can also be about promoting something better. What we are seeking, in trying to have reference to the UN convention, is for Ofcom to regulate for something more positive and better, as well as policing minimum standards. On that basis, we support the amendments.

Lord Parkinson of Whitley Bay (Con)

My Lords, I will start in the optimistic spirit of the debate we have just had. There are many benefits to young people from the internet: social, educational and many other ways that noble Lords have mentioned today. That is why the Government’s top priority for this legislation has always been to protect children and to ensure that they can enjoy those benefits by going online safely.

Once again, I find myself sympathetic to these amendments, but in a position of seeking to reassure your Lordships that the Bill already delivers on their objectives. Amendments 25, 78, 187 and 196 seek to add references to the United Nations Convention on the Rights of the Child and general comment 25 on children’s rights in relation to the digital environment to the duties on providers and Ofcom in the Bill.

As I have said many times before, children’s rights are at the heart of this legislation, even if the phrase itself is not mentioned in terms. The Bill already reflects the principles of the UN convention and the general comment. Clause 207, for instance, is clear that a “child” means a person under the age of 18, which is in line with the convention. All providers in scope of the Bill need to take robust steps to protect users, including children, from illegal content or activity on their services and to protect children from content which is harmful to them. They will need to ensure that children have a safe, age-appropriate experience on services designed for them.

Both Ofcom and service providers will also have duties in relation to users’ rights to freedom of expression and privacy. The safety objectives will require Ofcom to ensure that services protect children to a higher standard than adults, while also making sure that these services account for the different needs of children at different ages, among other things. Ofcom must also consult bodies with expertise in equality and human rights, including those representing the interests of children, such as the Children’s Commissioner. While the Government fully support the UN convention and its continued implementation in the UK, it would not be appropriate to place obligations on regulated services to uphold an international treaty between state parties. We agree with the reservations expressed by the noble Lord, Lord Weir of Ballyholme, in his speech, and by his noble friend Lady Foster.

The convention’s implementation is a matter for the Government, not for private businesses or voluntary organisations. Similarly, the general comment acts as guidance for state parties and it would not be appropriate to refer to that in relation to private entities. The general comment is not binding and it is for individual states to determine how to implement the convention. I hope that the noble Lord, Lord Russell, will feel reassured that children’s rights are baked into the Bill in more ways than a first glance may suggest, and that he will be content to withdraw his amendment.

The noble Lord, Lord Clement-Jones, in his Amendments 30 and 105, seeks to require platforms and Ofcom to consider a service’s benefits to children’s rights and well-being when considering what is proportionate to fulfil the child safety duties of the Bill. They also add children’s rights and well-being to the online safety objectives for user-to-user services. The Bill as drafted is focused on reducing the risk of harm to children precisely so that they can better enjoy the many benefits of being online. It already requires companies to take a risk-based and proportionate approach to delivering the child safety duties. Providers will need to address only content that poses a risk of harm to children, not that which is beneficial or neutral. The Bill does not require providers to exclude children or restrict access to content or services that may be beneficial for them.

Children’s rights and well-being are already a central feature of the existing safety objectives for user-to-user services in Schedule 4 to the Bill. These require Ofcom to ensure that services protect children to a higher standard than adults, while making sure that these services account for the different needs of children at different ages, among other things. On this basis, while I am sympathetic to the aims of the amendments the noble Lord has brought forward, I respectfully say that I do not think they are needed.

More pertinently, Amendment 30 could have unintended consequences. By introducing a broad balancing exercise between the harms and benefits that children may experience online, it would make it more difficult for Ofcom to follow up instances of non-compliance. For example, service providers could take less effective safety measures to protect children, arguing that, as their service is broadly beneficial to children’s well-being or rights, the extent to which they need to protect children from harm is reduced. This could mean that children are exposed to more harmful content, which would reduce the benefits of going online. I hope that this reassures the noble Lord, Lord Russell, of the work the Bill does in the areas he has highlighted, and explains why I cannot accept his amendments. I invite him to withdraw Amendment 25.

Lord Russell of Liverpool (CB)

My Lords, I thank all noble Lords for taking part in this discussion. I thank the noble Lord, Lord Weir, although I would say to him that his third point—that, in his experience, the UNCRC is open to different interpretations by different departments—is my experience of normal government. Name me something that has not been interpreted differently by different departments, as it suits them.

--- Later in debate ---
Moved by
27A: Clause 11, page 11, line 19, at end insert—
“(10A) A duty to summarise in the terms of service the findings of the most recent children’s risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to children).”

Member’s explanatory statement
This amendment requires providers of Category 1 services to summarise (in their terms of service) the findings of their latest children’s risk assessment. The limitation to Category 1 services is achieved by an amendment in the name of the Minister to clause 6.