Online Safety Bill Debate
Lord Knight of Weymouth (Labour - Life peer)
Lords Chamber

My Lords, we have already had some very significant birthdays during the course of the Bill, and I suspect that, over many more Committee days, there will be many more happy birthdays to celebrate.
This has been a fascinating debate and the Committee has thrown up some important questions. On the second day, we had a very useful discussion of risk which, as the noble Lord, Lord Russell, mentioned, was prompted by my noble friend Lord Allan. In many ways, we have returned to that theme this afternoon. The noble Baroness, Lady Fox, who I do not always agree with, asked a fair question. As the noble Baroness, Lady Kidron, said, it is important to know what harms we are trying to prevent—that is how we are trying to define risk in the Bill—so that is an absolutely fair question.
The Minister has shown flexibility. Sadly, I was not able to be here for the previous debate, and it is probably because I was not that he conceded the point and agreed to put children’s harms in the Bill. That takes us a long way further, and I hope he will demonstrate that kind of flexibility as we carry on through the Bill.
The noble Lord, Lord Moylan, and I have totally different views about what risk it is appropriate for children to face. I am afraid that I absolutely cannot share his view of the level of risk that is acceptable. I do not believe it is about eliminating risk—I do not see how you can—but the Bill should be about preventing online risk to children; it is the absolute core of the Bill.
As the noble Lord, Lord Russell, said, the Joint Committee heard evidence from Frances Haugen about the business model of the social media platforms. We listened to Ian Russell, the father of Molly, talk about the impact of an unguarded internet on his daughter. It is within the power of the social media companies to do something about that; this is not unreasonable.
I was very interested in what the noble Viscount, Lord Colville, said. He is right that this is about algorithms, which, in essence, are what we are trying to get to in all the amendments in this really important group. It is quite possible to tackle algorithms if we have a requirement in the Bill to do so, and that is why I support Amendment 261, which tries to address that.
However, a lot of the rest of the amendments are trying to do exactly the same thing. There is a focus not just on moderating harmful content but on the harmful systems that make digital services systematically unsafe for children. I listened with great interest to what the noble Lord, Lord Russell, said about the 5Rights research which he unpacked. We tend to think that media platforms such as Reddit are relatively harmless but that is clearly not the case. It is very interesting that the use of avatars is becoming quite common in the advertising industry to track where advertisements are ending up—sometimes, on pornography sites. It is really heartening that an organisation such as 5Rights has been doing the same and coming up with its conclusions. It is extremely useful for us as policymakers to see the kinds of risks that our children are facing.
We were reminded about the origins—way back, it now seems—of the Carnegie duty of care. In a sense, we are trying to make sure that that duty of care covers the systems. We have talked about the functionality and harms in terms of risk assessment, about the child safety duties and about the codes of practice. All those need to be included within this discussion and this framework today to make sure that that duty of care really sticks.
I am not going to go through all the amendments. I support all of them: ensuring that functionalities are covered for both types of regulated service, and the duty to consider all harms and not just harmful content. It is absolutely not just about the content but about making sure that regulated services have a duty to mitigate the impact of harm in general, not just harms stemming from content.
The noble Baroness, Lady Harding, made a terrific case, which I absolutely support, for making sure that the codes of practice are binding and principle based. At the end of the day, that could be the most important amendment in this group. I must admit that I was quite taken with her description of the Government’s response, which was internally contradictory. It was a very weak response to what I, as a member of the Joint Committee, thought was a very strong and clear recommendation about minimum standards.
This is a really important group of amendments and it would not be a difficult concession for the Government to make. They may wish to phrase things in a different way but we must get to the business models and the operation of the algorithms; otherwise, I do not believe this Bill is going to be effective.
I very much take on board what the noble Viscount said about looking to the future. We do not know very much about some of these new generative AI systems. We certainly do not know a great deal about how algorithms within social media companies operate. We will come, no doubt, to later amendments on the ability for researchers and others to find out more, but transparency was one of the things our Joint Committee was extremely keen on, and this is a start.
My Lords, I too agree that this has been a really useful and interesting debate. It has featured many birthday greetings to the noble Baroness, Lady Kidron, in which I obviously join. The noble Lord, Lord Moylan, bounced into the debate, tested the elasticity of the focus of the group, and bounced out again. Like the noble Lord, Lord Clement-Jones, I was particularly struck by the speech from the noble Baroness, Lady Harding, on the non-mandatory nature of the codes. Her points about reducing Ofcom’s workload, and about mandatory codes having precedent, were really significant and I look forward to the Minister’s response.
If I have understood it correctly, the codes will be generated by Ofcom, and the Secretary of State will then table them as statutory instruments—so they will be statutory, non-mandatory codes, but with statutory penalties. Trying to unravel that in my mind was a bit of a thing as I was sitting there. Undoubtedly, we are all looking forward to the Minister’s definition of harm, which he promised us at the previous meeting of the Committee.
I applaud the noble Lord, Lord Russell, for the excellent way in which he set out the issues in this grouping and—along with the Public Bill Office—for managing to table these important amendments. Due to the Bill’s complexity, it is an achievement to get the relatively simple issue of safety by design for children into amendments to Clause 10 on children’s risk assessment duties for user-to-user services; Clause 11 on the safety duties protecting children; and the reference to risk assessments in Clause 19 on record-keeping. There is a similar set of amendments applying to search; to the duties in Clause 36 on codes of practice duties; to Schedule 4 on the content of codes of practice; and to Clause 39 on the Secretary of State’s powers of direction. You can see how complicated the Bill is for those of us attempting to amend it.
What the noble Lord and his amendments try to do is simple enough. I listened carefully to the noble Baroness, Lady Fox, as always. The starting point is, when designing, to seek to eliminate harm. That is not to say that they will eliminate all potential harms to children, but the point of design is to seek to eliminate harms if you possibly can. It is important to be clear about that. Of course, it is not just the content but the systems that we have been talking about, and ensuring that the codes of practice that we are going to such lengths to legislate for are stuck to—that is the point made by the noble Baroness, Lady Harding—relieving Ofcom of the duty to assess all the alternative methods. We certainly support the noble Lord, Lord Russell, in his amendments. They reinforce that it is not just about the content; the algorithmic dissemination, in terms of volume and context, is really important, especially as algorithms are dynamic—they are constantly changing in response to the business models that underpin the user-to-user services that we are debating.
The business models want to motivate people to be engaged, in many ways regardless of safety. We have had discussion of the analogy of cars and planes from the noble Lord, Lord Allan. As I recall, in essence he said that in this space there are some things that you want to regulate like planes, to ensure that there are no accidents, and some where you trade off freedom and safety, as we do with the regulation of cars. In this case, it is a bit more like regulating for self-driving cars; in that context, you will design a lot more around trying to anticipate all the things that humans when driving will know instinctively, because they are more ethical individuals than you could ever programme an AI to be when driving a car. I offer that slight adjustment, and I hope that it helps the noble Lord, Lord Moylan, when he is thinking about trains, planes and automobiles.
In respect of the problem of the business models and their prioritising engagement over safety, I had contact this weekend and last week from friends much younger than I am, who are users of Snap. I am told that there is an AI chatbot on Snap, which I am sure is about engaging people for longer and collecting more data so that you can engage them even longer and, potentially, collect data to drive advertising. But you can pay to get rid of that chatbot, which is the business model moving somewhere else as and when we make it harder for it to make money as it is. Snap previously had location sharing, which you had to turn off. It created various harms and risks for children, in that their location was being shared with other people without them necessarily authorising it. We can all see how that could create issues.
Does the noble Lord have any reflections, talking about Snap, on how the internet has changed in our time? It was once really for adults, when it was on a PC and it was only adults who had access to it. There has, of course, been a huge explosion in child access to the internet because of the mobile phone—as we have heard, two-thirds of 10 year-olds now have a mobile phone—and an app such as Snap now has a completely different audience from the one it had five or 10 years ago. Does the noble Lord have any reflections on what the consequences of the explosion of children’s access to applications such as Snap have been for those thinking about the harms and protection of children?
I am grateful to the noble Lord. In many ways, I am reminded of the article I read in the New York Times this weekend and the interview with Geoffrey Hinton, the AI pioneer who has just left Google. He said that as companies improve their AI systems, they become increasingly dangerous. He said of AI technology:
“Look at how it was five years ago and how it is now. Take the difference and propagate it forwards. That’s scary”.
Yes, the huge success of the iPhone, of mobile phones and all of us, as parents, handing our more redundant iPhones on to our children, has meant that children have huge access. We have heard the stats in Committee around the numbers who are still in primary school and on social media, despite the terms and conditions of those platforms. That is precisely why we are here, trying to get things designed to be safe as far as is possible from the off, but recognising that it is dynamic and that we therefore need a regulator to keep an eye on the dynamic nature of these algorithms as they evolve, ensuring that they are safe by design as they are being engineered.
My noble friend Lord Stevenson has tabled Amendment 27, which looks at targeted advertising, especially that which requires data collection and profiling of children. In that, he has been grateful to Global Action Plan for its advice. While advertising is broadly out of scope of the Bill, apart from in respect of fraud, it is worth the Minister reflecting on the user experience for children. Whether it is paid or organic content, it is pertinent to their safety as children and something we should all be mindful of. I say to the noble Lord, Lord Vaizey, that as I understand it, the age-appropriate design code does a fair amount in respect of the data privacy of children, but this is much more about preventing children encountering the advertising in the first place, aside from the data protections that apply in the age-appropriate design code. But I see that the authority on these matters is about to correct me.
Just to add to what the noble Lord has said, it is worth noting that we had a debate, on Amendment 92, about aligning the age-appropriate design code’s “likely to be accessed” test with the Bill, and the very important issue that the noble Lord, Lord Vaizey, raised about alignment of these two regimes. I think we can say that these are kissing cousins, in that they take a by-design approach. The noble Lord is completely right that the scope of the Bill is much broader than data protection only, but they take the same approach.
I am grateful, as ever, to the noble Baroness, and I hope that has assisted the noble Lord, Lord Vaizey.
Finally—just about—I will speak to Amendment 32A, tabled in my name, about VPNs. I was grateful to the noble Baroness for her comments. In many ways, I wanted to give the Minister the opportunity to put something on the record. I understand, and he can confirm whether my understanding is correct, that the duty on the platforms to be safe applies regardless of whether a VPN has been used to access the systems and the content. The platforms, the publishers of content that are user-to-user businesses, will have to detect whether a VPN is being used, one would suppose, in order to ensure that children are being protected and that a user genuinely is a child. Is that a correct interpretation of how the Bill works? If so, is it technically realistic for those platforms to be able to detect whether someone is landing on their site via a VPN or otherwise? In my mind, the anecdote that the noble Baroness, Lady Harding, related, about what the App Store algorithm on Apple had done in pushing VPNs when people were looking for porn, reinforces the need for app stores to be brought in scope, so that we can get some of that age filtering at that distribution point, rather than just relying on the platforms.
Substantially, this group is about platforms anticipating harms, not reviewing them and then fixing them despite their business model. If we can get the platforms themselves designing for children’s safety and then working out how to make the business models work, rather than the other way around, we will have a much better place for children.
My Lords, I join in the chorus of good wishes to the bungee-jumping birthday Baroness, Lady Kidron. I know she will not have thought twice about joining us today in Committee for scrutiny of the Bill, which is testament to her dedication to the cause of the Bill and, more broadly, to protecting children online. The noble Lord, Lord Clement-Jones, is right to note that we have already had a few birthdays along the way; I hope that we get only one birthday each before the Bill is finished.
Very good—only one each, and hopefully fewer. I thank noble Lords for the points they raised in the debate on these amendments. I understand the concerns raised about how the design and operation of services can contribute to risk and harm online.
The noble Lord, Lord Russell, was right, when opening this debate, that companies are very successful indeed at devising and designing products and services that people want to use repeatedly, and I hope to reassure all noble Lords that the illegal and child safety duties in the Bill extend to how regulated services design and operate their services. Providers with services that are likely to be accessed by children will need to provide age-appropriate protections for children using their service. That includes protecting children from harmful content and activity on their service. It also includes reviewing children’s use of higher-risk features, such as live streaming or private messaging. Service providers are also specifically required to consider the design of functionalities, algorithms and other features when delivering the child safety duties imposed by the Bill.
I turn first to Amendments 23 and 76 in the name of the noble Lord, Lord Russell. These would require providers to eliminate the risk of harm to children identified in the service’s most recent children’s risk assessment, in addition to mitigating and managing those risks. The Bill will deliver robust and effective protections for children, but requiring providers to eliminate the risk of harm to children would place an unworkable duty on providers. As the noble Baroness, Lady Fox, my noble friend Lord Moylan and others have noted, it is not possible to eliminate all risk of harm to children online, just as it is not possible entirely to eliminate risk from, say, car travel, bungee jumping or playing sports. Such a duty could lead to service providers taking disproportionate measures to comply; for instance, as noble Lords raised, restricting children’s access to content that is entirely appropriate for them to see.
Does the Minister accept that that is not exactly what we were saying? We were not saying that they would have to eliminate all risk: they would have to design to eliminate risks, but we accept that other risks will apply.
It is part of the philosophical ruminations that we have had, but the point here is that elimination is not possible through design or through any drafting of legislation. I will come on to talk a bit more about how we seek to minimise, mitigate and manage risk, which is the focus.
Amendments 24, 31, 32, 77, 84, 85 and 295, from the noble Lord, Lord Russell, seek to ensure that providers do not focus just on content when fulfilling their duties to mitigate the impact of harm to children. The Bill already delivers on those objectives. As the noble Baroness, Lady Kidron, noted, it defines “content” very broadly in Clause 207 as
“anything communicated by means of an internet service”.
Under this definition, in essence, all communication and activity is facilitated by content.
I cannot give a firm timescale today but I will seek what further information I can provide in writing. I have not seen it yet, but I know that the work continues.
Amendments 28 and 82, in the name of the noble Lord, Lord Russell, seek to remove the size and capacity of a service provider as a relevant factor when determining what is proportionate for services in meeting their child safety duties. This provision is important to ensure that the requirements in the child safety duties are appropriately tailored to the size of the provider. The Bill regulates a large number of service providers, which range from some of the biggest companies in the world to small voluntary organisations. This provision recognises that what it is proportionate to require of providers at either end of that scale will be different.
Removing this provision would risk setting a lowest common denominator. For instance, a large multinational company could argue that it is required only to take the same steps to comply as a smaller provider.
Amendment 32A from the noble Lord, Lord Knight of Weymouth, would require services to have regard to the potential use of virtual private networks and similar tools to circumvent age-restriction measures. He raised the use of VPNs earlier in this Committee when we considered privacy and encryption. As outlined then, service providers are already required to think about how safety measures could be circumvented and take steps to prevent that. This is set out clearly in the children’s risk assessment and safety duties. Under the duty at Clause 10(6)(f), all services must consider the different ways in which the service is used and the impact of such use on the level of risk. The use of VPNs is one factor that could affect risk levels. Service providers must ensure that they are effectively mitigating and managing risks that they identify, as set out in Clause 11(2). The noble Lord is correct in his interpretation of the Bill vis-à-vis VPNs.
Technical possibility is a matter for the sector—
I am grateful to the noble Lord for engaging in dialogue while I am in a sedentary position, but I had better stand up. It is relevant to this Committee whether it is technically possible for providers to fulfil the duties we are setting out for them in statute in respect of people’s ability to use workarounds and evade the regulatory system. At some point, could he give us the department’s view on whether there are currently systems that could be used —we would not expect them to be prescribed—by platforms to fulfil the duties if people are using their services via a VPN?
This is the trouble with looking at legislation that is technologically neutral and future-proofed and has to envisage risks and solutions changing in years to come. We want to impose duties that can technically be met, of course, but this is primarily a point for companies in the sector. We are happy to engage and provide further information, but it is inherently part of the challenge of identifying evolving risks.
The provision in Clause 11(16) addresses the noble Lord’s concerns about the use of VPNs in circumventing age-assurance or age-verification measures. For it to apply, providers would need to ensure that the measures they put in place are effective and that children cannot normally access their services. They would need to consider things such as how the use of VPNs affects the efficacy of age-assurance and age-verification measures. If children were routinely using VPNs to access their service, they would not be able to conclude that Clause 11(16) applies. I hope that sets out how this is covered in the Bill.
Amendments 65, 65ZA, 65AA, 89, 90, 90B, 96A, 106A, 106B, 107A, 114A, 122, 122ZA, 122ZB and 122ZC from the noble Lord, Lord Russell of Liverpool, seek to make the measures Ofcom sets out in codes of practice mandatory for all services. I should make it clear at the outset that companies must comply with the duties in the Bill. They are not optional and it is not a non-statutory regime; the duties are robust and binding. It is important that the binding legal duties on companies are decided by Parliament and set out in legislation, rather than delegated to a regulator.
Codes of practice provide clarity on how to comply with statutory duties, but should not supersede or replace them. This is true of codes in other areas, including the age-appropriate design code, which is not directly enforceable. Following up on the point from my noble friend Lady Harding of Winscombe, neither the age-appropriate design code nor the SEND code is directly enforceable. The Information Commissioner’s Office or bodies listed in the Children and Families Act must take the respective codes into account when considering whether a service has complied with its obligations as set out in law.
As with these codes, what will be directly enforceable in this Bill are the statutory duties by which all sites in scope of the legislation will need to abide. We have made it clear in the Bill that compliance with the codes will be taken as compliance with the duties. This will help small companies in particular. We must also recognise the diversity and innovative nature of this sector. Requiring compliance with prescriptive steps rather than outcomes may mean that companies do not use the most effective or efficient methods to protect children.
I reassure noble Lords that, if companies decide to take a different route to compliance, they will be required to document what their own measures are and how they amount to compliance. This will ensure that Ofcom has oversight of how companies comply with their duties. If the alternative steps that providers have taken are insufficient, they could face enforcement action. We expect Ofcom to take a particularly robust approach to companies which fail to protect their child users.
My noble friend Lord Vaizey touched on the age-appropriate design code in his remarks—
My Lords, we too support the spirit of these amendments very much and pay tribute to the noble Lord, Lord Russell, for tabling them.
In many ways, I do not need to say very much. I think the noble Baroness, Lady Kidron, made a really powerful case, alongside the way the group was introduced in respect of the importance of these things. We do want the positivity that the noble Baroness, Lady Harding, talked about in respect of the potential and opportunity of technology for young people. We want them to have the right to freedom of expression, privacy and reliable information, and to be protected from exploitation by the media. Those happen to be direct quotes from the UN Convention on the Rights of the Child, as some of the rights they would enjoy. Amendments 30 and 105, which the noble Lord, Lord Clement-Jones, tabled—I attached my name to Amendment 30—are very much in that spirit of trying to promote well-being and trying to say that there is something positive that we want to see here.
In particular, I would like to see that in respect of Ofcom. Amendment 187 is, in some ways, the more significant amendment and the one I most want the Minister to reflect on. That is the one that applies to Ofcom: that it should have reference to the UN Convention on the Rights of the Child. I think even the noble Lord, Lord Weir, could possibly agree. I understand his thoughtful comments around whether or not it is right to encumber business with adherence to the UN convention, but Ofcom is a public body in how it carries out its duties as a regulator. There are choices for regulation. Regulation can just be about minimum standards, but it can also be about promoting something better. What we are seeking here in trying to have reference to the UN convention is for Ofcom to regulate for something more positive and better, as well as police minimum standards. On that basis, we support the amendments.
My Lords, I will start in the optimistic spirit of the debate we have just had. There are many benefits to young people from the internet: social, educational and many other ways that noble Lords have mentioned today. That is why the Government’s top priority for this legislation has always been to protect children and to ensure that they can enjoy those benefits by going online safely.
Once again, I find myself sympathetic to these amendments, but in a position of seeking to reassure your Lordships that the Bill already delivers on their objectives. Amendments 25, 78, 187 and 196 seek to add references to the United Nations Convention on the Rights of the Child and general comment 25 on children’s rights in relation to the digital environment to the duties on providers and Ofcom in the Bill.
As I have said many times before, children’s rights are at the heart of this legislation, even if the phrase itself is not mentioned in terms. The Bill already reflects the principles of the UN convention and the general comment. Clause 207, for instance, is clear that a “child” means a person under the age of 18, which is in line with the convention. All providers in scope of the Bill need to take robust steps to protect users, including children, from illegal content or activity on their services and to protect children from content which is harmful to them. They will need to ensure that children have a safe, age-appropriate experience on services designed for them.
Both Ofcom and service providers will also have duties in relation to users’ rights to freedom of expression and privacy. The safety objectives will require Ofcom to ensure that services protect children to a higher standard than adults, while also making sure that these services account for the different needs of children at different ages, among other things. Ofcom must also consult bodies with expertise in equality and human rights, including those representing the interests of children, for instance the Children’s Commissioner. While the Government fully support the UN convention and its continued implementation in the UK, it would not be appropriate to place obligations on regulated services to uphold an international treaty between state parties. We agree with the reservations that were expressed by the noble Lord, Lord Weir of Ballyholme, in his speech, and his noble friend Lady Foster.
The convention’s implementation is a matter for the Government, not for private businesses or voluntary organisations. Similarly, the general comment acts as guidance for state parties and it would not be appropriate to refer to that in relation to private entities. The general comment is not binding and it is for individual states to determine how to implement the convention. I hope that the noble Lord, Lord Russell, will feel reassured that children’s rights are baked into the Bill in more ways than a first glance may suggest, and that he will be content to withdraw his amendment.
The noble Lord, Lord Clement-Jones, in his Amendments 30 and 105, seeks to require platforms and Ofcom to consider a service’s benefits to children’s rights and well-being when considering what is proportionate to fulfil the child safety duties of the Bill. They also add children’s rights and well-being to the online safety objectives for user-to-user services. The Bill as drafted is focused on reducing the risk of harm to children precisely so that they can better enjoy the many benefits of being online. It already requires companies to take a risk-based and proportionate approach to delivering the child safety duties. Providers will need to address only content that poses a risk of harm to children, not that which is beneficial or neutral. The Bill does not require providers to exclude children or restrict access to content or services that may be beneficial for them.
Children’s rights and well-being are already a central feature of the existing safety objectives for user-to-user services in Schedule 4 to the Bill. These require Ofcom to ensure that services protect children to a higher standard than adults, while making sure that these services account for the different needs of children at different ages, among other things. On this basis, while I am sympathetic to the aims of the amendments the noble Lord has brought forward, I respectfully say that I do not think they are needed.
More pertinently, Amendment 30 could have unintended consequences. By introducing a broad balancing exercise between the harms and benefits that children may experience online, it would make it more difficult for Ofcom to follow up instances of non-compliance. For example, service providers could take less effective safety measures to protect children, arguing that, as their service is broadly beneficial to children’s well-being or rights, the extent to which they need to protect children from harm is reduced. This could mean that children are exposed to more harmful content, which would reduce the benefits of going online. I hope that this reassures the noble Lord, Lord Russell, of the work the Bill does in the areas he has highlighted, and that it explains why I cannot accept these amendments. I invite him to withdraw Amendment 25.