Online Safety Bill Debate
Lords Chamber
Baroness Kidron (Crossbench - Life peer)
It is a great pleasure to follow my noble friend Lord Russell and to thank him for his good wishes. I assure the Committee that there is nowhere I would rather spend my birthday, in spite of some competitive offers. I remind noble Lords of my interests in the register, particularly as the chair of 5Rights Foundation.
As my noble friend has set out, these amendments fall in three places: the risk assessments, the safety duties and the codes of practice. However, together they work on the overarching theme of safety by design. I will restrict my detailed remarks to a number of amendments in the first two categories. This is perhaps a good moment to recall the initial work of Carnegie, which provided the conceptual approach of the Bill several years ago in arguing for a duty of care. The Bill has gone many rounds since then, but I think the principle remains that a regulated service should consider its impact on users before it causes them harm. Safety by design, to which all the amendments in this group refer, is an embodiment of a duty of care. In thinking about these amendments as a group, I remind the Committee that both the proportionality provisions and the fact that this is a systems and processes Bill mean that no company can, should or will be penalised for a single piece of content, a single piece of design or, indeed, low-level infringements.
Amendments 24, 31, 77 and 84 would delete “content” from the Government’s description of what is harmful to children, meaning that the duty is to consider harm in the round rather than just harmful content. The definition of “content” is drawn broadly in Clause 207 as
“anything communicated by means of an internet service”,
but the examples in the Bill, including
“written material … music and data of any description”,
once again fail to include design features that are so often the key drivers of harm to children.
On day three of Committee, the Minister said:
“The Bill will address cumulative risk where it is the result of a combination of high-risk functionality, such as live streaming, or rewards in service … This will initially be identified through Ofcom’s sector risk assessments, and Ofcom’s risk profiles and risk assessment guidance will reflect where a combination of risk in functionalities such as these can drive up the risk of harm to children. Service providers will have to take Ofcom’s risk profiles into account in their own risk assessments for content which is illegal or harmful to children”.—[Official Report, 27/4/23; col. 1385.]
However, in looking at the child safety duties, Clause 11(5) says:
“The duties … in subsections (2) and (3) apply across all areas of a service, including the way it is designed, operated and used”,
but subsection (14) says:
“The duties set out in subsections (3) and (6)”—
which are the duties to operate proportionate systems and processes to prevent and protect children from encountering harmful content and to include them in terms of service—
“are to be taken to extend only to content that is harmful to children where the risk of harm is presented by the nature of the content (rather than the fact of its dissemination)”.
I hesitate to say whether that is contradictory. I am not actually sure, but it is confusing. I am concerned that while we are reassured that “content” means content and activity and that the risk assessment considers functionality, “harm” is then repeatedly expressed only in the form of content.
Over the weekend, I had an email exchange with the renowned psychoanalyst and author, Norman Doidge, whose work on the plasticity of the brain profoundly changed how we think about addiction and compulsion. In the exchange, he said that
“children’s exposures to super doses, of supernormal images and scenes, leaves an imprint that can hijack development”.
Then, he said that
“the direction seems to be that AI would be working out the irresistible image or scenario, and target people with these images, as they target advertising”.
His argument is that it is not just the image but the dissemination and tailoring of that image that maximises the impact. The volume and frequency of those images create habits in children that take a lifetime to change—if they change at all. Amendments 32 and 85 would remove this language to ensure that content that is harmful by virtue of its dissemination is accounted for.
I turn now to Amendments 28 and 82, which cut the reference to the
“size and capacity of the provider of the service”
in deeming what measures are proportionate. We have already discussed that small is not safe. Platforms such as Yubo, Clapper and Discord have all been found to harm children and, as both the noble Baroness, Lady Harding, and the noble Lord, Lord Clement-Jones, told us, small can become big very quickly. It is far easier to build to a set of rules than it is to retrofit them after the event. Again, I point out that Ofcom already has duties of proportionality; adding size and capacity is unnecessary and may tip the scale towards creating loopholes for smaller services.
Amendment 138 seeks to reverse the exemption of financial harms in Clause 54. More than half of the 100 top-grossing mobile phone apps contain loot boxes, which are well established as unfair and unhealthy, priming young children to gamble and leading to immediate hardship for parents landed with extraordinary bills.
By rights, Amendments 291 and 292 could fit in the future-proof set of amendments. The way that the Bill in Clause 204 separates out functionalities in terms of search and user-to-user is in direct opposition to the direction of travel in the tech sector. TikTok does shopping, Instagram does video, Amazon does search; autocomplete is an issue across the full gamut of services, and so on and so forth. This amendment simply combines the list of functionalities that must be risk-assessed and makes them apply on any regulated service. I cannot see a single argument against this amendment: it cannot be the Government’s intention that a child can be protected, on search services such as Google, from predictive search or autocomplete, but not on TikTok.
Finally, Amendment 295 will embed the understanding that most harm is cumulative. If the Bereaved Parents for Online Safety were in the Chamber, or any child caught up in self-harm, depression sites, gambling, gaming, bullying, fear of exposure, or the inexorable feeling of losing their childhood to an endless scroll, they would say at the top of their voices that it is not any individual piece of content, or any one moment or incident, but the way in which they are nudged, pushed, enticed and goaded into a toxic, harmful or dangerous place. Adding the simple words
“the volume of the content and the frequency with which the content is accessed”
to the interpretation of what can constitute harm in Clause 205 is one of the most important things that we can do in this Chamber. This Bill comes too late for a whole generation of parents and children but, if these safety by design amendments can protect the next generation of children, I will certainly be very glad.
My Lords, it is an honour, once again, to follow the noble Baroness, Lady Kidron, and the noble Lord, Lord Russell, in this Committee. I am going to speak in detail to the amendments that seek to change the way the codes of practice are implemented. Before I do, however, I will very briefly add my voice to the general comments that the noble Baroness, Lady Kidron, and the noble Lord, Lord Russell, have just taken us through. Every parent in the country knows that both the benefit and the harm that online platforms can bring our children are not just about the content. It is about the functionality: the way these platforms work; the way they suck us in. They do give us joy but they also drive addiction. It is hugely important that this Bill reflects the functionality that online platforms bring, and not just content in the normal sense of the word “content”.
I will now speak in a bit more detail about the following amendments: Amendments 65, 65ZA, 65AA, 89, 90, 90B, 96A, 106A, 106B, 107A, 114A—I will finish soon, I promise—112, 122ZA, 122ZB and 122ZC.
My Lords, I want, apart from anything else, to speak in defence of philosophical ruminations. The only way we can scrutinise the amendments in Committee is to do a bit of philosophical rumination. We are trying to work out what the amendments might mean in terms of changing the Bill.
I read these amendments, noted their use of “eliminate”—we have to “eliminate” all risks—and wondered what that would mean. I do not want to feel that I cannot ask these kinds of difficult questions for fear that I will offend a particular group or that it would be insensitive to a particular group of parents. It is difficult but we are required as legislators to try to understand what each of us is trying to change, or how we are going to try to change the law.
I say to those who have put “eliminate” prominently in a number of these amendments that it is impossible to eliminate all risks to children—is it not?—if they are to have access to the online world, unless you ban them from the platforms completely. Is “eliminate” really helpful here?
Previously in Committee, I talked a lot about the potential dangers, psychologically and with respect to development, of overcoddling young people, of cotton wool kids, and so on. I noted an article over the weekend by the science journalist Tom Chivers, which included arguments from the Oxford Internet Institute and various psychologists that the evidence on whether social media is harmful, particularly for teenagers, is ambiguous.
I am very convinced by the examples brought forward by the noble Baroness, Lady Kidron—and I too wish her a happy birthday. We all know about the targeting of young people and so forth, but I am also aware of the positives. I always try to balance these things out and make sure that we do not deny young people access to the positives. In fact, I found myself cheering at the next group of amendments, which is unusual. First, they depend on whether you are four or 14—in other words, you have to be age-specific—and, secondly, they recognise that we do not want to pass anything in the Bill that actually denies children access to either their own privacy or the capacity to know more.
I also wanted to explore a little the idea of expanding the debate away from content to systems, because this is something that I think I am not quite understanding. My problem is that moving away from the discussion on whether content is removed or accessible, and focusing on systems, does not mean that content is not in scope. My worry is that the systems will have an impact on what content is available.
Let me give some examples of things that can become difficult if we think that we do not want young people to encounter violence and nudity—which makes it seem as though we know what we are talking about when we talk about “harmful”. We will all recall that, in 2018, Facebook removed content from the Anne Frank Centre posted by civil rights organisations because it included photographs of the Holocaust featuring undressed children among the victims. Facebook apologised afterwards. None the less, my worry is about these kinds of things happening. Another example, in 2016, was the removal of the Pulitzer Prize-winning photograph “The Terror of War”, featuring fleeing Vietnamese napalm victims in the 1970s, because the system thought it was something dodgy, given that the photo was of a naked child fleeing.
I need to understand how system changes will not deprive young people of important educational information such as that. That is what I am trying to distinguish. The point made by the noble Lord, Lord Moylan, about “harmful” not being defined—I have endlessly gone on about this, and will talk more about it later—is difficult because we think that we know what we mean by “harmful” content.
Finally, on the amendments requiring compliance with Ofcom codes of practice, that would give an extraordinary amount of power to the regulator and the Secretary of State. Since I have been in this place, people have rightly drawn my attention to the dangers of delegating power to the Executive or away from any kind of oversight—there has been fantastic debate and discussion about that. It seems to me that these amendments advocate delegated powers being given to the Secretary of State and Ofcom, an unelected body—the Secretary of State could amend for reasons of public policy in order to protect children—and this is to be put through the negative procedure. In any other instance, I would have expected outcry from the usual suspects, but, because it involves children, we are not supposed to object. I worry that we need to have more scrutiny of such amendments and not less, because in the name of protecting children unintended consequences can occur.
I want to answer the point that amendments cannot be seen in isolation. Noble Lords will remember that we had a long and good debate about what constituted harms to children. There was a big argument and the Minister made some warm noises in relation to putting harms to children in the Bill. There is some alignment between many people in the Chamber whereby we and Parliament would like to determine what harm is, and I very much share the noble Baroness’s concern about pointing out what that is.
On the issue of the system versus the content, I am not sure that this is the exact moment but the idea of unintended consequences keeps getting thrown up when we talk about trying to point the finger at what creates harm. There are unintended consequences now, except that neither Ofcom, the Secretary of State nor Parliament, but only the tech sector, has a say in what those unintended consequences are. As someone who has been bungee jumping, I am deeply grateful that there are very strict rules under which that is allowed to happen.
My Lords, I support the amendments in this group that, with regard to safety by design, will address functionality and harms—whatever exactly we mean by that—as well as child safety duties and codes of practice. The noble Lord, Lord Russell, and the noble Baronesses, Lady Harding and Lady Kidron, have laid things out very clearly, and I wish the noble Baroness, Lady Kidron, a happy birthday.
I also support Amendment 261 in the name of my right reverend friend the Bishop of Oxford and supported by the noble Lord, Lord Clement-Jones, and the noble Viscount, Lord Colville. This amendment would allow the Secretary of State to consider safety by design, and not just content, when reviewing the regime.
As we have heard, a number of the amendments would amend the safety duties to children to consider all harms, not just harmful content, and we have begun to have a very interesting debate on that. We know that service features create and amplify harms to children. These harms are not limited to spreading harmful content; features in and of themselves may cause harm—for example, beautifying filters, which can create unrealistic body ideals and pressure on children to look a certain way. In all of this, I want us to listen much more to the voices of children and young people—they understand this issue.
Last week, as part of my ongoing campaign on body image, including how social media can promote body image anxiety, I met a group of young people from two Gloucestershire secondary schools. They were very good at saying what the positives are, but noble Lords will also be very familiar with many of the negative issues that were on their minds, which I will not repeat here. While they were very much alive to harmful content and the messages it gives them, they were keen to talk about the need to address algorithms and filters that they say feed them strong messages and skew the content they see, which might not look harmful but, because of design, accentuates their exposure to issues and themes about which they are already anxious. Suffice to say that underpinning most of what they said to me was a sense of powerlessness and anxiety when navigating the online world that is part of their daily lives.
The current definition of content does not include design features. Building in a safety by design principle from the outset would reduce harms in a systematic way, and the amendments in this group would address that need.
I am grateful to the noble Lord. In many ways, I am reminded of the article I read in the New York Times this weekend and the interview with Geoffrey Hinton, the now former chief scientist at Google. He said that as companies improve their AI systems, they become increasingly dangerous. He said of AI technology:
“Look at how it was five years ago and how it is now. Take the difference and propagate it forwards. That’s scary”.
Yes, the huge success of the iPhone, of mobile phones and all of us, as parents, handing our more redundant iPhones on to our children, has meant that children have huge access. We have heard the stats in Committee around the numbers who are still in primary school and on social media, despite the terms and conditions of those platforms. That is precisely why we are here, trying to get things designed to be safe as far as is possible from the off, but recognising that it is dynamic and that we therefore need a regulator to keep an eye on the dynamic nature of these algorithms as they evolve, ensuring that they are safe by design as they are being engineered.
My noble friend Lord Stevenson has tabled Amendment 27, which looks at targeted advertising, especially that which requires data collection and profiling of children. In that, he has been grateful to Global Action Plan for its advice. While advertising is broadly out of scope of the Bill, apart from in respect of fraud, it is significant for the Minister to reflect on the user experience for children. Whether it is paid or organic content, it is pertinent in terms of their safety as children and something we should all be mindful of. I say to the noble Lord, Lord Vaizey, that as I understand it, the age-appropriate design code does a fair amount in respect of the data privacy of children, but this is much more about preventing children encountering the advertising in the first place, aside from the data protections that apply in the age-appropriate design code. But the authority is about to correct me.
Just to add to what the noble Lord has said, it is worth noting that we had a debate, on Amendment 92, about aligning the age-appropriate design code likely to be accessed and the very important issue that the noble Lord, Lord Vaizey, raised about alignment of these two regimes. I think we can say that these are kissing cousins, in that they take a by-design approach. The noble Lord is completely right that the scope of the Bill is much broader than data protection only, but they take the same approach.
I am grateful, as ever, to the noble Baroness, and I hope that has assisted the noble Lord, Lord Vaizey.
Finally—just about—I will speak to Amendment 32A, tabled in my name, about VPNs. I was grateful to the noble Baroness for her comments. In many ways, I wanted to give the Minister the opportunity to put something on the record. I understand, and he can confirm whether my understanding is correct, that the duties on the platforms to be safe apply regardless of whether a VPN has been used to access the systems and the content. The platforms, the publishers of content that are user-to-user businesses, will have to detect whether a VPN is being used, one would suppose, in order to ensure that children are being protected and that that is genuinely a child. Is that a correct interpretation of how the Bill works? If so, is it technically realistic for those platforms to be able to detect whether someone is landing on their site via a VPN or otherwise? In my mind, the anecdote that the noble Baroness, Lady Harding, related, about what the App Store algorithm on Apple had done in pushing VPNs when looking for porn, reinforces the need for app stores to be brought into scope, so that we can get some of that age filtering at that distribution point, rather than just relying on the platforms.
Substantially, this group is about platforms anticipating harms, not reviewing them and then fixing them despite their business model. If we can get the platforms themselves designing for children’s safety and then working out how to make the business models work, rather than the other way around, we will have a much better place for children.
I will come on to talk a bit about dissemination as well. If the noble Lord will allow me, he can intervene later on if I have not done that to his satisfaction.
I was about to talk about the child safety duties in Clause 11(5), which also specifies that they apply to the way that a service is designed, how it operates and how it is used, as well as to the content facilitated by it. The definition of content makes it clear that providers are responsible for mitigating harm in relation to all communications and activity on their service. Removing the reference to content would make service providers responsible for all risk of harm to children arising from the general operation of their service. That could, for instance, bring into scope external advertising campaigns, carried out by the service to promote its website, which could cause harm. This and other elements of a service’s operations are already regulated by other legislation.
I apologise for interrupting. Is that the case, and could that not be dealt with by defining harm in the way that it is intended, rather than as harm from any source whatever? It feels like a big leap that, if you take out “content”, instead of it meaning the scope of the service in its functionality and content and all the things that we have talked about for the last hour and a half, the suggestion is that it is unworkable because harm suddenly means everything. I am not sure that that is the case. Even if it is, one could find a definition of harm that would make it not the case.
Taking it out in the way that the amendment suggests throws up that risk. I am sure that it is not the intention of the noble Lord or the noble Baroness in putting it, but that is a risk of the drafting, which requires some further thought.
Clause 11(2), which is the focus of Amendments 32, 85 and 295, already means that platforms have to take robust action against content which is harmful because of the manner of its dissemination. However, it would not be feasible for providers to fulfil their duties in relation to content which is harmful only by the manner of its dissemination. This covers content which may not meet the definition of content which is harmful to children in isolation but may be harmful when targeted at children in a particular way. One example could be content discussing a mental health condition such as depression, where recommendations are made repeatedly or in an amplified manner through the use of algorithms. The nature of that content per se may not be inherently harmful to every child who encounters it, but, when aggregated, it may become harmful to a child who is sent it many times over. That, of course, must be addressed, and is covered by the Bill.
My Lords, I support the amendments in the name of the noble Lord, Lord Russell, to require regulated services to have regard to the UN Convention on the Rights of the Child. As we continue to attempt to strengthen the Bill by ensuring that the UK will be the safest place for children to be online, there is a danger that platforms may take the easy way out in complying with the new legislation and just block children entirely from their sites. Services must not shut children out of digital spaces altogether to avoid compliance with the child safety duties, rather than designing services with their safety in mind. Children have rights and, as the UN convention makes clear, they must be treated according to their evolving capacities and in their best interests in consideration of their well-being.
Being online is now an essential right, not an option, to access education, entertainment and friendship, but we must try to ensure that it is a safe space. As the 5Rights Foundation points out, the Bill risks infringing children’s rights online, including their rights to information and participation in the digital world, by mandating that services prevent children from encountering harmful content, rather than ensuring services are made age appropriate for children and safe by design, as we discussed earlier. As risk assessments for adults have been stripped from the Bill, this has had the unintended consequence of rendering a child user even more costly to serve than an adult user, as services will have substantial safety duties to comply with to protect children. 5Rights Foundation warns that this will lead services to determine that it is not worth designing services with children’s safety in mind but that it could be more cost effective to lock them out entirely.
Ofcom must have a duty to have regard for the UNCRC in its risk assessments. Amendment 196 would ensure that children’s rights are reflected in Ofcom’s assessment of risks, so that Ofcom must have regard for children’s rights in balancing their rights to be safe against their rights to access age-appropriate digital spaces. This would ensure compliance with general comment No. 25, as the noble Lord, Lord Russell, mentioned, passed in 2021, to protect children’s rights to freedom of expression and privacy. I urge the Ministers to accept these amendments to ensure that the UK will be not only the safest place for children to be online but the best place too, by respecting and protecting their rights.
My Lords, I support all the amendments in this group, and will make two very brief points. Before I do, I believe that those who are arguing for safety by design and to put harms in the Bill are not trying to restrict the freedom of children to access the internet but to give the tech sector slightly less freedom to access children and exploit them.
My first point is a point of principle, and here I must declare an interest. It was my very great privilege to chair the international group that drafted general comment No. 25 on children’s rights in relation to the digital environment. We did so on behalf of the Committee on the Rights of the Child and, as my noble friend Lord Russell said, it was adopted formally in 2021. To that end, a great deal of work has gone into balancing the sorts of issues that have been raised in this debate. I think it would interest noble Lords to know that the process took three years, with 150 submissions, many by nation states. Over 700 children in 28 countries were consulted in workshops of at least three hours. They had a good shout and, unlike many of the other general comments, this one is littered with their actual comments. I recommend it to the Committee as a very concise and forceful gesture of what it might be to exercise children’s rights in a balancing way across all the issues that we are discussing. I cannot remember who, but somebody said that the online world is not optional for children: it is where they grow up; it is where they spend their time; it is their education; it is their friendships; it is their entertainment; it is their information. Therefore, if it is not optional, then as a signatory to the UNCRC we have a duty to respect their rights in that environment.
My second point is rather more practical. During the passage of the age-appropriate design code, of which we have heard much, the argument was made that children were covered by the amendment itself, which said they must be kept in mind and so on. I anticipate that argument being made here—that we are aligning with children’s rights, apart from the fact that they are indivisible and must be done in their entirety. In that case, the Government happily accepted that it should be explicit, and it was put in the Data Protection Act. It was one of the most important things that happened in relation to the age-appropriate design code. We might hope that, when this Bill is an Act, it will all be over—our job will be done and we can move on. However, after the Data Protection Act, the most enormous influx of lobbying happened, saying, “Please take the age down from 18 to 13”. The Government, and in that case the ICO, shrugged their shoulders and said, “We can’t; it’s on the face of the Bill”, because Article 1 of the UNCRC says that a child is anyone under the age of 18.
The evolving capacities of children are central to the UNCRC, so the concerns of the noble Baroness, Lady Fox, which I very much share, that a four year-old and a 14 year-old are not the same, are embodied in that document and in the general comment, and therefore it is useful.
These amendments are asking for that same commitment here—to children and to their rights, and to their rights to protection, which is at the heart of so much of what we are debating, and their well-being. We need their participation; we need a digital world with children in it. Although I agreed very much with the noble Baroness, Lady Bennett, and her fierce defence of children’s rights, there are 1 billion children online. If two-thirds of them have not seen anything upsetting in the last year, that rather means that one-third of 1 billion children have—and that is too many.
My Lords, I did not intend to speak in this debate but I have been inspired by it.
I was here for the encryption debate last week, which I did not speak in. One of the contributions was around unintended consequences of the legislation, and I am concerned about unintended consequences here.
I absolutely agree with the comments of the noble Baroness, Lady Bennett, around the need for children to engage on the internet. Due to a confidence and supply agreement with the then Government back in 2017, I ensured that children and adults alike in Northern Ireland have the best access to the internet in the United Kingdom, and I am very proud of that. Digital literacy is covered in a later amendment, Amendment 91, which I will be strongly supporting. It is something that everybody needs to be involved in, not least our young people—and here I declare an interest as the mother of a 16 year-old.
I have two concerns. The first was raised by my friend the noble Lord, Lord Weir, around private companies being legally accountable for upholding an international human rights treaty. I am much more comfortable with Amendments 187 and 196, which refer to Ofcom. I think that is where the duty should be. I have an issue not with the convention but with private companies being held responsible for it; Ofcom should be the body responsible.
Secondly, I listened very carefully to what the noble Baroness, Lady Kidron, said about general comment No. 25. If what I say is incorrect, I hope she will say so. Is general comment No. 25 a binding document on the Government? I understood that it was not.
We need to see the UNCRC included in the Bill. The convention is never opened up again, and how it makes itself relevant to the modern world is through the general comments; that is how the Committee on the Rights of the Child would interpret it.
So it is an interpretive document. The unintended consequences piece was around general comment No. 25 specifically having reference to children being able to seek out content. That is certainly something that I would be concerned about. I am sure that we will discuss it further in the next group of amendments, which are on pornography. If young people were able to seek out harmful content, that would concern me greatly.
I support Amendments 187 and 196, but I have some concerns about the unintended consequences of Amendment 25.