Online Safety Bill Debate
Baroness Healy of Primrose Hill (Labour - Life peer)
(1 year, 10 months ago)
Lords Chamber

My Lords, I welcome the Bill, but regret the time it has taken to arrive. To make the UK the safest place in the world to be online, it must be strengthened, and I will support amendments that would ensure greater protection for children through proper age assurance. The damage to children from exploitation by social media cannot continue. The state must regulate, using severe penalties, to force platforms to behave with greater responsibility, as they cannot be trusted to self-regulate. The rise in suicide and self-harm and the loss of self-esteem are ruining young lives. The platforms must take greater responsibility; they have the money and the technology to do this but need stronger incentives to act, such as the promised executive criminal liability amendment.
Ofcom faces a formidable challenge in policing companies’ adherence to their own terms and conditions on content moderation. Heavy fines are not enough. Ofcom will need guidance in setting codes of practice not only from the three commissioners but from NGOs, such as the Internet Watch Foundation, and from an advocacy body for children to advise continually on emerging harms. A new regulatory regime to address illegal and harmful content online is essential but, having removed “legal but harmful” from the original Bill, we lost the opportunity to detoxify the internet.
Concentrating on the big platforms will miss the growth of bespoke platforms that promote other harms such as incel culture, a threat to women but also to young men. Incels, or involuntary celibates, use mainstream platforms such as YouTube to reel in unsuspecting young men before linking them to their own small, specialist websites, but these are outside the scope of the category 1 provisions and therefore of any minimum standards. These sites include not only sexist and misogynistic material but anti-Semitic, racist, homophobic and transphobic material, and even paedophilia. One of the four largest incel forums is dedicated to suicide and self-harm. HOPE not hate, the anti-fascist campaign, has warned that smaller platforms used by the far right to organise and radicalise should be under the same level of scrutiny as category 1 platforms.
User empowerment features that form part of the triple shield, such as options to filter out abusive content and content from unverified users, put the onus on the user to filter out material rather than having those filters turned on by default. Ofcom must enforce a statutory duty on the largest platforms to promote media literacy as part of their conditions of service. The Bill should make children’s risk assessments consistent across all services, and should tackle the drivers of harm and the design of the service, not just the content.
I welcome the new offences targeting harmful behaviour, including epilepsy trolling, cyber flashing and the sending of manufactured deepfake intimate images without consent. Despite the Bill adding controlling or coercive behaviour to the list of priority offences, more needs to be done to protect women, one in three of whom has experienced online abuse. Ofcom must add a mandatory code of practice regarding violence against women and girls so that tech companies understand they have a duty to prioritise their safety.
The Bill must prevent the relentless promotion of suicide and self-harm that has destroyed the lives of young people and their families. I commend the bravery of Ian Russell, who is campaigning to prevent other deaths following the tragic suicide of his daughter, Molly. I back the amendments from the noble Baroness, Lady Kidron, to ensure that coroners and bereaved families can access social media content. I applaud all those campaigners who want to see the Bill implemented urgently, and I will work with other noble Lords to strengthen it.
Online Safety Bill Debate
Baroness Healy of Primrose Hill (Labour - Life peer)
(1 year, 7 months ago)
Lords Chamber

I appreciate the intervention by the noble Baroness; I hope through this grit we may conjure up a pearl of some sort. The original concept of the Bill, as championed by the noble Baroness, would have been a generalised set of duties of care which could have stretched much more broadly. It has evolved in a particular direction and become ever more specific and tailored to those three services: user-to-user, search and pornography services. Having arrived at that point, it is difficult to then open it back up and stretch it to reach other forms of service.
My intention in intervening in this debate is to raise some of those concerns because I think they are legitimate. I may be at the more sceptical end of the political world, but I am at the more regulation-friendly end of the tech community. This is said in a spirit of trying to create a Bill that will actually work. I have done the work, and I know how hard Ofcom’s job will be. That sums up what I am trying to say: my concern is that we should not give Ofcom an impossible job. We have defined something quite tight—many people still object to it, think it is too loose and do not agree with it—but I think we have something reasonably workable. I am concerned that, however tempting it is, by re-opening Pandora’s box we may end up creating something less workable.
That does not mean we should forget about app stores and non-user-to-user content, but we need to think of a way of dealing with those which does not necessarily just roll over the mechanism we have created in the Online Safety Bill to other forms of application.
I strongly support the amendments in the name of the noble Baroness, Lady Kidron, because I want to see this Bill implemented but strengthened in order to fulfil the admirable intention that children must be safe wherever they are online. This will not be the case unless child safety duties are applicable in all digital environments likely to be accessed by children. This is not overly ambitious or unrealistic; the platforms need clarity as to these new responsibilities and Ofcom must be properly empowered to enforce the rules without worrying about endless legal challenges. These amendments will give that much-needed clarity in this complex area.
As the Joint Committee recommended, this regulatory alignment would simplify compliance for businesses while giving greater clarity to people who use the services and greater protection for children. It would give confidence to parents and children that they need not work out whether they are in a regulated or unregulated service while online. The Government promised that the onus for keeping young people safe online would sit squarely on the tech companies’ shoulders.
Without these amendments, there is a real danger that a loophole will remain whereby some services, even those that are known to cause harm, are exempt, leaving thousands of children exposed to harm. The amendments would also help to future-proof the Bill. For example, some parts of the metaverse as yet undeveloped may be out of scope, but already specialist police units have raised concerns that abuse rooms, limited to one user, are being used to practise violence and sexual violence against women and girls.
We can and must make this good Bill even better and support all the amendments in this group.
Online Safety Bill Debate
Baroness Healy of Primrose Hill (Labour - Life peer)
(1 year, 7 months ago)
Lords Chamber

My Lords, perhaps I may intervene briefly, because Scotland and Wales have already been mentioned. My perception of the Bill is that we are trying to build something fit for the future, and therefore we need some broad underlying principles. I remind the Committee that the Well-being of Future Generations (Wales) Act set a tone, and that tone has run through all aspects of society even more extensively than people imagined in protecting the next generation. As I have read them, these amendments set a similar tone, and I find it difficult to understand why anyone would object to that, given that it is a core principle, as I understood it, behind building in future-proofing that will protect children, among others.
My Lords, I support the amendments in the name of the noble Lord, Lord Russell, to require regulated services to have regard to the UN Convention on the Rights of the Child. As we continue to attempt to strengthen the Bill by ensuring that the UK will be the safest place for children to be online, there is a danger that platforms may take the easy way out in complying with the new legislation and just block children entirely from their sites. Services must not shut children out of digital spaces altogether to avoid compliance with the child safety duties, rather than designing services with their safety in mind. Children have rights and, as the UN convention makes clear, they must be treated according to their evolving capacities and in their best interests in consideration of their well-being.
Being online is now an essential right, not an option, to access education, entertainment and friendship, but we must try to ensure that it is a safe space. As the 5Rights Foundation points out, the Bill risks infringing children’s rights online, including their rights to information and participation in the digital world, by mandating that services prevent children from encountering harmful content, rather than ensuring services are made age appropriate for children and safe by design, as we discussed earlier. As risk assessments for adults have been stripped from the Bill, this has had the unintended consequence of making a child user even more costly to serve than an adult user, as services will have substantial safety duties to comply with to protect children. 5Rights Foundation warns that this will lead services to determine that it is not worth designing services with children’s safety in mind and that it could be more cost effective to lock them out entirely.
Ofcom must have a duty to have regard for the UNCRC in its risk assessments. Amendment 196 would ensure that children’s rights are reflected in Ofcom’s assessment of risks, so that Ofcom must have regard for children’s rights in balancing their right to be safe against their right to access age-appropriate digital spaces. This would ensure compliance with general comment No. 25, passed in 2021, as the noble Lord, Lord Russell, mentioned, to protect children’s rights to freedom of expression and privacy. I urge the Ministers to accept these amendments to ensure that the UK will be not only the safest place for children to be online but the best place too, by respecting and protecting their rights.
My Lords, I support all the amendments in this group, and will make two very brief points. Before I do, I believe that those who are arguing for safety by design and to put harms in the Bill are not trying to restrict the freedom of children to access the internet but to give the tech sector slightly less freedom to access children and exploit them.
My first point is a point of principle, and here I must declare an interest. It was my very great privilege to chair the international group that drafted general comment No. 25 on children’s rights in relation to the digital environment. We did so on behalf of the Committee on the Rights of the Child and, as my noble friend Lord Russell said, it was adopted formally in 2021. To that end, a great deal of work has gone into balancing the sorts of issues that have been raised in this debate. I think it would interest noble Lords to know that the process took three years, with 150 submissions, many by nation states. Over 700 children in 28 countries were consulted in workshops of at least three hours. They had a good shout and, unlike many of the other general comments, this one is littered with their actual comments. I recommend it to the Committee as a very concise and forceful statement of what it might be to exercise children’s rights in a balanced way across all the issues that we are discussing. I cannot remember who, but somebody said that the online world is not optional for children: it is where they grow up; it is where they spend their time; it is their education; it is their friendships; it is their entertainment; it is their information. Therefore, if it is not optional, then as a signatory to the UNCRC we have a duty to respect their rights in that environment.
My second point is rather more practical. During the passage of the age-appropriate design code, of which we have heard much, the argument was made that children were covered by the amendment itself, which said they must be kept in mind and so on. I anticipate that argument being made here—that we are aligning with children’s rights—apart from the fact that those rights are indivisible and must be honoured in their entirety. In that case, the Government happily accepted that it should be explicit, and it was put in the Data Protection Act. It was one of the most important things that happened in relation to the age-appropriate design code. We might hope that, when this Bill is an Act, it will all be over—our job will be done and we can move on. However, after the Data Protection Act, the most enormous influx of lobbying happened, saying, “Please take the age down from 18 to 13”. The Government, and in that case the ICO, shrugged their shoulders and said, “We can’t; it’s on the face of the Act”, because Article 1 of the UNCRC says that a child is anyone under the age of 18.
The evolving capacities of children are central to the UNCRC, so the concerns of the noble Baroness, Lady Fox, which I very much share, that a four year-old and a 14 year-old are not the same, are embodied in that document and in the general comment, and therefore it is useful.
These amendments ask for that same commitment here—to children, to their rights, to their right to protection, which is at the heart of so much of what we are debating, and to their well-being. We need their participation; we need a digital world with children in it. Although I agreed very much with the noble Baroness, Lady Bennett, and her fierce defence of children’s rights, there are 1 billion children online. If two-thirds of them have not seen anything upsetting in the last year, that rather means that one-third of 1 billion children have—and that is too many.
Online Safety Bill Debate
Baroness Healy of Primrose Hill (Labour - Life peer)
(1 year, 7 months ago)
Lords Chamber

My Lords, I rise briefly to support Amendments 34 and 35, from the noble Baroness, Lady Morgan, and others in this essential group. It is not enough to say that the new triple shield will help adults avoid seeing harmful but legal material if they so wish. Having removed “harmful but legal” from the original Bill, there is now a need to ensure that the default options are the safest for users in regard to suicide, self-harm, eating disorders and abuse and hate content.
As the Bill stands, adults can still see the most dangerous content online. Young people over 18 may be especially vulnerable if faced with a torrent of images edited digitally to represent unattainable beauty standards; this can result in poor body image detrimental to mental health, leading to shame, anxiety and, in some cases, suicide. As other noble Lords have said, anorexia has the highest mortality rate of any mental health problem. We know pro-anorexia sites are rife online. Vulnerable adults should be protected.
These amendments would make a real difference to the Bill. Changing the user empowerment provisions to require category 1 providers to have the safest options as the default would be a straightforward way of increasing protection for the majority of internet users, who do not want to be bombarded with this material. It would not overburden the tech companies and could do some good. It would not curtail freedom of speech, as tech-savvy users could easily flip a switch if they wished to opt in to some of the most dangerous content, which will still be available online, rather than receiving it by default.
Even with the Government’s best intentions to prevent encouragement of serious self-harm, we know they cannot criminalise all the legal content that treads the line between glorification and outright encouragement, as the noble Baroness, Lady Morgan, said. As the Communications and Digital Select Committee, on which I now serve, said in its 2021 report,
“the Online Safety Bill should require category 1 platforms to give users a comprehensive toolkit of settings, overseen by Ofcom, allowing users to decide what types of content they see and from whom. Platforms should be required to make these tools easy to find and use. The safest settings should always be the default”.
I hope the Government accept these valuable and simple amendments. They are supported by the Mental Health Foundation, which I thank for its briefing, together with many other experts in the field of mental health.
Online Safety Bill Debate
Baroness Healy of Primrose Hill (Labour - Life peer)
(1 year, 7 months ago)
Lords Chamber

My Lords, I will speak to a number of amendments in this group. I want to make the point that misinformation and disinformation was probably the issue we struggled with the most in the pre-legislative committee. We recognised the extraordinary harm it did, but also—as the noble Baroness, Lady Fox, said—that there is no one great truth. However, algorithmic spread—the drip, drip, drip of material that is not based on any search criteria or expression of an opinion but simply gives you more of the same, particularly the most shocking—moves very marginal views into the mainstream.
I am concerned that our debates over the last five days have concentrated so much on content, and that the freedom we seek does not take enough account of the way in which companies currently exercise control over the information we see. Correlations such as “men who like barbecues are also susceptible to conspiracy theories” are then exploited to spread toxic theories that end in real-world harm, or political tricks that show, for example, the Democrats as a paedophile group. Only last week I saw a series of pictures, presented as “evidence” of President Biden caught in a compromising situation, that purported to give truth to that lie. As Maria Ressa, winner of the Nobel Peace Prize for her contribution to freedom of expression, said in her acceptance speech:
“Tech sucked up our personal experiences and data, organized it with artificial intelligence, manipulated us with it, and created behavior at a scale that brought out the worst in humanity”.
That is the background to this set of amendments that we must take seriously.
As the noble Lord, Lord Bethell, said, Amendment 52 will ensure that platforms undertake a health misinformation risk assessment and provide a clear policy on dealing with harmful, false and misleading information. I put it to the Committee that, without this requirement, we will keep the status quo in which clicks are king, not health information.
It is a particular pleasure to support the noble Lord, Lord Moylan, on his Amendments 59 and 107. Like him, I am instinctively against taking material down. There are content-neutral ways of marking or questioning material, offering alternatives and signposting to diverse sources—not only true but diverse. These can break this toxic drip feed for long enough for people to think before they share, post and make personal decisions about the health information that they are receiving.
I am not incredibly thrilled by a committee for every occasion, but since the Bill is silent on the issue of misinformation and disinformation—which will clearly be supercharged by the rise of large language models—it would be good to give a formal role to this advisory committee, so that it can make a meaningful and formal contribution to Ofcom as it develops not only this code of conduct but all codes of conduct.
Likewise, I am very supportive of Amendment 222, which seeks independence for the chair of the advisory body. I have seen at first hand how a combination of regulatory capture and a very litigious sector with deep pockets slows down progress and transparency. While the independence of the chair should be a given, our collective lived experience would suggest otherwise. This amendment would make that requirement clear.
Finally, and in a way most importantly, Amendment 224 would allow Ofcom to consider after the fact whether the code of conduct is necessary. This strikes a balance between adding to its current workload, which we are trying not to do, and tying one hand behind its back in the future. I would be grateful to hear from the Minister why we would not give Ofcom this option as a reasonable piece of future-proofing, given that this issue will become ever more important as AI creates layers of misinformation and disinformation at scale.
My Lords, I support Amendment 52, tabled by my noble friend Lady Merron. This is an important issue which must be addressed in the Bill if we are to make real progress in making the internet a safer space, not just for children but for vulnerable adults.
We have the opportunity to learn lessons from the pandemic, where misinformation had a devastating impact, spreading rapidly online like the virus and threatening to undermine the vaccine rollout. If the Government had kept their earlier promise to include protection from harmful false health content in their indicative list of harmful content that companies would have been required to address under the now removed adult safety duties, these amendments would not be necessary.
It is naive to think that platforms will behave responsibly. Currently, they are left to their own devices in how they tackle health misinformation, without appropriate regulatory oversight. They can remove it at scale or leave it completely unchecked, as illustrated by Twitter’s decision to stop enforcing its Covid-19 misinformation policies, as other noble Lords have pointed out.
It is not a question of maintaining free speech, as some might argue. It was the most vulnerable groups who suffered from the spread of misinformation online—pregnant women and the BAME community, who had higher illness rates. Studies have shown that, proportionately, more of them died, not just because they were front-line workers but because of rumours spread in the community which resulted in vaccine hesitancy, with devastating consequences. As other noble Lords have pointed out, in 2021 the Royal College of Obstetricians and Gynaecologists found that only 42% of women who had been offered the vaccine accepted it, and in October that year one in five of the most critically ill Covid patients were unvaccinated pregnant women. That is a heartbreaking statistic.
Unfortunately, it is not just vaccine fears that are spread on the internet. Other harmful theories can affect patients with cancer, mental health issues and sexual health issues, and, most worryingly, can affect children’s health. Rumours and misinformation play on the minds of the most vulnerable. The Government have a duty to protect people, and by accepting this amendment they would go some way to addressing this.
Platforms must undertake a health misinformation risk assessment and have a clear policy on dealing with harmful, false and misleading health information in their terms of service. They have the money and the expertise to do this, and Parliament must insist. As my noble friend Lady Merron said, I do not think that the Minister can say that the false communications offence in Clause 160 will address the problem, as it covers only a user sending a knowingly false communication with the intention of causing harm. The charity Full Fact has stated that this offence will exclude most health misinformation that it monitors online.
My Lords, this has been a very interesting debate. I absolutely agree with what the noble Baroness, Lady Kidron, said right at the beginning of her speech. This was one of the most difficult areas that the Joint Committee had to look at. I am not saying that anything that we said was particularly original. We tried to say that this issue could be partly addressed by greater media literacy, which, no doubt, we will be talking about later today; we talked about transparency of system design, and about better enforcement of service terms and conditions. But things have moved on. Clearly, many of us think that the way that the current Bill is drafted is inadequate. However, the Government did move towards proposing a committee to review misinformation and disinformation. That is welcome, but I believe that these amendments are taking the thinking and actions a step forward.
My Lords, over the last decade, I have been in scores of schools, run dozens of workshops and spoken to literally thousands of children and young people. A lot of what I pass off as my own wisdom in this Chamber is, indeed, their wisdom. I have a couple of points, and I speak really from the perspective of children under 18 with regard to these amendments, which I fully support.
Media literacy—or digital literacy, as it is sometimes called—is not the same as e-safety. E-safety regimes concentrate on the behaviour of users. Very often, children say that what they learn in those lessons is focused on adult anxieties about predators and bullies, and when something goes wrong, they feel that they are to blame. It puts the responsibility on children. This response, which I have heard hundreds of times, normally comes up after a workshop in which we have discussed reward loops, privacy, algorithmic bias, profiling or—my own favourite—a game which reveals what is buried in terms and conditions; for example, that a company has a right to record the sound of a device or share their data with more than a thousand other companies. When young people understand the pressures that they are under and which are designed into the system, they feel much better about themselves and rather less enamoured of the services they are using. It is my experience that they then go on to make better choices for themselves.
Secondly, we have outsourced much of digital literacy to companies such as Google and Meta. They too concentrate on user behaviour, rather than looking at their own extractive policies focused on engagement and time spent. With many schools strapped for cash and expertise, this company-led teaching is widespread. However, when I went to a Google-run assembly, children aged nine were being taught about features available only on services for those aged over 13—and nowhere was there a mention of age limits and why they are important. It cannot be right that the companies are grooming children towards their services without taking full responsibility for literacy, if that is the literacy that children are being given in school.
Thirdly, as the Government’s own 2021 media literacy strategy set out, good media literacy is one line of defence from harm. It could make a crucial difference in people making informed and safe decisions online and engaging in a more positive online debate, at the same time as understanding that online actions have consequences offline.
However, while digital literacy and, in particular, critical thinking are fundamental to a contemporary education and should be available throughout school and far beyond, they must not be used as a way of putting responsibility on the user for the company’s design decisions. I am specifically concerned that, in the risk-assessment process, digital literacy is one of the ways that a company can say it has mitigated a potential risk or harm. I should like to hear from the Minister that it is an additional responsibility, not a substitute for responsibility.
Finally, over all these years I have always asked at the end of the session what the young people care about the most. The second most important thing is that the system should be less addictive—it should have less addiction built into it. Again, I point the Committee in the direction of the safety-by-design amendments in the name of my noble friend Lord Russell that try to get to the crux of that. They are not very exciting amendments in this debate but they get to the heart of it. However, the thing the young people most often say is, “Could you do something to get my parents to put down their phones?” I therefore ask the Minister whether he can slip something into the Bill, and indeed ask the noble Lord, Lord Grade, whether that could emerge somewhere in the guidance. That is what young people want.
My Lords, I strongly support the amendments in the name of my noble friend Lord Knight and others in this group.
We cannot entirely contain harmful, misleading and dangerous content on the internet, no matter how much we strengthen the Bill. It is therefore imperative that we place a new duty on category 1 and category 2A services requiring them to put in place measures to promote the media literacy of users so that they can use the service safely.
I know that Ofcom takes the issue of media literacy seriously, but it is regrettable that the Government have dropped their proposal for a new media literacy duty for Ofcom. So far, I see no evidence that the platforms take media literacy seriously, so they need to be made to understand that they have corporate social responsibilities towards their clients.
Good media literacy is the first line of defence from bad information and the kind of misinformation we have discussed in earlier groups. Schools are trying to prepare their pupils to understand that the internet can peddle falsehoods as well as useful facts, but they need support, as the noble Baroness, Lady Kidron, just said. We all need to increase our media literacy, especially with the increasing use of artificial intelligence, as it can make the difference between decisions based on sound evidence and decisions based on poorly informed opinions that can harm health and well-being, social cohesion and democracy.
In 2022, Ofcom found that a third of internet users are unaware of the potential for inaccurate or biased information online, and that 61% of social media users who say they are confident in judging whether online content is true or false actually lack the skills to do so, as my noble friend Lord Knight has pointed out.
Amendment 91 would mean that platforms have to instigate measures to give users an awareness and understanding of the nature and characteristics of the content that may be on the service, its potential impact and how platforms operate. That is a sensible and practical request that is not beyond the ability of companies to provide, and it will be to everyone’s benefit.
Online Safety Bill Debate
Baroness Healy of Primrose Hill (Labour - Life peer)
(1 year, 7 months ago)
Lords Chamber

I absolutely agree. Of course, good law is a good system, not a good person.
I turn to the comments that I was going to make. Uncharacteristically, I am a little confused about this issue and I would love the Minister’s help. My understanding, on reading the Bill very closely, is that self-harm and suicide content that meets a legal definition will be subject to the priority illegal content duties. In the case of children, we can safely anticipate that content of this kind will be named primary priority content. Additionally, if such content is against the terms of service of a regulated company, the company can be held to those terms. Category 1 services will have to provide a user empowerment tool so that an adult user can toggle such content out if they wish. That is my understanding of where this content has already been dealt with in the Bill. To my mind, this leaves the following ways in which suicide and self-harm material, which is the subject of this group of amendments, is not covered by the Bill. That is what I would like the Minister to confirm, and I absolutely stand by to be corrected.
In the case of adults, if self-harm and suicide material does not meet a bar of illegal content and the service is not category 1, there is no mechanism to toggle it out. Ofcom has no power to require a service to ensure tools to toggle self-harm and suicide material out by default. This means that self-harm and suicide material can be as prevalent as they like—pushed, promoted and recommended, as I have just explained—if it is not contrary to the terms of service, so long as it does not reach the bar of illegal content.
Search services are not subject to these clauses—I am unsure about that. In the case of both children and adults, if self-harm and suicide material is on blogs or services with limited functionality, it is out of scope of the Bill and there is absolutely nothing Ofcom can do. For non-category 1 services—the majority of services which claim that an insignificant number of children access their site and thus that they do not have to comply with the child safety duties—there are no protections for a child against this content.
I put it like that because I believe that each of the statements I just made could have been fixed by amendments already discussed during the past six days in Committee. We are currently planning to leave many children without the protection of the safety duties, to leave vulnerable adults without even the cover of default protections against material that has absolutely no public interest and to leave companies to decide whether to promote or use this material to fuel user engagement—even if it costs well-being and lives.
I ask the Minister to let me know if I have misunderstood, but I think it is really quite useful to see what is left once the protections are in place, rather than always concentrating on the protections themselves.
My Lords, I support the noble Baroness, Lady Finlay of Llandaff, in her Amendment 96 and others in this group. The internet is fuelling an epidemic of self-harm, often leading to suicide among young people. Thanks to the noble Baroness, Lady Kidron, I have listened to many grieving families explaining the impact that social media had on their beloved children. Content that provides detailed instructions for methods of suicide, or challenges or pacts that seek agreement to undertake mutual acts of suicide or deliberate self-injury, must be curtailed, or platforms must be made to warn and protect vulnerable adults.
I recognise that the Government acknowledge the problem and have attempted to tackle it in the Bill with the new offence of encouraging or assisting serious self-harm and suicide and by listing it as priority illegal content. But I agree with charities such as Samaritans, which says that the Government are taking a partial approach by not accepting this group of amendments. Samaritans considers that the types of suicide and self-harm content that are legal but unequivocally harmful include information, depictions, instructions and advice on methods of self-harm or suicide; content that portrays self-harm and suicide as positive or desirable; and graphic descriptions or depictions of self-harm and suicide.
With the removal of regulation of legal but harmful content, much suicide and self-harm content can remain easily available, and platforms will not even need to consider the risk that such content could pose to adult users. These amendments aim to ensure that harmful self-harm and suicide content is addressed across all platforms and search services, regardless of their functionality or reach, and, importantly, for all persons regardless of age.
In 2017 an inquiry into suicides of young people found suicide-related internet use in 26% of deaths in under-20s and 13% of deaths in 20 to 24 year-olds. Three-quarters of people who took part in Samaritans’ research with Swansea University said that they had harmed themselves more severely after viewing self-harm content online, as the noble Baroness, Lady Finlay, pointed out. People of all ages can be susceptible to harm from this dangerous content. There is shocking evidence that between 2011 and 2015, 151 patients who died by suicide were known to have visited websites that encouraged suicide or shared information about methods of harm, and 82% of those patients were over 25.
Suicide is complex and rarely caused by one thing. However, there is strong evidence of associations between financial difficulties, mental health and suicide. People on the lowest incomes have a higher suicide risk than those who are wealthier, and people on lower incomes are also the most affected by rising prices and other types of financial hardship. In January and February this year Samaritans saw the highest percentage of first-time phone callers concerned about finance or unemployment—almost one in 10 calls for help in February. With the cost of living crisis and growing pressure on adults to cope with stress, it is imperative that the Government urgently bring in these amendments to help protect all ages from harmful suicide and self-harm content by putting a duty on providers of user-to-user services to properly manage such content.
A more comprehensive online safety regime for all ages will also increase protections for children, as research has shown that age verification and restrictions across social media and online platforms are easily bypassed by them. As the Bill currently stands, there is a two-tier approach to safety which can still mean that children may circumvent safety controls and find this harmful suicide and self-harm content.
Finally, user empowerment duties that we debated earlier are no substitute for regulation of access to dangerous suicide and self-harm online content through the law that these amendments seek to achieve.
My Lords, I thank the noble Baroness, Lady Finlay, for introducing the amendments in the way she did. I think that what she has done, and what this whole debate has done, is to ask the question that the noble Baroness, Lady Kidron, posed: we do not know yet quite where the gaps are until we see what the Government have in mind in terms of the promised new offence. But it seems pretty clear that something along the lines of what has been proposed in this debate needs to be set out as well.
One of the most moving aspects of being part of the original Joint Committee on the draft Bill was the experience of listening to Ian Russell and the understanding, which I had not come across previously, of the sheer scale of the kind of material that has been the subject of this debate on suicide and self-harm encouragement. We need to find an effective way of dealing with it and I entirely take my noble friend’s point that this needs a combination of protectiveness and support. I think the combination of these amendments is designed to do precisely that and to learn from experience through having the advisory committee as well.
It is clear that, by itself, user empowerment is just not going to be enough in all of this. I think that is the bottom line for all of us. We need to go much further, and we owe a debt to the noble Baroness, Lady Finlay, for raising these issues and to the Samaritans for campaigning on this subject. I am just sorry that my noble friend Lady Tyler cannot be here because she is a signatory to a number of the amendments and feels very strongly about these issues as well.
I do not think I need to unpack a great deal of the points that have been made. We know that suicide is a leading cause of death in males under 50 and females under 35 in the UK. We know that so many of the deaths are internet-related and we need to find effective methods of dealing with this. These are meant to be practical steps.
I take the point of the noble Baroness, Lady Fox, not only that it is a social problem of some magnitude but that the question of definitions is important. I thought she strayed well beyond where the definition of “self-harm” actually lies, but one could discuss that. The point of the noble Baroness, Lady Kidron, that we want good law, not reliance on good people, was also about definitions. We cannot just leave it to the discretion of an individual, however good they may be, moderating on a social media platform.
My Lords, I strongly support Amendment 97 in the name of the noble Baroness, Lady Morgan. We must strengthen the Bill by imposing an obligation on Ofcom to develop and issue a code of practice on violence against women and girls. This will empower Ofcom and guide services in meeting their duties in regard to women and girls, and encourage them to recognise the many manifestations of online violence that disproportionately affect women and girls.
Refuge, the domestic abuse charity, has seen a growing number of cases of technology-facilitated domestic abuse in recent years. As other noble Lords have said, this tech abuse can take many forms but social media is a particularly powerful weapon for perpetrators, with one in three women experiencing online abuse, rising to almost two in three among young women. Yet the tech companies have been too slow to respond. Many survivors are left waiting weeks or months for a response when they report abusive content, if indeed they receive one at all. It appears that too many services do not understand the risks and nature of VAWG. They do not take complaints seriously and they think that this abuse does not breach community standards. A new code would address this with recommended measures and best practice on the appropriate prevention of and response to violence against women and girls. It would also support the delivery of existing duties set out in the Bill, such as those on illegal content, user empowerment and child safety.
I hope the Minister can accept this amendment, as it would be in keeping with other government policies, such as the strategic policing requirement, which requires police forces to treat violence against women and girls as a national threat. Adding this code would help to meet the Government’s national and international commitments to tackling online VAWG, such as the tackling VAWG strategy and the Global Partnership for Action on Gender-Based Online Harassment and Abuse.
The Online Safety Bill is a chance to act on tackling the completely unacceptable levels of abuse of women and girls by making it clear through Ofcom that companies need to take this matter seriously and make systemic changes to the design and operation of their services to address VAWG. It would allow Ofcom to add this as a priority, as mandated in the Bill, rather than leave it as an optional extra to be tackled at a later date. The work to produce this code has already been done thanks to Refuge and other charities and academics who have produced a model that is freely available and has been shared with Ofcom. So it is not an extra burden and does not need to delay the implementation of the Bill; in fact, it will greatly aid Ofcom.
The Government are to be congratulated on their amendment to include controlling or coercive behaviour in their list of priority offences. I will congratulate them further if they accept this valuable Amendment 97.
My Lords, I start by commending my noble friend Lady Morgan on her clear introduction to this group of amendments. I also commend the noble Baroness, Lady Kidron, on her powerful speech.
From those who have spoken so far, we have a clear picture of the widespread nature of some of the abuse and offences that women experience when they go online. I note from what my noble friend Lady Morgan said that there is widespread support from a range of organisations outside the Committee for this group of amendments. She also made an important and powerful point about the potential chilling effect of this kind of activity on women, including women in public life, being able to exercise their right to freedom of expression.
I feel it is important for me to make it clear that—this is an obvious thing—I very much support tough legal and criminal sanctions against any perpetrator of violence or sexual abuse against women. I really do understand and support this, and hear the scale of the problem that is being outlined in this group of amendments.
Mine is a dissenting voice, in that I am not persuaded by the proposed solution to the problem that has been described. I will not take up a lot of the Committee’s time, but any noble Lords who were in the House when we were discussing a group of amendments on another piece of legislation earlier this year may remember that I spoke against making misogyny a hate crime. My reason then was similar: I feel somewhat nervous about introducing a code of conduct which is directly relevant to women. I do not like the idea of trying to address some of these serious problems by separating women from men. Although I know it is not the intention of a code such as this or of any such measures, I feel that it perpetuates a sense of division between men and women. I do not like the idea that we live in a society where we try to address problems by isolating or categorising ourselves into different groups, emphasising a sense of weakness and victimhood in the face of attack or offence from another group, and assuming that everybody in the other group will be a perpetrator of some kind of attack, criticism or violence against us.
My view is that, in a world where we see some of this serious activity happening, we should do more to support young men and boys to understand the proper expectations of them. When we get to the groups of amendments on pornography and what more we can do to prevent children’s access to it, I will be much more sympathetic. Forgive me if this sounds like motherhood and apple pie, but I want us to try to generate a society where basic standards of behaviour and social norms are shared between men and women, young and old. I lament how so much of this has broken down, and a lot of the problems we see in society are the fault of political and—dare I say it?—religious leaders not doing more to promote some of those social norms in the past. As I said, I do not want us to respond to the situation we are in by perpetuating more divisions.
I look forward to hearing what my noble friend the Minister has to say, but I am nervous about the solution proposed in the amendments.
Online Safety Bill Debate
Baroness Healy of Primrose Hill (Labour - Life peer)
(1 year, 7 months ago)
Lords Chamber

My Lords, I have put my name to Amendment 220E so that the Internet Watch Foundation is duly recognised for its work and there is clarity about its role in the future regulatory landscape. So far, no role has been agreed with Ofcom. This could have a detrimental effect on the vital work of the IWF in combating the proliferation of child sexual abuse images and videos online.
As other noble Lords have said, the work of the IWF in taking down the vile web pages depicting sexual abuse of children is vital to stemming this tide of abuse on the internet. Worryingly, self-generated images of children are on the rise, and now account for two-thirds of the content that is removed by the IWF. Seven to 10 year-olds are now the fastest-growing age group appearing in these images. As the noble Baroness, Lady Morgan, said, girls appear in 96% of the imagery the IWF removes from the internet—up almost 30 percentage points from a decade ago. The abuse of boys is also on the rise. In the past year the IWF has seen a 138% increase in images involving them, often linked to sexual extortion.
This amendment attempts to clarify the future role of the IWF, so we await the response from the Government with interest. Tackling this growing plague of child sexual abuse is going to take all the expert knowledge that can be found, and Ofcom would be strengthened in its work by formally co-operating with the IWF.
Briefly, I also support Amendment 226, in the name of my noble friend Lord Knight, to require Ofcom to establish an advocacy body for children. I raised this at Second Reading, as I believe that children must be represented not just by the Children’s Commissioner, welcome though that is, but by a body that actively includes them, not just speaks for them. The role of the English Children’s Commissioner as a statutory consultee is not an alternative to advocacy: it is narrowly focused on inputting into the codes of practice at the start of the regulatory cycle, not on acting as an ongoing channel for children’s experiences online.
This body would need to be UK-wide, with dedicated staff to listen to children consistently through research projects and helplines. It would be able to monitor new harms and rapidly identify emerging risks through its direct, continual contact with children, assisting Ofcom and strengthening its ability to keep up with new technology. It would share insights with the regulator to ensure that decisions are based on a live understanding of children’s safety online and act as an early warning system. Establishing such a body would increase trust in Ofcom’s ability to stay in touch with those it needs to serve and be recognised by the tech companies as a voice for children.
There must be a mechanism that ensures children’s interests and safety online are promoted and protected. Children have a right to participate fully in the digital world and have their voices heard, so that tech companies can design services that allow them to participate in an age-appropriate way to access education, friendships and entertainment in a safe environment, as the Bill intends. One in three internet users is a child; their rights cannot be ignored.
My Lords, I support Amendment 220E in the names of my noble friend Lord Clement-Jones and the noble Baroness, Lady Morgan of Cotes. I also support the amendments in the name of the noble Baroness, Lady Kidron, and Amendment 226, which deals with children’s mental health.
I have spoken on numerous occasions in this place about the devastating impact child sexual abuse has and how it robs children of their childhoods. I am sure everyone here will agree that every child has the right to a childhood free of sexual exploitation and abuse. That is why I am so passionate about protecting children from some of the most shocking and obscene harm you can imagine. In the case of this amendment and child sexual abuse, we are specifically talking about crimes against children.
The Internet Watch Foundation is an organisation I am proud to support as one of its parliamentary champions, because its staff are guardian angels who work tirelessly beyond the call of duty to protect children. In April 2019, I was honoured to host the IWF’s annual report here in Parliament. I was profoundly shocked and horrified by what I heard that day and in my continued interactions with the IWF.
That day, the IWF told the story of a little girl called Olivia. Olivia was just three years old when IWF analysts saw her. She was a little girl, with big green eyes and golden-brown hair. She was photographed and filmed in a domestic setting. This could have been any bedroom or bathroom anywhere in the country, anywhere in the world. Sadly, it was her home and she was with somebody she trusted. She was in the hands of someone who should have been there to look after her and nurture her. Instead, she was subjected to the most appalling sexual abuse over several years.
The team at the IWF have seen Olivia grow up in these images. They have seen her be repeatedly raped, and the torture she was subjected to. They tracked how often they saw Olivia’s images and videos over a three-month period. She appeared 347 times: on average, five times every single working day. In three in five of those images, she was being raped and tortured. Her imagery has also been identified as being distributed on commercial websites, where people are profiting from this appalling abuse.
I am happy to say that Olivia, thankfully, was rescued by law enforcement in 2013 at the age of eight, five years after her abuse began. Her physical abuse ended when the man who stole her childhood was imprisoned, but those images remain in circulation to this day. We know from speaking with adult survivors who have experienced revictimisation that it is the mental torture that blights lives and has an impact on their ability to leave their abuse in the past.
This Bill is supposed to help children like Olivia—and believe you me, she is just one of many, many children. The scale of these images in circulation is deeply worrying. In 2022, the IWF removed a record 255,000 web pages containing images of the sexual abuse and exploitation of children. Each one of these web pages can contain anything from one individual image of a child like Olivia to thousands.
The IWF’s work is vital in removing millions of images from the internet each and every year, day in, day out. These guardian angels work tirelessly to stop this. As its CEO Susie Hargreaves often tells me, the world would be a much better place if the IWF did not have to exist, because this would mean that children were not suffering from sexual abuse or having such content spread online. But sadly, there is a need for the IWF. In fact, it is absolutely vital to the online safety landscape in the UK. As yet, this Bill does not go anywhere near far enough in recognising the important contribution the IWF has to make in implementing this legislation.
Victims of sexual abuse rely upon the IWF to protect and fight for them, safe in the knowledge that the IWF is on their side, working tirelessly to prevent millions of people potentially stumbling across their images and videos. This amendment is so important because, as my noble friend said, any delay in establishing the roles and responsibilities of organisations like the IWF in working with Ofcom under the regulatory regime risks leaving a vacuum in which the risks to children like Olivia will only increase further.
I urge the Government to take action to ensure that Ofcom clarifies how it intends to work with the Internet Watch Foundation and acknowledges the important part it has to play. We are months away from the Bill finally receiving Royal Assent. For children like Olivia, it cannot come soon enough; but it will not work as well as it could without the involvement of the Internet Watch Foundation. Let us make sure that we get this right and safeguard our children by accepting this amendment.
Online Safety Bill Debate
Baroness Healy of Primrose Hill (Labour - Life peer)
(1 year, 6 months ago)
Lords Chamber

My Lords, I am very pleased to support the noble Baroness, Lady Kidron, with these amendments. I also welcome the fact that we have, I hope, reached the final day of this stage of the Bill, which means that it is getting closer to becoming an Act of Parliament. The amendments to these clauses are a very good example of why the Bill needs to become an Act sooner rather than later.
As we heard during our earlier debates, social media platforms have for far too long avoided taking responsibility for the countless harms that children face on their services. We have, of course, heard about Molly Russell’s tragic death and heard from the coroner’s inquest report that it was on Instagram that Molly viewed some of the most disturbing posts. Despite this, at the inquest Meta’s head of health and well-being policy shied away from taking blame and claimed that the posts which the coroner said contributed to Molly’s death
“in a more than minimal way”
were, in Meta’s words, “safe”. Molly’s family and others have to go through the unthinkable when they lose their child in such a manner. Their lives can be made so much harder when they attempt to access their child’s social media accounts and activities only to be denied by the platforms.
The noble Baroness’s various amendments are not only sensible but absolutely the right thing to do. In many ways, it is a great tragedy that we have had to wait for this piece of primary legislation for these companies to be compelled to act. I understand what the noble Lord, Lord Allan, very rationally said—companies should very much welcome these amendments—but it is a great shame that they have so often not behaved better in these circumstances before now.
There is perhaps no point going into the details, because we want to hear from the Minister what the Government will propose. I welcome the fact that the Government have engaged early-ish on these amendments.
The amendments would force platforms to comply with coroners in investigations into the death of a child, have a named senior manager in relation to inquests and allow bereaved families easier access to a child’s social media account. We will have to see whether the Government’s amendments reflect that. One of the areas that the noble Baroness said had perhaps not been buttoned down is the responsibility of a named senior manager in relation to an inquest. The amendment requires that:
“If Ofcom has issued a notice to a service provider they must name a senior manager responsible for providing material on behalf of the service and to inform that individual of the consequences for not complying”.
The noble Lord, Lord Allan, set out very clearly why having a named contact in these companies is important. Bereaved families find it difficult, if not impossible, to make contact with tech companies: they get lost in the automated systems and, if they are able to access a human being, they are told that the company cannot or will not give that information. We know that different coroners have had widely differing experiences getting information from the social media platforms, some refusing altogether and others obfuscating. Only a couple of companies have co-operated fully, and in only one or two instances. Creating a single point of contact, who understands the law—which, as we have just heard, is not necessarily always straightforward, particularly if it involves different jurisdictions—understands what is technically feasible and has the authority and powers afforded to the regulator will ensure a swifter, more equitable and less distressing process.
I have set this out in detail because we will obviously hear what the Minister proposes; if it does not include a named senior manager, I hope very much that we will be able to discuss that between this and the next stage.
Social media platforms have a responsibility to keep their users safe. When they fail, they should be obligated to co-operate with families and investigations, rather than seeking to evade them. Seeing what their child was viewing online before their death will not bring that child back, but it will help families on their journey towards understanding what their young person was going through, and towards seeking justice. Likewise, ensuring that platforms comply with inquests will help to ease the considerable strain on bereaved families. I urge noble Lords to support these amendments or to listen to what the Government say. Hopefully, we can come up with a combined effort to put an end to the agony that these families have been through.
My Lords, I strongly support this group of amendments in the name of the noble Baroness, Lady Kidron, and other noble Lords. I, too, acknowledge the campaign group Bereaved Families for Online Safety, which has worked so closely with the noble Baroness, Lady Kidron, 5Rights and the NSPCC to bring these essential changes forward.
Where, sadly, a child has died and social media is thought to have played a part, families and coroners have faced years of stonewalling, often never managing to access data or information relevant to that death; this adds greatly to their grief and delays any kind of closure. We must never again see a family treated as Molly Russell’s family was treated, when it took five years of campaigning to get partial sight of material that the coroner found so distressing that he concluded it contributed to her death in a more than minimal way; nor can it be acceptable for a company to refuse to co-operate, as in the case of Frankie Thomas, where Wattpad failed to provide the material requested by the coroner on the grounds that it was not based within the UK’s jurisdiction. With the threat of a fine of only £1,000 to face, companies feel little need to comply. These amendments would mean that tech companies now had to comply with Ofcom’s information notices or face a fine of up to 10% of their global revenue.
Coroners’ powers must be strengthened by giving Ofcom the duty and power to require relevant information from companies in cases where there is reason to suspect that a regulated service provider may hold information relevant to a child’s death. Companies may not want to face up to the role they have played in the death of a child by their irresponsible recommending and pushing of violent, sexual, depressive and pro-suicide material through algorithmic design, but they need to be made to answer when requested by a coroner on behalf of a bereaved family.
Amendment 215 requires a named senior manager, a concept which, I am thankful to say, is already enshrined in the Bill, to receive and respond to an information notice from Ofcom, to ensure that a child’s information, including their interactions and behaviour and the actions of the regulated service provider, is preserved and made available. This could make a profound difference to how families are treated by these platforms in future. Too often in the past, platforms have been evasive and unco-operative, adding greatly to the inconsolable grief of bereaved parents. As Molly Russell’s father, Ian, said:
“Having lived through Molly’s extended inquest, we think it is important that in future, after the death of a child, authorities’ access to data becomes … a matter of course”
and
“A more compassionate, efficient and speedy process”.
I was going to ask the Government to accept these amendments but, having listened to the noble Baroness, Lady Kidron, I am looking forward to their proposals. We must ensure that a more humane route for families and coroners to access data relating to the death of a child is at last available in law.
My Lords, I support the amendments standing in the name of the noble Baroness, Lady Kidron, and other noble Lords. Having listened to noble Lords, I am not going to repeat what has been said. I pay my respects to the families because, as someone who is still going through the criminal justice system, I absolutely feel their anguish.
While we are talking about a digital platform, we are also talking about human lives, and that is what we have to remain focused on. I am not a techie, and all these words of the digital world sound like a foreign language to me. I am not ignorant of what noble Lords are saying, but it has made me realise that, while we have moved forward, for a lot of people and families it still feels like wading through jelly.
I want to speak about how the families will feel and how they will navigate all of these gateways to get the information they should quite rightly have about their loved ones’ lives and about what has been said about them online. Surely the platforms should have a duty of care; then perhaps we would not be here discussing these amendments. Noble Lords have spoken about the technical aspects of these amendments. By that, we mean data and the role of the coroner. As a former Victims’ Commissioner, I had many discussions with the Chief Coroner about other victims who have suffered loss. People do not understand how victims’ families feel in the courtroom: you feel alone, and I imagine these mega companies can afford far more legal representation than these families ever could.