Online Safety Bill Debate
Baroness Bull (Crossbench, Life peer)
Debate with the Department for Digital, Culture, Media & Sport
(1 year, 10 months ago)
Lords Chamber

My Lords, no one who has heard Molly Russell’s story can be in any doubt about the need to better protect young people online, and I join others in paying tribute to her family for their tireless campaign.
As we have heard, vulnerability online does not evaporate on turning 18. Some adults will be at risk because mental illness, disability, autism, learning disabilities or even age leaves them unable to protect themselves from harm. Others will be vulnerable only at certain times, or in relation to specific issues. The “legal but harmful” provisions were not perfect, but stripping out adult safety duties—when, as the Minister himself said, three-quarters of adults are fearful of going online—is a backward step.
With category 1 services no longer required to assess risks to adults, it is hard to agree when the Minister says this will be
“a regulatory regime which has safety at its heart”.
Without risk assessments, how will platforms work out what they need to include in their terms and conditions? How will users make informed choices? How will the effectiveness of user empowerment tools be measured? Without the real-time information that risk assessments provide, how will the regulator stay on top of new risks, and advise the Secretary of State accordingly?
Instead, the Bill sets out duties for category 1 services to write and enforce their own terms and conditions—they will be “author, judge and jury”, to quote my noble friend Lady Kidron—and to provide tools that empower adult users to increase control over types of content listed at Clause 12. Harms arise and spread quickly online, yet this list is static, and it has significant gaps already. Harmful or false health content is missing, as are harms relating to body image, despite evidence linking body shaming to eating disorders, self-harm and suicide ideation. Smaller sites that target specific vulnerabilities, including suicide forums, would fall outside scope of these duties.
Describing this list as “content over which users may wish to increase control” is euphemism at its best. This is not content some might consider in poor taste, or a bit off-colour. This is content encouraging or promoting suicide, self-harm and eating disorders. It is content that is abusive or incites hate on the basis of race, ethnicity, religion, disability, sex, gender or sexual orientation, and misogynistic content, which evidence connects directly to violence against women and girls.
And yet tools to hide this content will be off by default, meaning that people at the point of crisis, those seeking advice on self-harm or starvation, will need to find and activate those settings when they may well be in an affected mental state that leaves them unable to self-protect. The complexities of addiction and eating disorders disempower choice, undermining the very basis on which Clause 12 is built.
We heard it said today that all adults, given the tools, are capable of protecting themselves from online abuse and harm. This is just not true. Of course, many adults are fortunate to be able to do so, but as my noble and expert friends Lady Hollins and Lady Finlay explained, there are many adults who, for reasons of vulnerability or capacity, cannot do so. Requiring the tools to be on by default would protect adults at risk and cause no hardship whatever to those who are not: a rational adult will be as capable of finding the off button as the one that turns them on.
Last week, Ministers defended the current approach on the basis that failing to give all users equal access to all material constitutes a chilling effect on freedom of expression. It is surely more chilling that this Bill introduces a regime in which content promoting suicide, self-harm, or racist and misogynistic abuse is deemed acceptable, and is openly available, harming some but influencing many, as long as the platform in question gives users an option to turn it off. This cannot be right, and I very much hope Ministers will go back and reconsider.
When the Government committed to making the UK the safest place in the world to be online, I find it hard to believe that this is the environment they had in mind.