Online Harms Debate
Commons Chamber
Mrs Elsie Blundell (Heywood and Middleton North) (Lab)
I thank the hon. Member for St Neots and Mid Cambridgeshire (Ian Sollom) for securing this crucial debate. Since my election, constituents in Heywood and Middleton North have repeatedly raised issues about online harms, especially as they see those who control the platforms seeking to shirk accountability at every turn. That is why we cannot discount the significance of the Online Safety Act. That critical piece of legislation—the first of its kind in placing a range of new duties on social media companies and search engines to mitigate the harms that online content can pose to our constituents—was a welcome step taken by the previous Government and implemented by this Labour Government.
Perhaps to a greater extent than in any other area of policy, we must recognise that the frontiers of online media are constantly expanding, technology is evolving, and our daily life is increasingly determined by what takes place on phones, laptops and tablets. Though the Act was immensely welcome—it goes some way towards dealing with this complex set of challenges—we cannot wait another 20 years before we come to substantively revisit this topic.
To underscore why constant adaptation to these threats is necessary, I would like to touch on three themes. First, there is the proliferation of misinformation and disinformation. Everything from the integrity of our democracy and the tone of our discourse to our continued belief in facts, evidence and science is on the line in the war being waged unrelentingly in these digital spaces, where online actors are determined to amplify falsehoods to erode a sense of public trust that has taken generations to foster. The meteoric rise of AI has made the challenge all the more pressing.
People’s behaviour is being tracked on apps, and algorithms responding to them are driving misleading and sensationalist content into the most impressionable, vulnerable and isolated minds—so many of them are young people who are growing up unable to tell fact from fiction. We know that adults are also susceptible to such trends.
This week—of all weeks, when we have seen a deeply concerning outbreak of meningitis in Canterbury and east Kent—we see misinformation and blatantly anti-science positioning rear its ugly head once again, as we saw in the covid-19 pandemic and have seen countless times since. It is a really obvious thing to say, but the onus is on us to speak with one voice as MPs on such a critical topic as public health and to confront those harmful narratives at their source.
A great deal more thinking needs to be done in digital spaces when it comes to misinformation, whether medical or otherwise. That requires strengthened regulation and real intent from the Government, Ofcom and the platforms. I am pleased that the Online Safety Act has provisions to capture myths and disinformation where they are illegal or harmful to children, but we have much further to go in curtailing the weaponisation of online platforms to spread lies, conspiracies and harmful falsehoods to millions across the country.
Secondly, I would like to speak about the protection of children. I have raised the issue of technology-assisted child sexual abuse on several occasions in this place. It needs to be tackled from both sides—the judicial and the digital—so I wholly welcome the Online Safety Act and the Government’s wider work in this area. From stopping companies like X, or AI tools like Grok, generating vile, sexualised images of children and non-consensual, intimate deepfakes to the commitment to ban nudification apps and to introduce a legal duty requiring tech platforms to remove non-consensual intimate images within 48 hours of being posted, it is clear that the Government stand firmly against those who would do our children harm.
That being said, TACSA also has further dimensions that warrant serious consideration. It can take many forms, such as the distribution of child sexual abuse material, sexual harassment, exposure to sexually explicit materials and grooming, to name a few. Despite the prevalence and seriousness of these crimes, there is an over-reliance on non-custodial sentences across our judicial landscape, with magistrates courts dominating outcomes, and gaps in the unduly lenient sentencing scheme. Online or technology-assisted child sexual abuse has profound and lasting impacts on children for their whole lives, comparable to those of physical abuse. Digital regulation and our justice system must reflect the insidiousness and seriousness of such crimes, and I would welcome the Minister’s comments on that when he concludes the debate.
Finally, I will briefly touch on how discourse in digital spaces is increasingly affecting our communities. Following the Manchester synagogue attack last year, the Centre for Countering Digital Hate identified a troubling rise in antisemitism online, where violence against the Jewish community was celebrated and further encouraged. We only need open X, Facebook or other platforms to see a disgraceful barrage of abuse levelled at our Muslim community too, with platforms giving previously fringe far-right voices the means to amplify their dangerous and divisive rhetoric to millions. The harm that these actors can inflict on the capacity of our communities to come together is being played out each and every day. All too often they can hide behind anonymous accounts, and real people—my constituents and people across the country—are having to face the consequences. I am proud to represent a diverse constituency, but I fear the power that those online have to direct actions and attitudes in real life. I hope that the Minister will touch on that pertinent topic.
I welcome this Government’s efforts to curtail online harms. Indeed, I welcome the work of any Government in doing so. Things, however, are moving at a staggering rate. We therefore cannot view the Online Safety Act 2023 simply as a job well done; rather, we should see it as another rung on a growing ladder. To keep our constituents—especially children—and our communities safe, we need to ensure that our thinking is consistent with the expanding nature of these digital spaces. Ultimately, that means recognising that, for all their utility in connecting us with one another, these platforms also have a near unlimited capacity to do people harm. I truly fear the consequences of failing to recognise that.