Online Harms Debate
Caroline Dinenage (Conservative - Gosport)
Westminster Hall
It is a pleasure to serve under your stewardship, Mr Dowd. I congratulate my right hon. Friend the Member for East Hampshire (Damian Hinds) on securing this vital and timely debate. Time is really of the essence if we are going to deliver the Online Safety Bill in this Session.
The scenario whereby the Bill falls is almost unthinkable. Thousands of man-hours have been put in by the team at the Department for Digital, Culture, Media and Sport, by the Home Office team, and by the Joint Committee on the Draft Online Safety Bill, which the Minister chaired so brilliantly. There have been successive ministerial refinements by quite a few of the people in the Chamber, and numerous parliamentary debates over many years. Most importantly, the stakes in human terms just could not be higher.
As my right hon. Friend said, that was painfully underlined recently during the inquest into Molly Russell’s death. Her story is well documented. It is stories like Molly’s that remind us how dangerous the online world can be. While it is magnificent and life-changing in so many ways, the dark corners of the internet remain a serious concern for children and scores of other vulnerable people.
Of course, the priorities of the Bill must be to protect children, to tackle serious harm, to root out illegal content and to ensure that online platforms are doing what they say they are doing in enforcing their own terms and conditions. Contrary to the lazy accusations, largely by those who have not taken the time to read this hefty piece of legislation, the Bill does not set out to restrict free speech, to protect the feelings of adult users or to somehow legislate for people’s right not to be offended.
Following on from other Members, I will talk about the legal but harmful issue. There is no easy way to define “legal but harmful”, because it is so opaque. Even the name is clunky and unappetising, as my right hon. Friend the Member for East Hampshire said. My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) sometimes uses the phrase “lawful but awful”, which often seems more appropriate, but it does not necessarily work from a legislative point of view.
If Molly Russell’s tragic case teaches us anything, it is that dreadful, harmful online content cannot be defined simply by what is strictly legal or illegal, because algorithms do not differentiate between harmless and harmful content. They see a pattern and they exploit it. They are, quite simply, echo chambers. They take our fears and our paranoia, and surround us with unhealthy voices that simply reinforce them, however dangerous or hateful they are. Fundamentally, they breadcrumb users into more content, slowly, piece by piece, cultivating an interest. They take us down a path we might not otherwise have followed—one that is seemingly harmless at the start, but that eventually is anything but.
We have a moral duty to keep children safe on online platforms, but we also have a moral duty to keep other users safe. People of all ages need to be protected from extremely harmful online content, particularly around suicide, self-harm and eating disorders, where the line between what is legal and what is illegal is so opaque. There is an inherent legal complexity in defining what legal but harmful really means.
It feels like this part of the Bill has become a lightning rod for those who think it will result in an overly censorious approach. That is a complete misinterpretation of what it seeks to achieve. Perversely, I feel that not putting protections in place would be more of a bar to freedom of speech, because at the moment users’ content can be taken down unpredictably and without any justification or redress. Others are afraid to speak up, fearing pile-on harassment and intimidation from anonymous accounts.
The fact is that this is a once-in-a-generation opportunity to make this legislation effective and meaningful.