Online Safety Bill Debate
Baroness Hodge of Barking (Labour - Life peer)
(2 years, 7 months ago)
Commons Chamber

Thank you, Madam Deputy Speaker. I hope that I will take only three minutes.
The human cost of abuse on the internet is unquantifiable—from self-harm to suicide, grooming to child abuse, and racism to misogyny. A space we thought gave the unheard a legitimate voice has become a space where too many feel forced to stay offline. As a Jewish female politician online, I have seen my identities perversely tied together to discredit my character and therefore silence my voice. I am regularly accused of being a “Zionist hag”, a “paedophile” and a “Nazi”. But this is not just about politicians. We all remember the tsunami of racism following the Euros, and we know women are targeted more online than men.

Social media firms will not tackle this because their business model encourages harmful content. Nasty content attracts more traffic; more traffic brings more advertising revenue; and more revenue means bigger profits.

Legislation is necessary to make the social media firms act. However, this Bill will simply gather dust if Ofcom and the police remain underfunded. The “polluter pays” principle—that is, securing funding through a levy on the platforms—would be much fairer than taxpayers picking up the bill for corporate failures.
I cherish anonymity for whistleblowers and domestic violence victims—it is vital—but when it is used as a cloak to harm others, it should be challenged. The Government’s halfway measure allows users to choose to block anonymous posts by verifying their own identity. That ignores police advice not to block abusive accounts, as those accounts help to identify genuine threats to individuals, and it ignores the danger of giving platforms the power to verify identities—we should remember the Cambridge Analytica scandal. Surely a third party with experience in unique identification should carry out checks on users. Then we would all remain anonymous to the platforms, but could be traced by law enforcement if found guilty of harmful abuse. We could then name and shame offenders.
On director liability, fines against platforms become a business cost and will not change behaviour, so personal liability is a powerful deterrent. However, enforcing this liability only when a platform fails to supply information to Ofcom is feeble. Directors must be made liable for breaching safety duties.
Finally, as others have said, most regulations apply only to category 1 platforms. Search engines fall through the cracks; BitChute, Gab, 4chan—all escape, yet as we saw in the attacks on Pittsburgh’s synagogue and Christchurch’s mosque, all these platforms helped to foster those atrocities. Regulation must be based on risk, not size. Safety should be embedded in any innovative product, so concern about over-regulating innovation is misplaced. This is the beginning of a generational change. I am grateful to Ministers, because I do think they have listened. If they continue to listen, we can make Britain the safest place online.