Draft Online Safety Bill Report

Baroness Hodge of Barking Excerpts
Thursday 13th January 2022

Commons Chamber
Dame Margaret Hodge (Barking) (Lab)

I congratulate the hon. Member for Folkestone and Hythe (Damian Collins) and the members of his Committee on bringing forward an incredibly thorough and very good report. I know Ministers have been consulting well with all Back Benchers, and I hope they do not pay lip service to the report’s conclusions, but really take on its important recommendations. What is interesting about this whole debate is that there is a broad consensus on the Back Benches. None of us are bound by ideology on these issues; our approach is based on our experience, the data and the wide body of research.

I will also say at the outset that the business model of the platforms means that they will never tackle this themselves. They make their money by encouraging traffic on their platforms, and they encourage that traffic by allowing abusive content to exist there. Their algorithms serve almost to promote and encourage more abusive content. The idea that there can be any self-regulation in the legislation to be proposed by the Government is false.

I will draw attention to three sets of issues in the short time available to me. The first, the recommendations on paid-for scams and frauds, has already been discussed. It is ridiculous that user-generated content can be subject to regulation but that paid-for scams and frauds cannot be. Everybody who gave evidence to the Committee, including the Financial Conduct Authority, pleaded for their inclusion. The figure I have is from Action Fraud: 85% of the £1.7 billion lost to fraudulent scams in the past year resulted from cyber-enabled frauds. During the pandemic, that figure of course exploded. Again, there is no incentive for the platforms to do anything about this. They are paid for the advertisements, so they wish to encourage them. Indeed, there is a double benefit for them in this particular space, because the FCA also pays them to prioritise legitimate websites over the scam ads, so again self-regulation will not work. I know that Ministers support the proposal, and I hope that they are not swayed by advice that it is not legally possible, as I just do not accept that. I hope that they do not miss this opportunity by settling for promises of legislation down the line.

Stephen Timms (East Ham) (Lab)

I very much agree with the point my right hon. Friend is making and with the recommendation in the report. I wonder whether she noticed that the Prime Minister told the Liaison Committee in July that

“one of the key objectives of the Online Safety Bill is to tackle online fraud.”

Does she agree that it cannot possibly do that if it misses out scam adverts?

Dame Margaret Hodge

I completely agree with my right hon. Friend on that, and I hope that the Minister will confirm that he will include this in the legislation.

The second issue I wish to raise relates to anonymity. No one wants to undermine anonymity; we all recognise that it is crucial for whistleblowers, for victims of domestic violence or child abuse, and for others. But we do want to tackle anonymous abuse. Sadly, most of the vile abuse that appears online is anonymous, as we have seen in the spreading of disinformation, particularly in relation to the pandemic. I have seen it in my own experience, and it really undermines my right to participate in democratic debate. If people paint someone online as a terrible person, as a hypocrite or as a hateful, wicked woman, which is what they do with me, that person is then not trusted on anything, and their voice is therefore shut down in the democratic debate.

What we are all after is not tackling anonymity but ensuring third-party verification of people's identities, so that they can be traced if and when they put abusive content online. The proposal that came from the Law Commission, and which one of the four ex-Culture Secretaries who have worked on this issue has diligently pursued, to introduce a new offence to tackle serious online harms more effectively, is very important. It is about shifting the focus from content to the effects of the online harm.

My third point relates to director liability. All my experience of working in the field of tackling illicit finance and economic crime demonstrates to me that unless we introduce director liability for wrongdoing that occurs through the actions of individuals associated with a company, we will not change the behaviour of those companies. Even fines of £50 million are not significant against Facebook’s gross revenue of more than £29 billion. I do not understand why we have to wait two years to implement director liability, as it could be done immediately. I would be grateful if the Minister said that he will implement it.

The last thing I should say, in my final seconds, is on anonymity. I would like the Minister simply to confirm this afternoon whether he will tackle anonymous abuse and put in place the Law Commission’s proposals. What is the timeframe for that? I very much welcome the report and commend all those who worked so hard to put it together, and I hope we can make swift progress on a problem that is growing in British society and that is undermining, not supporting, democracy.