Racist Abuse on Social Media Debate

Department: Home Office

Wednesday 14th July 2021


Commons Chamber
Victoria Atkins

My right hon. Friend alights on an important point. This power is already within the reach of internet companies. Those companies seem to think that their community rules somehow take precedence over the laws of our country, and I imagine that is the same across other countries in the world. The message to those tech companies is this: please listen to the public’s outrage at some of the posts festering on your platforms, and deal with them. It is simply not acceptable to expect players, or victims of such abuse, to deal with it themselves. The tech companies have the algorithms and no doubt the powers to intervene, and they should use them now.

Dame Margaret Hodge (Barking) (Lab) [V]

My question is a similar one. The racist abuse targeted at black footballers has been absolutely abhorrent. The tech giants could have stopped it, but they chose not to because it suits their business model. In October 2020, Mark Zuckerberg decided, literally on a whim, to remove Holocaust denial from Facebook, and he did so. In February 2021, after a public outcry, Instagram made a U-turn, changed its policy and started to regulate some direct messages containing racial abuse.

Does the Minister agree that it is not the powers or the capability of the tech giants that is lacking, but the will? Everybody knew that the Wembley final could result in a torrent of abuse, yet the online platforms chose not to plan, not to monitor and not to act. Does she further agree that if we are to turn empty rhetoric into action, it is not enough to fine the companies; the Government must legislate to hold senior executives personally to account? They should be personally liable for failing to remove harmful content from their platforms.