Christmas Adjournment Debate

Department: HM Treasury

Tuesday 20th December 2016

Commons Chamber
James Berry (Kingston and Surbiton) (Con)

I start by thanking all the emergency services for their work over the Christmas period, especially those who will be working while we are enjoying time with our families. As I propose to speak on a Home Affairs issue, I pay particular tribute to the police.

I was out on a Walk the Met session with the Chessington safer neighbourhood team just last week and saw the excellent work they do for us day in, day out. Kingston is now the safest borough in London and I want to put on record my thanks to Chief Superintendent Glenn Tunstall, who retires in three days as Kingston’s borough commander with that accolade. I am pleased that I started my dealings with Chief Superintendent Tunstall with a campaign for more police officers in Kingston town centre and ended them with a campaign for automatic number plate recognition software on the A3 corridor, both of which he pushed for and our Conservative council is delivering.

Today I want to speak about a national challenge for the police—the rise of hate speech and extremism online. I will refer to Facebook and Twitter because they are the most widely used social networks, not because they are the only platforms on which these issues arise or the only companies that bear responsibility for them. Social media has revolutionised the way we communicate, the way we receive news and information and the way companies advertise. Undoubtedly, it has many social benefits and can be used as a force for good, but social media platforms are being abused by those who wish to do our society and individuals in our society grave harm.

It is important to remember at all times that these social media platforms are not established and maintained out of a sense of altruism. They are designed to make money for their owners, principally through advertising revenues. The revenues of Facebook in particular are enormous and I do not criticise the company for that.

Right now, in less than a minute, any Member of this House with an iPhone would be able to find copious amounts of hate speech on Twitter—racism, especially anti-Semitism and Islamophobia, homophobia and many other forms of discrimination, and not just language that would fall foul of the Equality Act 2010, but language that is downright abusive and would breach our criminal law.

In the Home Affairs Committee’s recent report on anti-Semitism, we outlined how a Jewish colleague received 2,500 abusive tweets over a few days using the hashtag #filthyjewbitch. Two of her abusers have already been sent to prison for this. Now there can be no dispute that that hashtag is offensive, abusive and racist, yet if one searches for that hashtag now, as I did just a few moments ago, one will find it still on Twitter, not from two hours ago or even two weeks ago, but from two years ago. I say that that is a disgrace, especially after the matter has been raised by a Committee of this House.

Although hate speech makes up a very small proportion of the overall traffic on social networking sites, we live our lives more and more online, and this speech persists there in a way that it does not in the street or in the way we speak to one another. There is a risk that it becomes normalised and gives a licence to others to repeat it and to do worse.

I turn to the other factor—extremism. The issue does not stop at hate speech. Just as social media are used by people to advertise holidays and beauty products, they are used by those who want to advertise terrorism. It is no exaggeration to say that Daesh has run the most successful propaganda campaign since Goebbels in Nazi Germany, yet Daesh has a much wider audience because of the reach of social media. It has managed to persuade people who enjoy all the rights and privileges that we enjoy in this country to travel to Syria to work with a barbarous medieval regime or to commit atrocities here in Europe, like those which we saw in Nice and appear to have seen in Berlin.

I am not going to overstate my case and blame all of this on social media, because that is certainly not the reality, but I am going to say that young people in Britain today are being radicalised in their bedrooms, and the gateway to a lot of the radical material online is the common social media platforms such as Twitter and Facebook. In addition to being a conduit through which extremists are recruited, social media are used by Daesh and its supporters to generate propaganda to attract support and funds. Social media platforms that are used by millions of our constituents every minute of every day are being abused by people who want to peddle extremism and hate. What are social media companies doing about that? The answer is, far too little. I have not heard one Member of this House demur from that proposition.

I am not sure that we, as a society, should accept the proposition that organisations such as social media companies should be allowed to create something to make money that has the potential to do harm, or at least to facilitate harm, and then claim that because it has become so big, it is unreasonable to expect them to do more to prevent that harm. I say that the polluter should pay.

Who is left to pick up the pieces? The police. With the Home Affairs Committee and the right hon. Member for Leicester East (Keith Vaz), who was then our Chair, I visited Scotland Yard to see the unit where dozens of officers spend all day every day going through Twitter, Facebook and other social networking sites to flag up this material. They do that not really for any law enforcement purpose—they are not there to apply for a court order—but merely so that they can tell Twitter, in particular, that something violates its own in-house terms of use. To its credit, Twitter often removes that material, but why should the police have to do the searching? The Committee also visited The Hague, where Europol has a similar unit for non-English language material. My question is this: why should our constituents’ taxes be used to fund our police to do the work that social media companies should be doing themselves?

My father, who passed away three years ago this week, was fond of quoting Margaret Thatcher. I have not been able to verify this quote, but she once said that she did not like people coming to her with problems but no solutions. I will therefore present three options in the few minutes remaining. The first is to consider legislation. The most straightforward approach would be to make social media companies liable for what they allow or enable to be published on their platforms. For other reasons, including libel and copyright law, that would be devastating for those companies; they do not want it to happen. The German Government announced only last week that they will consider legislating for fines of up to half a million euros for social media companies that fail to remove within 24 hours posts that breach Germany’s hate speech laws. We can be emboldened by the fact that our friends and allies in Europe are considering legislation.

The second option is to encourage social action. Social media companies rely on their members seeing the advertising from which they make money. If we voted with our feet, they would not be able to survive. If we, as users of social media—and most, if not all, of us are—made it clear that we would not stand for hate speech or extremism on those platforms, that would send a very clear message.

The third option, which I favour, is that social media companies get their own house in order, take a bit of responsibility and, for once, show some real leadership. They could establish, or at least fund, a not-for-profit organisation that employs people to identify and remove offending posts, that uses and develops their technological brilliance in order to filter out that material for manual checking, and that has police officers stationed there, paid for by that organisation, to gather intelligence and progress any cases that need legal input. There is a model for that in the National Center for Missing & Exploited Children in Washington, DC, which I have had the good fortune to visit. It is a not-for-profit organisation, funded by the technology sector—in large part by Facebook and Google—that tackles, among other things, online child exploitation. Why can that not apply to online hate speech and extremism?

I suggest that social media companies go away for Christmas, have a long, hard think and come back early in the new year with a proper proposal for an organisation of that kind, so that they can tackle online extremism and hate speech. If they do not do so, they should expect to be scrutinised in this House and for there to be concerted calls for legislation to make them do so in 2017.

On that note, I wish you, Mr Speaker, and everyone else present a merry Christmas and a happy new year.