Thursday 11th January 2018

Lords Chamber
Lord Bew (CB)

My Lords, I add my thanks to those of other noble Lords to my noble friend Lady Kidron for initiating this debate. I begin by declaring an interest as chair of the Committee on Standards in Public Life, which has a particular interest in this subject, as the role of social media took up a large part of the report that we produced before Christmas at the Prime Minister’s request on intimidation in public life.

In the committee’s review and report on intimidation in public life, we were interested in the role of social media companies in relation to illegal content, particularly threats of violence and illegal hate speech, such as racist abuse. Let me say straightaway that we recognised that, in many respects, social media is a force for good and for democratic expression, and a democratising force in our public life. In many important respects it promotes engagement with politics. None the less, the scale of the problem which confronted us was disturbing. We had to come to terms with the fact that the legislative framework governing the responsibility of social media platforms is based on the EU e-commerce directive of 2000, which was framed well before social media companies and online news platforms existed in their present form, when they were essentially fledgling bodies.

The e-commerce directive shields companies from liability for illegal content where they are simply “hosts” and where their relationship to content is “technical, automatic or passive”. This exemption from liability requires that the company does not have knowledge of the illegal content, and takes it down expeditiously if it becomes aware of it. This formed the basis for what is known as the “notice and takedown” model. Our committee took the view that it is no longer appropriate to see social media companies as mere platforms. These companies choose the format in which users can post content and they curate that content, using algorithms to analyse and select content, including for commercial benefit. This is well beyond the role of a passive host. But nor are they publishers which should be held fully responsible for all their content, because they do not approve every item that appears on their platform and they do not create the content themselves.

Our committee concluded from this that we need new categories and new ways of thinking about this problem that go beyond the platform/publisher distinction; that we need to think properly about the role and responsibility of social media companies; that there should certainly be a shift in liability towards social media companies for illegal material; and that the Government should bring forward legislation to that effect. I have to say that, when the committee started its work, this was not a conclusion we had in mind; it emerged from our many discussions during the preparation of the report.

This shift in liability could apply to particular types of content, or could be based on how difficult or expensive it is to monitor for or remove particular types of content automatically. As my committee made clear in our report, addressing intimidation, broadly across the whole of that problem, will require all those in public life to come together and work constructively.

I am very grateful to the right reverend Prelate the Bishop of Gloucester for mentioning my next point. Our committee also agrees with her that the BCS, the Chartered Institute for IT, is doing valuable work in convening discussions between the social media companies and the political parties on solutions to online abuse and its effect on the democratic process. The Committee on Standards in Public Life fully supports that work.