
Sub-Committee on Disinformation

Debate between Mike Gapes and Damian Collins
Thursday 4th April 2019


Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.

Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.

This information is provided by Parallel Parliament and does not comprise part of the official record

Mike Gapes (in the Chair)

We begin with a Select Committee statement. Damian Collins will speak on the publication of the 10th report of the Digital, Culture, Media and Sport Committee, on the launch of the Sub-Committee on Disinformation, for up to 10 minutes, during which no interventions may be taken. At the conclusion of the statement, I will call hon. Members to put questions on the subject of the statement and call Damian Collins to respond to them in turn. Hon. Members can expect to be called only once. Questions should be brief. I call the Chair of the Digital, Culture, Media and Sport Committee, Damian Collins.

Damian Collins (Folkestone and Hythe) (Con)

It is a pleasure to serve under your chairmanship, Mr Gapes. Earlier this week, the Select Committee published its 10th report on the creation of the Sub-Committee on Disinformation, which followed our reports on disinformation and fake news—the final report was published in February this year, with the interim report published in July last year. Our reports on that important subject were based on an inquiry that spanned about 18 months and that took in thousands of questions and many witnesses.

The focus on disinformation and fake news comes from our belief that there is a genuine danger to democracy and society in the deliberate and malicious targeting of disinformation at citizens, largely using social media to influence what they see and their opinions about politics, society and institutions. In the discussion about disinformation, much of the focus has been on it being used in election campaigns or around political events, but it is by no means limited to that. Disinformation is becoming a serious issue in the health sphere, in particular, with anti-vaccine information and stories being disseminated through social media.

The problem of disinformation is not limited to the period of our inquiry. When we established our initial inquiry, we were particularly concerned about the role of disinformation in the United States presidential election and other elections around the world, and about the role of foreign states and, in particular, agencies such as the Internet Research Agency in St Petersburg that deliberately create campaigns and mechanisms to spread disinformation through social media and target people relentlessly.

That has become a bigger societal problem as people increasingly get their news and information through social media. In this country, about half the population receives news principally through social media. That means that, rather than going to a curated news space, such as a newspaper, a broadcaster’s piece of news or a news organisation’s website, they are receiving news and information that has been shared by their friends on social media in bitesize chunks, or they are being targeted with information by advertisers and other organisations that promote content.

We know that, during the US presidential election, the number of shares of the top 20 fake news stories was greater than that of the top 20 real news stories. The issue is fundamental to the way people receive news and information because, on the channel where they increasingly receive it, they often do not know why they are receiving it or much about the organisation that is sending it. Disinformation is often dressed up to look like real news, but it could be hyper-partisan content from people with a high degree of bias or, more seriously, content that is totally fabricated. That has been an issue for some time, but it is of growing importance because of the scale and reach of social media.

When we look at the potential application of technology, the problem is only set to get worse, given the phenomenon of deep fake content. That is when someone takes a recording of your voice—I am sure they would not do it in your case, Mr Gapes—and creates a fake video image of you, then writes their own words and has them played out through your mouth in the film. We are all familiar with those grainy films that emerge during political campaigns whose production quality is not great because they are often shot on someone’s smartphone. Imagine the capability to do that easily in a totally fake way and to release a film of a politician supposedly saying something malicious or misleading during the final days of an election campaign. That capability exists, and we need the tools in place to fight back against it.

Since we published the Committee’s report in February, we have seen other events that lead us to believe that this is an ongoing and growing problem. We were all shocked and appalled at the way in which harmful footage from the terrorist attack in Christchurch, New Zealand, was livestreamed on Facebook and shared continuously on social media platforms around the world, and particularly YouTube, for a number of days afterwards.

We are also concerned about the role of organisations that spread news and information about political events in this country—this is particularly linked to Brexit—but that we do not know much about. The Committee’s inquiry identified an organisation called Mainstream Network, which was contacting people through social media with adverts and asking them to lobby their MP to vote in favour of a hard Brexit and to “Chuck Chequers”—to use the expression at the time—and not support the Prime Minister’s negotiating strategy.

People have a right to a political opinion, and there is nothing wrong with that, but when they are being targeted by an organisation and they do not know who is doing that, who is providing the money or who is supporting that organisation, that becomes a problem. In our campaigns as politicians, we have to put legal imprints on our leaflets, posters and flyers to make it clear who they are from, but people do not have to do that online, and those loopholes are being exploited. We have also seen campaigns and organisations other than just Mainstream Network, such as We are the 52% and Britain’s Future, where large amounts of money are being spent to target people with messaging, but we do not know who is doing that. That is going on all the time and on a growing scale.

The purpose of the Sub-Committee is to provide an institutional home for the Select Committee to build on the work of its initial inquiry, to look at new incidents of disinformation campaigns, where they lack transparency and where they are deliberately misleading, and to recognise that this is a form of harmful content that needs to be addressed. We look forward to the publication of the Government’s White Paper on online harms, which I believe will happen early next week, so that we can see what ideas they propose and understand more about their response to the Select Committee report, which covered many of those issues. The Sub-Committee will look at the issues arising from the White Paper and at the areas where the Government are looking for a response and consultation.

Mike Gapes (in the Chair)

Order. Interventions are not allowed.

Damian Collins

The Sub-Committee will be the forum through which we look for areas where the Committee can analyse and respond to the White Paper. It will also be the forum through which we seek to hold regular sessions with important organisations and people who are investigating similar issues, and particularly the Information Commissioner.

The first meeting of the new Sub-Committee will be on Tuesday 23 April when we return from the short Easter recess. We will then question the Information Commissioner, principally about her investigation into the work of Mainstream Network and connected organisations, to understand more about who is funding that organisation and who is behind the dissemination of the content that it is sharing. That will be an important first step in the Sub-Committee’s work.

I appreciate that hon. Members have questions that they want to ask me—one of my Committee colleagues wished to jump the gun—so I will not use up every second of my 10 minutes. The Sub-Committee is a new step for the Digital, Culture, Media and Sport Committee, which has never created a Sub-Committee before. We have done so because we recognise the concerns about the spread of disinformation and the pivotal role that social media play in that.

Disinformation is a growing issue for democracy and society, and we need to provide robust public policy responses to tackle it at source, as well as through the channels through which it is shared. We also need to look principally at the responsibilities of big technology companies to act more effectively against the dissemination of disinformation, to provide more tools for their users to help them identify untrustworthy sources of information, and to provide greater transparency about who is promoting that content.

--- Later in debate ---
Mike Gapes (in the Chair)

That is fine, but we have limited time, because we have another statement and then a normal debate after that. Thank you very much. Damian Collins, did you wish to respond?

Damian Collins

I will respond briefly. To add to the Minister’s comments, we have all benefited enormously from the work of Elizabeth Denham and the ICO. It has demonstrated that it is one of the world-leading organisations in its field, and the fact that it has invested so much of its time into this area has helped enormously. This was an extremely long inquiry, and I place on record my thanks to all the Committee Clerks, particularly Chloe Challender the Committee Clerk and Josephine Willows the Committee specialist. They worked tirelessly, well above and beyond the call of duty, to support the Committee in its investigations.

The Minister has touched on some important issues. We discussed closed groups earlier, which are an important mechanism for allowing content to be shared virally and at great speed, particularly on Facebook. That sharing can be done not just through advertising, but through those closed groups. We know that social media platforms can observe what is going on in closed groups, and part of their responsibility should be to monitor that activity, particularly if those groups are being used to spread harmful content.

Encrypted media is also an important issue, and I have some concerns about the vision that Mark Zuckerberg has set out for Facebook, effectively bringing Facebook, Instagram and WhatsApp together. If that means all content being shared through encrypted channels, it would give the platforms an excuse to say that, because they cannot see what is being shared, they have no responsibility for it. I do not think that is acceptable, especially when those platforms will be using data gathered about their users to help facilitate contact via encrypted channels, and will still have a good understanding of what is going on. That is why the idea of a regulatory system is such an important step forward. As we have seen from the way Ofcom works with broadcasters, we need a regulator that has statutory powers—the power to go in and investigate, with the backing of Parliament—and the flexibility to look at new challenges as they arise and establish new standards for what is a responsible, ethical and acceptable way of working.

Elsewhere in the world, encrypted channels are increasingly becoming the principal mechanism for sharing information in election campaigns, particularly WhatsApp in India and Brazil. In any country that has a smartphone-connected electorate—as so many countries now do—sharing of political information through encrypted media will be an increasingly big problem. In our report, we tried to address many of the issues that exist today, and there are things that we can get on and deal with now. However, we may look back in five years’ time and say that, even having done all those things, the challenge of responding to disinformation being spread through encrypted media is one we still have to crack. We cannot leave that challenge to the tech companies on their own; we cannot leave it to them to solve that problem for us. We need to establish a clear legal framework, whereby it is clear what duty of care and responsibility tech companies have to ensure that their technology is not abused by people who seek to do others harm.