Thursday 4th April 2019

Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.

Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.

Digital, Culture, Media and Sport Committee
Select Committee statement
16:19
Mike Gapes (in the Chair)

We begin with a Select Committee statement. Damian Collins will speak on the publication of the 10th report of the Digital, Culture, Media and Sport Committee, on the launch of the Sub-Committee on Disinformation, for up to 10 minutes, during which no interventions may be taken. At the conclusion of the statement, I will call hon. Members to put questions on the subject of the statement and call Damian Collins to respond to them in turn. Hon. Members can expect to be called only once. Questions should be brief. I call the Chair of the Digital, Culture, Media and Sport Committee, Damian Collins.

Damian Collins (Folkestone and Hythe) (Con)

It is a pleasure to serve under your chairmanship, Mr Gapes. Earlier this week, the Select Committee published its 10th report on the creation of the Sub-Committee on Disinformation, which followed our reports on disinformation and fake news—the final report was published in February this year, with the interim report published in July last year. Our reports on that important subject were based on an inquiry that spanned about 18 months and that took in thousands of questions and many witnesses.

The focus on disinformation and fake news comes from our belief that there is a genuine danger to democracy and society in the deliberate and malicious targeting of disinformation at citizens, largely using social media to influence what they see and their opinions about politics, society and institutions. In the discussion about disinformation, much of the focus has been on it being used in election campaigns or around political events, but it is by no means limited to that. Disinformation is becoming a serious issue in the health sphere, in particular, with anti-vaccine information and stories being disseminated through social media.

The problem of disinformation is not limited to the period of our inquiry. When we established our initial inquiry, we were particularly concerned about the role of disinformation in the United States presidential election and other elections around the world, and about the role of foreign states and, in particular, agencies such as the Internet Research Agency in St Petersburg that deliberately create campaigns and mechanisms to spread disinformation through social media and target people relentlessly.

That has become a bigger societal problem as people increasingly get their news and information through social media. In this country, about half the population receives news principally through social media. That means that, rather than going to a curated news space, such as a newspaper, a broadcaster’s piece of news or a news organisation’s website, they are receiving news and information that has been shared by their friends on social media in bitesize chunks, or they are being targeted with information by advertisers and other organisations that promote content.

We know that, during the US presidential election, the number of shares of the top 20 fake news stories was greater than that of the top 20 real news stories. The issue is fundamental to the way people receive news and information because, on the channel where they increasingly receive it, they often do not know why they are receiving it or much about the organisation that is sending it. Disinformation is often dressed up to look like real news, but it could be hyper-partisan content from people with a high degree of bias or, more seriously, content that is totally fabricated. That has been an issue for some time, but it is of growing importance because of the scale and reach of social media.

When we look at the potential application of technology, the problem is only set to get worse, given the phenomenon of deep fake content. That is when someone takes a recording of your voice—I am sure they would not do it in your case, Mr Gapes—and creates a fake video image of you, then writes their own words and has them played out through your mouth in the film. We are all familiar with those grainy films that emerge during political campaigns whose production quality is not great because they are often shot on someone’s smartphone. Imagine the capability to do that easily in a totally fake way and to release a film of a politician supposedly saying something malicious or misleading during the final days of an election campaign. That capability exists, and we need the tools in place to fight back against it.

Since we published the Committee’s report in February, we have seen other events that lead us to believe that this is an ongoing and growing problem. We were all shocked and appalled at the way in which harmful footage from the terrorist attack in Christchurch, New Zealand, was livestreamed on Facebook and shared continuously on social media platforms around the world, and particularly YouTube, for a number of days afterwards.

We are also concerned about the role of organisations that spread news and information about political events in this country—this is particularly linked to Brexit—but that we do not know much about. The Committee’s inquiry identified an organisation called Mainstream Network, which was contacting people through social media with adverts and asking them to lobby their MP to vote in favour of a hard Brexit and to “Chuck Chequers”—to use the expression at the time—and not support the Prime Minister’s negotiating strategy.

People have a right to a political opinion, and there is nothing wrong with that, but when they are being targeted by an organisation and they do not know who is doing that, who is providing the money or who is supporting that organisation, that becomes a problem. In our campaigns as politicians, we have to put legal imprints on our leaflets, posters and flyers to make it clear who they are from, but people do not have to do that online, and those loopholes are being exploited. We have also seen campaigns and organisations other than just Mainstream Network, such as We are the 52% and Britain’s Future, where large amounts of money are being spent to target people with messaging, but we do not know who is doing that. That is going on all the time and on a growing scale.

The purpose of the Sub-Committee is to provide an institutional home for the Select Committee to build on the work of its initial inquiry, to look at new incidents of disinformation campaigns, where they lack transparency and where they are deliberately misleading, and to recognise that this is a form of harmful content that needs to be addressed. We look forward to the publication of the Government’s White Paper on online harms, which I believe will happen early next week, so that we can see what ideas they propose and understand more about their response to the Select Committee report, which covered many of those issues. The Sub-Committee will look at the issues arising from the White Paper and at the areas where the Government are looking for a response and consultation.

Mike Gapes (in the Chair)

Order. Interventions are not allowed.

Damian Collins

The Sub-Committee will be the forum through which we look for areas where the Committee can analyse and respond to the White Paper. It will also be the forum through which we seek to hold regular sessions with important organisations and people who are investigating similar issues, and particularly the Information Commissioner.

The first meeting of the new Sub-Committee will be on Tuesday 23 April when we return from the short Easter recess. We will then question the Information Commissioner, principally about her investigation into the work of Mainstream Network and connected organisations, to understand more about who is funding that organisation and who is behind the dissemination of the content that it is sharing. That will be an important first step in the Sub-Committee’s work.

I appreciate that hon. Members have questions that they want to ask me—one of my Committee colleagues wished to jump the gun—so I will not use up every second of my 10 minutes. The Sub-Committee is a new step for the Digital, Culture, Media and Sport Committee, which has never created a Sub-Committee before. We have done so because we recognise the concerns about the spread of disinformation and the pivotal role that social media play in that.

Disinformation is a growing issue for democracy and society, and we need to provide robust public policy responses to tackle it at source, as well as through the channels through which it is shared. We also need to look principally at the responsibilities of big technology companies to act more effectively against the dissemination of disinformation, to provide more tools for their users to help them identify untrustworthy sources of information, and to provide greater transparency about who is promoting that content.

Julian Knight

I was not certain whether I was allowed to intervene, but I will ask my question now. I welcome the advent of the Sub-Committee. In terms of the scale, this is not just about Russia or potential foreign actors intervening in our Brexit-related political crisis from a UK base or from overseas; it goes on worldwide. It is not just one foreign actor, but perhaps up to 39 foreign actors. Does my hon. Friend, the Committee Chair, agree that we need the Sub-Committee to be long standing and its scope to be as wide as possible in looking at all those other countries and what they are up to in terms of British politics?

Damian Collins

My hon. Friend makes a very good point. This is a worldwide problem. As he knows, we took evidence during our inquiry about problems to do with disinformation in South America and across Europe—this is not just about Russian campaigns of disinformation. The reason why we decided to create this institutional home for our work on disinformation is that such work runs beyond the scope of any one particular inquiry; indeed, looking to develop successor inquiries with a narrow, defined remit could restrict us from looking at other material from elsewhere around the world.

We look forward to the Government’s White Paper and their response to the Select Committee report, because this country could provide a world-leading framework for understanding the liabilities and obligations of technology companies in terms of acting against known sources of disinformation, and I would include disinformation as a form of harmful content, along with other forms of extreme harmful content.

My hon. Friend is quite right that this is a global problem, and I hope our work in exposing what is going on can benefit other inquiries. As he knows, one reason why we established the international grand committee as part of our disinformation inquiry was to aid our partnership work with other Parliaments that are investigating these issues so that we could benefit from their insights and to share our own work.

Ian C. Lucas (Wrexham) (Lab)

Less than two weeks ago, in the current febrile political environment, I was sent information from a closed Facebook group making the entirely false allegation that I had paid for two coaches to go to the march in London. I was made aware of that only because an individual contacted me and gave me the information. Does the hon. Gentleman agree that it is really important that closed groups on platforms are investigated and that this issue is dealt with urgently by Government? If so, what role does he see the Sub-Committee playing in that process?

Damian Collins

The hon. Gentleman, who is a member of the Select Committee, makes an important point. He will know that we discussed the role of groups with Facebook during our investigation. We believe they play a significant role in spreading disinformation; it is not just through targeted advertising that someone can drive content through a platform such as that. Indeed, as he knows, the Committee’s final report on disinformation touched on how far-right organisations are using closed Facebook groups with hundreds of thousands of members to spread content very quickly through the web. Content posted into the group by a group administrator goes immediately to the top of the news feed of members, who may in turn share it.

These closed groups may be closed to the public, but Facebook can tell what is going on in them, and it should act where closed groups are behaving irresponsibly or maliciously in spreading lies and disinformation about people. It can see who the administrators are and who is doing that.

As a consequence of the Christchurch attacks in particular, we should carry out an audit of the sorts of groups and organisations that were sharing and promoting the vile content involved; an independent regulator with the power to go into the tech companies and see what is going on would facilitate that. Such an audit could provide a really important map of the way in which these far-right groups, in particular, co-ordinate online and spread disinformation.

The hon. Gentleman is quite right that this is not just about global news stories such as the Christchurch attacks; disinformation is also taking place in individual communities. We should be able to report such things to Facebook and know that it will investigate and take action against groups, including by closing them or the administrator down if necessary.

Dr Lisa Cameron (East Kilbride, Strathaven and Lesmahagow) (SNP)

I thank the hon. Gentleman and all members of the Committee for a very important report. I know that the Minister is working extremely hard on these issues.

My question is about making it easier or more streamlined for the police to investigate closed Facebook pages. At this point in time, it seems to be very difficult for the police to access information even when they have suspicions about it. The fact that individuals can post anonymously without giving their own details seems to exacerbate the situation whereby they feel they can post whatever they like without any responsibility.

Damian Collins

The hon. Lady raises a number of very important issues. Co-operation with the authorities is important. We have seen too many cases where different social media companies have been criticised for not readily sharing information with the police as part of an investigation. Often the companies have very narrow terms of reference for when they would do that; sometimes if there is an immediate threat to life or if information might be related to a potential terror attack, they will act. However, we see hideous crimes that affect families in a grievous way and those families want the crimes to be investigated efficiently and speedily, and for the police to get access to any relevant information. I think we would have to say that the current system is not working effectively enough and that more should be done.

There should be more of an obligation on the companies to share proactively with the authorities information that they have observed. They might not have been asked for it yet, but it could be important or relevant to a police investigation. Part of, if you like, the duty of care of the tech companies should be to alert the relevant authorities to a problem when they see it and not wait to be asked as part of a formal investigation. Again, that sort of proactive intervention would be necessary.

I also share a general concern, in that I believe tech companies could do more to observe behaviour on their platforms that could lead to harm. That includes self-harm resulting from a vulnerable person accessing content that might lead them towards a pattern of self-harm. Indeed, one of the particular concerns that emerged from the Molly Russell case was the content she was engaging with on Instagram.

The companies should take a more proactive responsibility to identify people who share content that may lead to the radicalisation of individuals or encourage them to commit harmful acts against other citizens. I think the companies have the power to identify that sort of behaviour online, and there should be more of an obligation on them to share their knowledge of that content when they see it.

Mr Khalid Mahmood (Birmingham, Perry Barr) (Lab)

It is always a pleasure to serve under your stewardship, Mr Gapes.

The Committee has produced an absolutely superb report—such detail—and it is to be welcomed. It raises serious issues in relation to the power of the platform providers, and their failure to use the powers they have to identify people and to act on that information. That is very important. The Government should consider how to tackle the people who put this material on these platforms. We should get the providers to work through these issues with the Government and stop the false information that is being put up.

This issue affects huge numbers of people because, as the Chair of the Select Committee said, a lot of people take such information as gospel, as most of their media input is from social media, so it has a huge effect. I urge the Government to look at this issue seriously and to consider how we can push the social media platform providers to have a better response and remove false media reports that are put online.

Damian Collins

The hon. Gentleman is absolutely right. One of the issues at the heart of this—it comes up again and again throughout our report—is the obligations of the tech companies. A social media platform is not necessarily the publisher of content; it has been posted there by a user of the platform. However, the social media company can observe everything that is going on and it curates the content as well.

When someone goes on social media, if they just saw what their friends had posted most recently, that would be one thing, but because social media algorithms direct users towards particular content, we are concerned not only that harmful content can exist, but that when individuals start to engage with it, they are directed to even more of it. I think that we should not only consider the responsibilities of the tech companies to remove harmful content when it is posted, but question the ethics of algorithms and systems that can direct people towards harmful content.

Angela Crawley (Lanark and Hamilton East) (SNP)

I congratulate the hon. Member for Folkestone and Hythe (Damian Collins) on an excellent, wide-ranging and groundbreaking report, and I congratulate all the members and staff of the Digital, Culture, Media and Sport Committee on it. My hon. Friend the Member for Argyll and Bute (Brendan O’Hara), who demonstrates great knowledge of and enthusiasm for this inquiry, asked me to make a few points.

The inquiry started an ongoing worldwide conversation about the threats posed by shadowy, unaccountable and anti-democratic forces. As I understand it, in February the Digital, Culture, Media and Sport Committee hosted its first ever international grand committee, which included representatives of countries such as Canada, Ireland, Argentina, Belgium, Brazil, Singapore, France and Latvia. The Committee has also formed a new Sub-Committee as part of that international grand committee.

I recognise that it must have been difficult in a fast-moving environment to produce the formal report of an 18-month inquiry in such a timely fashion. I congratulate the Committee on establishing the Sub-Committee. Although the hon. Gentleman may already have answered this question, can he say exactly when the White Paper, which has been delayed repeatedly, will be published? Does he have any information on that White Paper that he could outline today?

Damian Collins

I have some information on that, but given that the Minister is here, I will leave it to her to respond. The official word is “imminently”, which I think means “very imminently”. We look forward to the White Paper; it is an important piece of work that I hope will lay the foundations for turning the work of our inquiry, and other work that the Government have done, into real policy. We could establish in this country a world-leading framework for dealing with these issues.

The Minister for Digital and the Creative Industries (Margot James)

Life in Parliament is full of surprises at the moment. I must confess that I had a complete misunderstanding about today’s hearing; I thought it was in the main Chamber. When I alighted on the Order Paper on my return from a meeting outside the House and saw that this hearing was absent from it, I thought that it must have been moved—along with so many other things in Parliament at the moment. That explains why I have no official documentation whatsoever.

However, as my hon. Friend the Member for Folkestone and Hythe (Damian Collins) knows, this is my top priority across what is a very broad brief. I will therefore respond based on my own understanding, the excellent remarks that have been made by hon. Members, and of course the report of my hon. Friend’s Select Committee, which I read from cover to cover. I commend his work as Chairman, and all hon. Members who serve on that Committee, which exemplifies the power and potential that a Select Committee can bring to policy making. I am delighted to hear of the new development that my hon. Friend has announced: the Sub-Committee that he has set up specifically to tackle disinformation sounds like an excellent initiative.

I was delighted to hear that at the first meeting of that Sub-Committee, Members will be able to question and hear from the Information Commissioner, whose office is the leading data protection agency across Europe. That is partly because of the reputation of Elizabeth Denham, the commissioner; partly because of the huge additional resources that we have given the Information Commissioner’s Office; and partly because the office is leading on an investigation into the misuse of data, primarily by Facebook but by other platforms as well.

Mike Gapes (in the Chair)

Order. Can I direct the Minister to ask some questions?

Margot James

Yes, please do. I need some direction.

Mike Gapes (in the Chair)

This is not the normal procedure.

Margot James

I see. I am so sorry. You have been very forbearing with me as I completely misinterpreted my role.

Margot James

I thought I was making closing remarks. Should I be asking questions?

Margot James

I will convert some of the comments I was going to make into questions, then.

My hon. Friend the Member for Folkestone and Hythe indicated that he might want to know when the White Paper is coming out. We intend to publish it early next week—Monday, in fact. That White Paper is very broad, and I think it is an excellent piece of work. It has been informed by the work of my hon. Friend’s Committee, as well as by many other Members and external bodies, and also by the hard work of our officials in the Department for Digital, Culture, Media and Sport.

The White Paper will raise a number of questions, and I will take the opportunity to ask my hon. Friend about closed groups, encrypted content, and anonymity. From my knowledge of the White Paper, I think those are the three biggest challenges when it comes to delivering on the objectives that my hon. Friend has set out for internet companies. There are various experts working in those areas of encryption and private groups, and I would welcome my hon. Friend’s comments.

Margot James

Is that all right, Mr Gapes?

Mike Gapes (in the Chair)

That is fine, but we have limited time, because we have another statement and then a normal debate after that. Thank you very much. Damian Collins, did you wish to respond?

Damian Collins

I will respond briefly. To add to the Minister’s comments, we have all benefited enormously from the work of Elizabeth Denham and the ICO. It has demonstrated that it is one of the world-leading organisations in its field, and the fact that it has invested so much of its time into this area has helped enormously. This was an extremely long inquiry, and I place on record my thanks to all the Committee Clerks, particularly Chloe Challender the Committee Clerk and Josephine Willows the Committee specialist. They worked tirelessly, well above and beyond the call of duty, to support the Committee in its investigations.

The Minister has touched on some important issues. We discussed closed groups earlier, which are an important mechanism for allowing content to be shared virally and at great speed, particularly on Facebook. That sharing can be done not just through advertising, but through those closed groups. We know that social media platforms can observe what is going on in closed groups, and part of their responsibility should be to monitor that activity, particularly if those groups are being used to spread harmful content.

Encrypted media is also an important issue, and I have some concerns about the vision that Mark Zuckerberg has set out for Facebook, effectively bringing Facebook, Instagram and WhatsApp together. If that means all content being shared through encrypted channels, it would give the platforms an excuse to say that, because they cannot see what is being shared, they have no responsibility for it. I do not think that is acceptable, especially when those platforms will be using data gathered about their users to help facilitate contact via encrypted channels, and will still have a good understanding of what is going on. That is why the idea of a regulatory system is such an important step forward. As we have seen from the way Ofcom works with broadcasters, we need a regulator that has statutory powers—the power to go in and investigate, with the backing of Parliament—and the flexibility to look at new challenges as they arise and establish new standards for what is a responsible, ethical and acceptable way of working.

Elsewhere in the world, encrypted channels are increasingly becoming the principal mechanism for sharing information in election campaigns, particularly WhatsApp in India and Brazil. In any country that has a smartphone-connected electorate—as so many countries now do—sharing of political information through encrypted media will be an increasingly big problem. In our report, we tried to address many of the issues that exist today, and there are things that we can get on and deal with now. However, we may look back in five years’ time and say that, even having done all those things, the challenge of responding to disinformation being spread through encrypted media is one we still have to crack. We cannot leave that challenge to the tech companies on their own; we cannot leave it to them to solve that problem for us. We need to establish a legal framework that makes clear what duty of care and responsibility tech companies have to ensure that their technology is not abused by people who seek to do others harm.