Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.
Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.
This information is provided by Parallel Parliament and does not comprise part of the official record
It is a pleasure to serve under your chairmanship, Mr Gapes. Earlier this week, the Select Committee published its 10th report on the creation of the Sub-Committee on Disinformation, which followed our reports on disinformation and fake news—the final report was published in February this year, with the interim report published in July last year. Our reports on that important subject were based on an inquiry that spanned about 18 months and that took in thousands of questions and many witnesses.
The focus on disinformation and fake news comes from our belief that there is a genuine danger to democracy and society in the deliberate and malicious targeting of disinformation at citizens, largely using social media to influence what they see and their opinions about politics, society and institutions. In the discussion about disinformation, much of the focus has been on it being used in election campaigns or around political events, but it is by no means limited to that. Disinformation is becoming a serious issue in the health sphere, in particular, with anti-vaccine information and stories being disseminated through social media.
The problem of disinformation is not limited to the period of our inquiry. When we established our initial inquiry, we were particularly concerned about the role of disinformation in the United States presidential election and other elections around the world, and about the role of foreign states and, in particular, agencies such as the Internet Research Agency in St Petersburg that deliberately create campaigns and mechanisms to spread disinformation through social media and target people relentlessly.
That has become a bigger societal problem as people increasingly get their news and information through social media. In this country, about half the population receives news principally through social media. That means that, rather than going to a curated news space, such as a newspaper, a broadcaster’s piece of news or a news organisation’s website, they are receiving news and information that has been shared by their friends on social media in bitesize chunks, or they are being targeted with information by advertisers and other organisations that promote content.
We know that, during the US presidential election, the number of shares of the top 20 fake news stories was greater than that of the top 20 real news stories. The issue is fundamental to the way people receive news and information because, on the channel where they increasingly receive it, they often do not know why they are receiving it or much about the organisation that is sending it. Disinformation is often dressed up to look like real news, but it could be hyper-partisan content from people with a high degree of bias or, more seriously, content that is totally fabricated. That has been an issue for some time, but it is of growing importance because of the scale and reach of social media.
When we look at the potential application of technology, the problem is only set to get worse, given the phenomenon of deep fake content. That is when someone takes a recording of your voice—I am sure they would not do it in your case, Mr Gapes—and creates a fake video image of you, then writes their own words and has them played out through your mouth in the film. We are all familiar with those grainy films that emerge during political campaigns whose production quality is not great because they are often shot on someone’s smartphone. Imagine the capability to do that easily in a totally fake way and to release a film of a politician supposedly saying something malicious or misleading during the final days of an election campaign. That capability exists, and we need the tools in place to fight back against it.
Since we published the Committee’s report in February, we have seen other events that lead us to believe that this is an ongoing and growing problem. We were all shocked and appalled at the way in which harmful footage from the terrorist attack in Christchurch, New Zealand, was livestreamed on Facebook and shared continuously on social media platforms around the world, and particularly YouTube, for a number of days afterwards.
We are also concerned about the role of organisations that spread news and information about political events in this country—this is particularly linked to Brexit—but that we do not know much about. The Committee’s inquiry identified an organisation called Mainstream Network, which was contacting people through social media with adverts and asking them to lobby their MP to vote in favour of a hard Brexit and to “Chuck Chequers”—to use the expression at the time—and not support the Prime Minister’s negotiating strategy.
People have a right to a political opinion, and there is nothing wrong with that, but when they are being targeted by an organisation and they do not know who is doing that, who is providing the money or who is supporting that organisation, that becomes a problem. In our campaigns as politicians, we have to put legal imprints on our leaflets, posters and flyers to make it clear who they are from, but people do not have to do that online, and those loopholes are being exploited. We have also seen campaigns and organisations other than just Mainstream Network, such as We are the 52% and Britain’s Future, where large amounts of money are being spent to target people with messaging, but we do not know who is doing that. That is going on all the time and on a growing scale.
The purpose of the Sub-Committee is to provide an institutional home for the Select Committee to build on the work of its initial inquiry, to look at new incidents of disinformation campaigns, where they lack transparency and where they are deliberately misleading, and to recognise that this is a form of harmful content that needs to be addressed. We look forward to the publication of the Government’s White Paper on online harms, which I believe will happen early next week, so that we can see what ideas they propose and understand more about their response to the Select Committee report, which covered many of those issues. The Sub-Committee will look at the issues arising from the White Paper and at the areas where the Government are looking for a response and consultation.