Online Filter Bubbles: Misinformation and Disinformation Debate

Department: Department for Science, Innovation & Technology


Damian Collins Excerpts
Tuesday 16th January 2024


Westminster Hall


This information is provided by Parallel Parliament and does not comprise part of the official record

Damian Collins (Folkestone and Hythe) (Con)

I congratulate my hon. Friend the Member for Weston-super-Mare (John Penrose) on securing the debate. It could not be more important or timely; as he alluded to in his speech, half the world is voting this year. We have already seen some of those elections take place in Taiwan and Bangladesh, and in America the Republican party held the Iowa caucus last night. It is a stark demonstration of the impact of disinformation on ordinary people going about their daily lives: it has been heavily reported in America that 60% of Republicans voting in the caucus believe that the last presidential election was stolen and illegitimate, and that Donald Trump was the rightfully elected President.

The challenge of disinformation is not just from foreign state interference. When we first started talking about the issue some five or six years ago, we were looking principally at the influence networks of Russia and Iran and their ability to reshape the way in which people saw the world and the institutions in their own countries: to sow fear and discord, and to make people distrust the institutions of their own society, the legitimacy of their courts, the freedom of their elections and the truth of their media. However, it is happening in our society as well. The pandemic was a demonstration of the potency of conspiracy networks such as QAnon in persuading people that the vaccine was not safe, and we see the same networks today persuading people that our public institutions and elections are not safe. It is being done to undermine the fabric of democracy. There is a lot more to being a democracy than holding elections; having faith in our institutions, trusting our media and trusting the news and information that we get are all essential to the citizen’s job of casting their vote every four or five years to determine who should run their country. If that is attacked and undermined, it is an attack on our entire democratic way of life. This year, we will see that challenge in a way that we have not seen before, with a level of technical sophistication that we have not seen before, and we should be concerned about it.

I will respond briefly to the remarks by the hon. Member for Glasgow South (Stewart Malcolm McDonald) in his speech. I was briefly the Minister responsible for the Counter Disinformation Unit, and I thought that I had better meet it to see what it had to say, because it is not a particularly public-facing organisation. The Government quite rightly have different strategies for dealing with disinformation: some are led by policing and security; some are led by looking at bad actors internally; and some are led by the Foreign Office and Ministry of Defence looking at bad actors externally. Different situations should trigger different responses: some that respond with news and information that challenge conspiracy theories and networks, and some that identify networks of disinformation controlled and operated by foreign states, against which we want companies and platforms to take action. That was included in the National Security Act 2023 last year, and the Online Safety Act places a further obligation on companies to act in response to intelligence reports that they receive. If they do not take action against those known networks of disinformation controlled and run by hostile foreign states, action can be taken against the companies as well.

That is why the Online Safety Act was so important; it creates, for the first time, the principle of liability of platforms for the information that they distribute and promote to other users. Central to the debate on the Bill that became the Online Safety Act was finally answering the false question that was posed all the time: are platforms, such as Facebook, actually platforms or publishers? They do not write the content, but they do distribute it. People have first amendment rights in America to speak freely, and we have freedom of speech rights in this country—that is not the same as the right actively to be promoted to millions of people on a social media platform. They are different things. The companies promote content to users to hold their attention, drive engagement and increase advertising revenue. It is a business decision for which they should be held to account, and the Online Safety Act now gives a regulator the power to hold companies to account for how they do that.

I listened carefully to what my hon. Friend the Member for Weston-super-Mare said about whether we could borrow from the broadcasting code to try to create standards. Can we break filter bubbles by giving people access to different sorts of information? I think this is a difficult area, and there are subtle differences between a broadcaster and a social media platform. It is true that they both reach big audiences. It is also true that social media platforms exercise editorial decisions, just as a broadcaster does. However, the reason why it was so important for broadcasting and broadcasting licences to ensure fair standards of balance and probity was that there were not that many broadcasters when the licences were introduced, although the list has now grown. People tuning in do not necessarily know what they will get, because the content is selected and programmed by the programme maker and the channel.

I would say that social media have become not broadcast media, but the ultimate narrowcast media, because the content to which people are being exposed is designed for them. An individual’s principal experience of being on social media is not of searching for things, but of having things played and promoted to them, so the responsibility should lie with companies for the decisions they make about what to promote. There is nothing wrong with people having preferences—people have preferences when they buy a newspaper. I am sure that when the hon. Member for Strangford (Jim Shannon) watches services by Rev. Ian Paisley on YouTube, he does not want to get a prompt saying, “You’ve had enough this week. We’re going to give you some content from the Sinn Féin party conference.” We do not want that kind of interference going on. People have perfectly legitimate viewing habits that reflect their own preferences. The question is, do platforms push and actively promote conspiracy theories and fake news? I think they do, and there is evidence that they have done so.

I will mention one of the clearest examples of that in the brief time I have left. In the 2020 US presidential election, the platforms agreed, under pressure, to give far greater prominence to trusted news sources in their newsfeeds, so that people were far more likely to see content from a variety of different broadcasters. It was not necessarily all from CNN or Fox News—there could be a variety—but it was from known and legitimate news sources as a first preference. The platforms downgraded what they call civic groups, which are the friends and family groups that are often the breeding ground for conspiracy theories. One reason why they often spread so quickly is that people push them on their friends, who look at such content because it has come from someone they know and trust. However, when the platforms changed the ranking and promotion factor, it had a big impact: it dampened down disinformation and promoted trusted news sources, but it also reduced engagement with the platform. After the election, Facebook reversed the change and the conspiracy theorists were allowed to run riot again, which was a contributing factor in the insurrection we saw in Washington in January 2021.

Companies have the ability to make sure that fair and trusted news gets a better crack, which is absolutely essential in this digital age. They should be very wary about allowing AI to use content and articles from legitimate news organisations as training data to create what would effectively become generic copies to sell advertising against, steering people away from journalism that people have to pay for and towards free content that looks very similar but is far less likely to be trustworthy. We need to get the news bargaining code right so that proper news organisations do not see their content being distributed for free, and ads sold against it by other people, without getting fair remuneration. These are things we can do to protect our news ecosystem, and the Online Safety Act is essential for making sure that Ofcom holds companies to account for actively promoting known sources of conspiracy theories and disinformation. It is important to tackle the big threat to democracy, just as it is important to combat fraud and protect citizens from financial harm.

Several hon. Members rose—