Online Filter Bubbles: Misinformation and Disinformation

Tuesday 16th January 2024

Westminster Hall

[Sir Mark Hendrick in the Chair]
10:29
John Penrose (Weston-super-Mare) (Con)

I beg to move,

That this House has considered the matter of preventing misinformation and disinformation in online filter bubbles.

It is good to see you in the Chair and in charge of our proceedings, Sir Mark. It is also good to see the Minister in his place. He confessed gently to me beforehand that this is the first Westminster Hall debate in which he has had the honour of being a Minister. I am sure that he will mark his debut with elan.

This important issue has not had enough exposure and discussion so far in Parliament, so I am pleased to see so many colleagues present. I suspect that they have all had more examples than they can possibly count of not just fake news but fake science, fake medicine or online memes of one kind or another landing in their in-trays from constituents. This is not just about constituents writing to Members of Parliament; it is a much broader issue that affects the whole tenor and fabric of our democracy and public debate. It is particularly important because public debate, in these online days, happens in a far wider and more varied collection of forums than it did before the internet was so commonly and widely used. It is something that needs to be addressed.

Mr Gregory Campbell (East Londonderry) (DUP)

I congratulate the hon. Member on securing this timely debate. Does he agree that the prevalence of fake news is all-consuming? There was a quote put to me, as a Member of Parliament, on social media a few years ago. It was along the lines of, “In the words of Abraham Lincoln, don’t believe all you read on the internet.”

John Penrose

The hon. Gentleman is absolutely right. The saying used to be, “Don’t believe everything you read in the newspapers,” but it applies equally strongly in the modern digital world to everything that we read on the internet.

Fake news, or misinformation and disinformation, really matters, particularly this year. A large proportion of the democratic world will have general elections in the next 12 months. The scope for interference in those elections by malign actors, whether they are foreign states, organised criminals or people of a particular religious or political persuasion, is considerable. They try to sway public debate through fair means or foul, and if we are talking about misinformation and disinformation, that means foul means. The potential in the next 12 months for bad things to happen is very high, and that is just when it comes to democracy. That does not cover the other examples of fake news or fake science that I mentioned, such as fake medicine. Believing quack cures can lead to more deaths, more people getting the wrong kinds of medical treatments and so on. There are many different opportunities.

There is also a serious issue around radicalisation. Somebody who is fed a diet of alt-left or alt-right political views, or extremist religious views—it does not really matter what it is—can easily disappear down a rabbit hole, into an echo chamber where only one particular strand of opinion is put in front of them again and again. That way leads to radicalisation on a whole range of different topics, and it undermines our society, our democracy and, in many cases, science. It means that societies as a whole become much more brittle and more divided, and it is much harder for democracy to flourish.

What are misinformation and disinformation, without getting sucked into technocratic definitions? It is rather like trying to define pornography. As the famous phrase goes, “You may not be able to define it, but like a hippopotamus, you recognise it when you see it.” [Interruption.] I will ignore the heckling on my right; it will not help. There are two underlying facets to misinformation and disinformation. One is that if someone is simply making stuff up, telling lies and saying things that are factually inaccurate and false, that can easily become misinformation and disinformation. The second is when things are factually accurate and correct but really one-sided and biased. That matters too; it is extremely important, and we have long had rules for our broadcasters, to which I will return in a minute, that are designed to prevent it.

The good news is that the Online Safety Act 2023 took a few early steps to do something about factual inaccuracy, at least. It does not do a great deal—it should do more—but it takes some early steps, and it would be churlish to pretend that there is nothing there at all. I tabled a couple of early amendments to get us to think about factual inaccuracy and to work out where it came from—provenance, in the jargon—so that we could tell whether something comes from a trusted source. We ended up with some useful points, particularly duties on Ofcom relating to media literacy and making sure that people know what questions to ask when they see something on the internet and do not, as we were just hearing, necessarily believe everything they read online but ask questions about where it came from, who produced it and whether it has been altered. Ofcom has that duty now; it has not yet grown teeth and claws or started to bite but at least, in principle, that power is there and is very welcome.

There is also the advisory committee enshrined in the Act, which ought to make a difference, although precisely how will depend on how actively it flexes the muscles it has been given. Separately from the Online Safety Act, there are the national security laws about foreign interference too. There is some protection, therefore, but it is not nearly enough. The Minister’s predecessors, in what used to be the Department for Digital, Culture, Media and Sport before it was reorganised, will say that in the early days of the Online Safety Act’s gestation, it was intended to cover misinformation and disinformation, but that was hived off and fell away at an early stage. That is an important omission, and we need to come back to it now.

I want to make a modest proposal. The Online Safety Act will start to make modest progress towards media literacy and people understanding and asking questions about factual accuracy and where something comes from when they see it on the web. It will go some way to addressing the first of the two sources of misinformation and disinformation—people telling lies, making stuff up, deepfakes of one kind or another. The sad fact is that the chances of deepfakes getting better with the advent of artificial intelligence are very high indeed, so that, even if we think we can spot them now, we are probably kidding ourselves, and in a year or two’s time it will be doubly, trebly or quadruply difficult to work out what is real and what is completely made up.

If we accept that at least something is in place in this country to deal with factual inaccuracy, we are still stuck with absolutely nothing, as yet, to deal with the one-sided and deeply biased presentation of factually correct narratives. I therefore want to draw a comparison, as I mentioned earlier, with what we already do and have been doing very successfully for decades in the broadcasting world, where Ofcom, through the broadcasting code, has been in charge of the duty of balance and undue prominence. That duty has very successfully said for decades that the analogue broadcasting world has to make sure that, when it presents something that is supposedly factual in a broadcast news programme, it must be balanced and must not give undue prominence to one side of the argument. That works really rather well, and has been a core part of ensuring that our public debates in this country are not sidetracked by fake news.

I suspect that every one of us here will, at various different times, have gnashed our teeth and shouted at the telly because we felt that the BBC, ITV or Sky News was presenting something in a slightly partisan way; depending on which side of the House we are on, we may have thought that the partisanship was on one side of the argument rather than the other. However, the fact remains that we all know the way they are supposed to do it and that there is some kind of redress, and there is generally an acceptance that it is the right thing to do. The duty matters not just because politicians think it is important, but because it has—very successfully, I would argue—made sure that there is a tendency towards reasoned, evidence-based consensus in British public debate, online and in broadcast news, over more than half a century.

The title of this debate is not just, “Misinformation and Disinformation”; it is about those two things in online filter bubbles. Online filter bubbles bear some quite important similarities to what broadcast news editorial room decision making has long been doing. The reason is that when we go online, we all have our own personal online filter bubble. Whether we use Google, Facebook, TikTok, all of the above, or whatever it might be, those platforms have an algorithm that says, “John Penrose likes looking at stuff to do with fishing tackle—we’re going to send him more stuff about fishing tackle.” I am not sure what the equivalent would be for the Minister; I am sure he will tell us in due course, unless he is too shy.

The algorithm works out what we have personally chosen to look at and gives us more of the same. That can also lead to radicalisation. If I start looking at things to do with Islamic jihad, it will say, “Oh! He’s interested in Islamic jihad”, and send me more and more things about Islamic jihad—or the alt-left, the alt-right, or whatever it might be. The algorithm’s decision to send people more of what they have already chosen—or things they have not chosen, but which it thinks they will like—is effectively a digital editorial decision that is, in principle, very similar to the editorial decisions going on in the Sky, ITV or BBC newsrooms, either for radio or for broadcast TV.

We need to come up with a modern, digital version of the long-established and, as I said, very successful principle of the duty of balance and undue prominence and apply it to the modern, digital world. Then, if I started looking at Islamic jihad, and I got sent more and more things about Islamic jihad, as I saw more and more things about Islamic jihad, the algorithm that was creating my personal filter bubble would start sending me things saying, “You do know that there is an alternative here? You do know that there is another side of this argument? You do know that the world is not just this, and this particular echo chamber—this rabbit hole of radicalisation that you are enthusiastically burrowing your way down—may actually be exactly that? You need to understand that there is more to it.” That is something that happens to all of us all the time in the old, analogue world, but does not happen in the digital world. I would argue that it is one of the reasons that many of us here, across the political spectrum, are so worried about the divisive nature of the online world and the rising levels of disrespect and potential incitement of violence there.

I plan to do something rather unusual for politicians and stop talking very soon, because I hope that this has served as a proposal for colleagues to consider. It is something that would need cross-party consensus behind it in order to be taken forward, and there may be better ways of doing it, but I am absolutely certain that we do not have anything in our legal arsenal in this area at the moment. I would argue that we need to act quite promptly. As I have said, the democratic stakes in the next 12 months are very high, but the stakes have been very high in other areas, such as medical disinformation, for a very long time; after all, we have just come through a pandemic. The scope for damage—to our society, to our health and to our entire way of life—is very high.

Therefore, I hope that colleagues will consider what I have said, and if they have a better answer I am all ears and I would be absolutely delighted to hear it. This is a very early stage in the political debate about this issue, but we cannot carry on not having a political debate about it; we cannot carry on not addressing this central issue. So, I am glad that everybody is here today and I hope that we will all go forth and tell our political friends and neighbours that this issue is important and that they need to address it as well. And as I say, if people have better solutions than the one that I have just politely proffered, then my pen is poised and I look forward to taking notes.

14:45
Stewart Malcolm McDonald (Glasgow South) (SNP)

It is most unusual for me to be called so early in a Westminster Hall debate, Sir Mark, so I am grateful to you.

I congratulate the hon. Member for Weston-super-Mare (John Penrose) on securing this debate. There is no question that preventing misinformation and disinformation is one of the great challenges of our time, and it will only become more and more challenging, as he has adumbrated in his remarks to the House this afternoon.

Unfortunately, we have many active theatres of conflict around the world at the moment, so I will begin by thanking all of those who take to social media to counter so much of the disinformation that exists. Whether it is about the war in Ukraine or about the situation in the Red Sea, Gaza and Israel, so much disinformation is doing the rounds. Some of it is clearly state-sponsored; some of it less so.

Indeed, there is also misinformation or disinformation about elections, so no doubt we will see more of that as the elections in this country and elsewhere in the west draw closer. Last week there were elections in Taiwan, where the Taiwanese political parties said it was the harshest election yet in terms of Chinese-sponsored disinformation against a democratic people. However, a great many people invest time, effort, energy, money and resources online to counter such disinformation, and they do a public service.

I will mention the negative part first, if I may; there is no point in my going over all the various examples of disinformation that exist. I recall being at a conference a few years ago with the hon. Member for Folkestone and Hythe (Damian Collins) where one of the complaints that we had—it is so often a complaint—was that when there are conferences, workshops and think-tank events about disinformation, everybody wants to talk about examples of disinformation but few people want to talk about how we arm ourselves against it.

So, as I say, let me start with the negative part first. I do not mean any of what I say today to be against the Minister—the Under-Secretary of State for Science, Innovation and Technology, the hon. Member for Meriden (Saqib Bhatti)—who, I will confess, I do not think I have faced on this issue before. Nevertheless, the Government do not have a coherent strategy on this issue. There are a great many officials across Government and across Whitehall who are doing some sterling work on it; no question about that. At a political level, however, this issue has not been given the serious consideration that it deserves; although it may be uncharitable of me to say so, that was evidenced most of all by the fact that Nadine Dorries was put in charge of it. [Laughter.] Having said that, I will come on to a central problem that is less about personalities and more about the policy framework and the institutions that are required.

As I understand it, and the Minister may correct me in his remarks, misinformation is the responsibility of the Department for Culture, Media and Sport; some disinformation is also that Department’s responsibility. Foreign disinformation sits with a mixture of the Foreign Office, the intelligence services and the Home Office. Other parts of disinformation are the responsibility of the Ministry of Defence and defence intelligence. I spent five and a half years as my party’s spokesperson for defence, and the type of question that I wanted to ask depended on whether or not the Ministry of Defence could answer it. Who does this madness—a madness of responsibility and lines of accountability lying all over Whitehall—benefit? Certainly not our constituents.

Alex Davies-Jones (Pontypridd) (Lab)

The hon. Member is making a very important point. I have tried repeatedly to find answers from the Government’s Counter Disinformation Unit. That specialist unit, set up in Whitehall to counteract some of this disinformation, is meant to be cross-departmental, but sadly it has been quite dormant. We have had very little information and transparency. Does the hon. Member agree that, if we had more transparency, we could see what Departments were working on across Government and seek to tackle the problem?

Stewart Malcolm McDonald

Indeed. The hon. Lady is entirely correct. The fact that so much of this has spread like a great blob—some might say—around Whitehall benefits only our adversaries and those who wish to pursue disinformation in this country. That is before we get to the growing problem of the things the hon. Member for Weston-super-Mare mentioned—deepfakes and AI-generated disinformation—all of which are going to get worse and worse. As long as responsibility, lines of accountability and policy formation are a bit all over the place, when in my mind the obvious place for them to lie would be with the Cabinet Office, that will be of benefit only to those who want to sow disinformation.

In June 2021, in the spirit of trying to be a helpful Scottish nationalist, which might be an oxymoron to some people, I published a report that made nine recommendations on how, in fairness to the UK Government and Scottish Government, they can better counter disinformation in public life. I want to go through a couple of those. First, we need a proper national strategy that is a whole-society national strategy, imitating the excellent work done in countries such as Finland and Latvia, where countering disinformation and hybrid threats is not the job of the Department of Defence or even the Government but involves public institutions, other public bodies, the private sector, non-governmental organisations, civil society and private citizens. There is much that can be done. Surely we saw that in the generosity people showed during the pandemic. There is so much good will out there among the population to counter hybrid threats when they arise.

Although we have the Counter Disinformation Unit, I would suggest a commissioner, perhaps similar to the Information Commissioner, with statutory powers to implement the national strategy and counter disinformation. There is a job for our friends in the media, too. The media need to open up to explain to the public how stories are made. There is a job to be done in newspapers and broadcast media. It would be to the benefit of mainstream media—that phrase is often used in a derisory way, although I like my media to be mainstream—as the more the media explain to the public how they make news, the better that would be for those of us who consume it.

There should also be an audit of the ecosystem. One thing I suggested in the report is an annual update to Parliament of a threat assessment of hostile foreign disinformation to this country. The better we understand the information ecosystem, the better we can equip ourselves to counter hostile foreign disinformation. I also suggest literacy programmes across all public institutions, especially for public servants, whether elected or unelected. My goodness, some of them could do with that in this House.

I also suggest we look to host an annual clean information summit. There is so much good work that goes on, especially in Taiwan, and right on our own doorstep in Europe. So much good work goes on that we could learn from, and hopefully implement here. If we do not have a whole-society approach, involving public bodies, faith groups, trade unions, private enterprise and even political parties, fundamentally any strategy will fail.

I will end on this: political parties need to get their acts together, and not just on some of the stuff that gets put out. I am not going into things that individual parties have put out. But at either this election or the next—I would argue that the upcoming election is already at risk of a hostile foreign disinformation attack—what will happen when that disinformation gets more sophisticated, better funded and better resourced than anything we have to see it off? I come back to the conference I attended with the hon. Member for Folkestone and Hythe, where we took part in a war game: it was a presidential election, and our candidate was subject to a hostile foreign disinformation attack to spread smears and lies about them. We need to get used to this now. Political parties need to set their arms to one side and work together so that we can preserve that thing we call democracy. I think it is worth fighting for. I look forward to the other suggestions we will hear in the rest of the debate.

Sir Mark Hendrick (in the Chair)

I note the number of people present, and ask Members to keep their contributions to around seven minutes so that we can get everybody in.

14:55
Andrew Percy (Brigg and Goole) (Con)

It is a pleasure to serve under your chairmanship today, Sir Mark. I congratulate my hon. Friend the Member for Weston-super-Mare (John Penrose) on securing the debate. I agreed with a lot of what was said by the hon. Member for Glasgow South (Stewart Malcolm McDonald)—did he say a “helpful Scottish nationalist”? I am not sure whether or not that was disinformation, but we will not debate that today.

I am extremely concerned about misinformation on a whole range of subjects. We saw it during the pandemic, with the vaccine debate; all Members will have received communications from constituents that were completely and utterly false, where people had been wound up online by fake doctors and people who were not vaccine specialists and were then presenting that information to us as fact. We see it in the immigration debate, where people are subjected to what is often racist commentary online, which then directs them towards other accounts—a lot of them anti-Muslim—which reinforce what they have heard. These people then appear in our inboxes, quoting that bile.

As others have mentioned, we also see it in election campaigns. I think all political parties can sometimes be a little guilty of promoting elements of disinformation. In the 2017 election in particular, I remember being on the receiving end of abuse and torrents of stuff that was put out about votes here, which, when I looked into the detail, just was not true. It was not as presented. I am afraid that all political parties sometimes cannot resist the urge to perhaps slightly misrepresent what has gone on in this place.

It will perhaps come as little surprise that I want to talk today about antisemitism—the anti-Jewish racism that remains prevalent and pernicious throughout our online platforms. Perhaps I am a hypocrite for talking about this, because I am not actually on any of these social media networks. I left them all, and it was the best thing I ever did for my mental health. I realised the power of filter bubbles, though, when I once looked at an account about a trainspotter, and ended up getting presented with lots of other trainspotting information. I thought, “Why is this all happening?”—it is because if someone looks at something once, they are driven down that path. Now, I could have become a radicalised trainspotter, but I was able to cut myself off just at the right point. I joke, but in other debates and on other accounts, this is incredibly frightening and dangerous.

We see it with antisemitism, with conspiracy theories reaching back hundreds of years, which, like artificial intelligence, mutate and evolve. It will be of little surprise to Members to hear that it can be found in relation to misinformation and disinformation too. We have particularly seen that since the start of the conflict in the middle east. Following the terror attacks on 7 October, there has been a significant proliferation of disinformation and misinformation. Shortly after the attacks, conspiracy theories emerged that were rooted in the anti-Jewish ideologies of those who wished to deny the atrocities that took place—denying that innocent civilians were attacked, that children were murdered and that women were subjected to gender-based violence.

I have seen some of that hate in the past 24 hours, following an outrageous smear somebody put out about me on social media that has resulted in a trickle of abuse coming at me, some of which is questionably antisemitic. Those emails have included a denial that Hamas was responsible for the deaths on 7 October, while someone else questioned, in relation to the Houthis, whether interrupting shipping lanes is really a heinous act. Worst of all, someone emailed me and described the hostages as “them Zionist rat hostages”. People have not come up with those comments and views themselves, but they have seen them online. They have been pointed in a particular direction through a series of misinformation and disinformation. It has had no effect on me, of course. I will continue to speak out and call out whatever I wish wherever I think it appropriate to whoever. It will not have any impact on me, but it has proven to me once again what a cesspit of hate and antisemitism social media can be. I will give a couple of examples to emphasise that.

One major conspiracy since 7 October is that the attacks on that day were a false flag operation by Israel—we have all probably had emails stating that. In one particular viral claim, social media users argued that the attack at the Nova music festival, in which 364 people were murdered and many abducted, was not carried out by Hamas but by Israeli forces, despite the fact that there was video evidence taken by the people there. Some try to be clever and deny one single aspect of the atrocity in order to skirt some of the social media rules.

In another example, it was claimed that the Israeli Government knew of the attack, but did not deploy the army in the hope that the crisis would help restore its popularity. The Institute for Strategic Dialogue found that many of the conspiracies contain common antisemitic tropes. For example, sites affiliated with QAnon spread a conspiracy that the war was part of a plan to start a third world war, with a hidden ambition to start a new religion and cause chaos, which is of course a trope straight out of the Protocols of the Elders of Zion. We have seen that throughout.

AI has also played a major role in disseminating disinformation. I will use a few examples to demonstrate that. A Facebook post shows Israeli civilians cheering Israeli Defence Forces soldiers in an image that was heavily altered by AI. Of course, the people who shared it do not know that. There was a deepfake video of President Biden calling for a military draft in response to the war with Hamas. It appeared on TikTok and Facebook, where it managed to fool users into thinking it was real. I note that other people have talked about the difficulties of deepfakes.

Deepfake images of abandoned and injured Palestinian babies in the ruins of Gaza have been viewed and shared millions of times. Because AI-generated content has become widespread, people now doubt genuine content. When authentic images of the luxurious homes of some Hamas leaders were shared, it was immediately pooh-poohed as an AI deepfake. Because of the algorithms that personalise the content, as other Members have said, users are drawn into filter bubbles on social media and continuously exposed to a specific narrative, with little or no exposure to counter-information.

The proposal of my hon. Friend the Member for Weston-super-Mare was spot on. I am conscious of your guidance on time, Sir Mark, so I will end there. I will just say that there is more we can do. The ideas that my hon. Friend has outlined are important, as are the things about digital media literacy and all the rest of it that the Government can invest in.

15:03
Justin Madders (Ellesmere Port and Neston) (Lab)

It is a pleasure to see you in the Chair this afternoon, Sir Mark. I am very sorry to hear about the abuse that the hon. Member for Brigg and Goole (Andrew Percy) has received. It is something that many of us unfortunately have experienced from time to time.

However, I want to start on a positive note and congratulate the hon. Member for Weston-super-Mare (John Penrose) on securing the debate and coming up with a sensible suggestion, which has to be the basis of further consideration. He was right. The matter is very dangerous for democracy and is an urgent issue that needs to be addressed. He used the term “radicalised”, which is a good way of looking at how that affects people.

People can hold views that they might not have countenanced a few months ago. The effect of filter bubbles can be that, in effect, they become a totally different person and they are completely unaware of the process that they have been through. There should be greater openness with individuals about the type of content that is being pushed on to their timeline. If an individual user could see, for example—or better still, be directed to—the tags that they have amassed, that knowledge would be of great assistance. It would hopefully prevent people from passively entering the echo chamber that we have heard so much about, and, crucially, would alert people to the possibility that the process is happening at all.

Anyone who has talked to someone whose worldview has been altered by what they have seen online will know that they will not be persuaded that their view might not be accurate. They can make no distinction between information they have picked up online and information from traditional media sources, and they truly believe what they are being told. I have seen the terror in a person’s eyes as they recounted to me a particular conspiracy theory. That fear was absolutely real, and there was no dissuading them.

How widespread is the problem? Some academic research says that about 2% of the population are in an alt-left echo chamber, and 5% in an alt-right one. Another survey found that about 10% of those in the UK claim to see only social media that agrees with their particular views. That seems quite low, but it is actually still millions of people. I believe it is far easier to fall into these traps than to get out of them, so the potential for the number to grow is there. Even on these relatively low numbers, their potential to influence society is corrosive.

Groups of people operating in a sphere with a different factual basis from wider society have implications for how our democracy works, not only in terms of their own participation but in how they can shape debate outside their echo chamber. Voices airing conspiracy theories about 15-minute cities have, as we learned last week, impacted Government policy. It was reported in The Guardian that Ministers began considering curbs on cycling and walking schemes last year in response to concerns about 15-minute cities. Conspiracy theorists believe that 15-minute cities are designed to be part of the “great reset”, under which people will be forcibly locked down within their own local neighbourhood and not allowed to travel outside of it. After gaining traction online among right-wing fringe groups, mainly in echo chambers, the theory found its way into Government policy. One of the biggest shifts in transport policy in decades had its origins in online conspiracy theories. From that, we can see the potential that such theories have to impact on Government policy.

That is one reason why foreign powers have used prominent social media platforms to seek to influence elections and disrupt civic debate. We know from extensive investigations by the US and UK security organisations that Russian state security services conduct operations via social media in order to influence the results of elections. The potency of infiltrating echo chambers and manipulating those inside can have national consequences. We have elections across the world this year, including in this country and the United States, so tackling the issue now is incredibly important.

When I have constituents who seem to believe that I and the majority of people in this place are lizards, who believe that I want to deliberately stop them from moving about freely, or who recite to me with unwavering certainty any number of other examples of absurd but dangerous conspiracy theories, we have to take seriously the threat to the democratic process that this represents. It is no coincidence that many of those online conspiracy theories have very negative things to say about UK politicians and the political process; the people peddling this stuff have no interest in seeing western liberal democracies flourish. They want to see them fail. It is fair to ask whether democracy can function properly when people are so trapped in their own warped realities that they cannot see any other viewpoint than their own, they immediately distrust anything that comes from an official source, and they cannot agree with others on basic, previously uncontested, facts.

We know trust in politics and politicians is at an all-time low. Those who end up in online bubbles tend to have zero confidence in politicians and the political process. Some people might say, “Well, so what? There have always been people who do not trust authority. There have always been people who have a propensity to believe conspiracies and operate outside the mainstream of society.” It is clear that those numbers are on the rise, their influence is growing, and there is a concerted effort by people hostile to this country to increase their ranks.

We cannot afford to be blasé. Our liberal democracy is fragile enough as it is, and it cannot be taken for granted. It has to be protected, defended and supported by us in this place as the guardians of democracy. It is not enough for us to be simply participants in the political process; we need to be its guardians. In this place, we debate and argue over interpretation of facts, but we do so within a framework where we are at least debating within the same reality. We also share a common understanding that if our arguments do not succeed on this occasion, the democratic process ensures that we will have another chance sometime. However, having so many and growing numbers of people who do not share the same reality as us and do not think that democracy works represents a real threat to democracy as a whole. We should not write those people off: we should try to engage with them as much as possible. However, we must also take on the source of their discomfort, challenge their beliefs and really lift the lid on who is pushing the disinformation and why. If we ignore that, it will grow, and before we know it there will be enough people sufficiently motivated to take matters into their own hands that this will not be just a dark corner of the internet: it will be on every street corner, and by then it will be too late.

15:10
Damian Collins (Folkestone and Hythe) (Con)

I congratulate my hon. Friend the Member for Weston-super-Mare (John Penrose) on securing the debate. It could not be more important or timely; as he alluded to in his speech, half the world is voting this year. We have already seen some of those elections take place in Taiwan and Bangladesh, and in America the Republican party had the Iowa caucus last night. It is interesting to see a demonstration of the impact of disinformation on ordinary people going about their daily lives. It has been heavily reported in America that 60% of Republicans voting in the caucus believe that the last presidential election was stolen and illegitimate, and that Donald Trump was the rightfully elected President.

The challenge of disinformation is not just from foreign state interference. When we first started talking about the issue some five or six years ago, we were looking principally at the influence networks of Russia and Iran and their ability to try to reshape the way in which people saw the world and the institutions in their own countries to sow fear and discord and make people distrust the institutions of their own society, the legitimacy of their courts, the freedom of their elections and the truth of their media. However, it is happening in our society as well. The pandemic was a demonstration of the potency of conspiracy networks such as QAnon to persuade people that the vaccine was not safe, and we see the same potency today in efforts to persuade people that our public institutions and elections are not safe. It is being done to undermine the fabric of democracy. There is a lot more to being a democracy than holding elections, and having faith in our institutions, trusting our media and trusting the news and information that we get are all essential to the citizen’s job of casting their vote every four or five years to determine who should run their country. If that is attacked and undermined, it is an attack on our entire democratic way of life. This year, we will see that challenge in a way that we have not seen before, with a level of technical sophistication that we have not seen before, and we should be concerned about it.

I will respond briefly to the remarks by the hon. Member for Glasgow South (Stewart Malcolm McDonald) in his speech. I was briefly the Minister responsible for the Counter Disinformation Unit, and I thought that I had better meet it, because it is not a particularly public-facing organisation, to see what it had to say. The Government quite rightly have different strategies for dealing with disinformation across Government: some of it is led by policing and security; some of it is led by looking at bad actors internally; and some of it is led by the Foreign Office and Ministry of Defence looking at bad actors externally. Different threats should trigger different responses: some that respond with news and information that challenge conspiracy theories and networks, and some that identify networks of disinformation being controlled and operated by foreign states against which we want companies and platforms to take action. That was included in the National Security Act 2023 last year, and the Online Safety Act places a further obligation on companies to act in response to intelligence reports that they receive. If they do not take action against those known networks of disinformation controlled and run by hostile foreign states, action can be taken against the companies as well.

That is why the Online Safety Act was so important; it creates, for the first time, the principle of liability of platforms for the information that they distribute and promote to other users. Central to the debate on the Bill that became the Online Safety Act was finally answering the false question that was posed all the time: are platforms, such as Facebook, actually platforms or publishers? They do not write the content, but they do distribute it. People have first amendment rights in America to speak freely, and we have freedom of speech rights in this country—that is not the same as the right actively to be promoted to millions of people on a social media platform. They are different things. The companies promote content to users to hold their attention, drive engagement and increase advertising revenue. It is a business decision for which they should be held to account, and the Online Safety Act now gives a regulator the power to hold companies to account for how they do that.

I listened carefully to what my hon. Friend the Member for Weston-super-Mare said about whether we could borrow from the broadcasting code to try to create standards. Can we break filter bubbles by trying to give people access to different sorts of information? I think this is a difficult area, and there are subtle differences between a broadcaster and a social media platform. It is true that they both reach big audiences. It is also true that social media platforms exercise editorial decisions, just like a broadcaster does. However, the reason why it was so important for broadcasting and broadcasting licences to make sure that there were fair standards for balance and probity was that there were not that many broadcasters when the licences were introduced. The list has now grown. People tuning in do not necessarily know what they will get, because the content is selected and programmed by the programme maker and the channel.

I would say that social media have become not broadcast media, but the ultimate narrowcast media, because the content to which people are being exposed is designed for them. An individual’s principal experience of being on social media is not of searching for things, but of having things played and promoted to them, so the responsibility should lie with companies for the decisions they make about what to promote. There is nothing wrong with people having preferences—people have preferences when they buy a newspaper. I am sure that when the hon. Member for Strangford (Jim Shannon) watches services by Rev. Ian Paisley on YouTube, he does not want to get a prompt saying, “You’ve had enough this week. We’re going to give you some content from the Sinn Féin party conference.” We do not want that kind of interference going on. People have perfectly legitimate viewing habits that reflect their own preferences. The question is, do platforms push and actively promote conspiracy theories and fake news? I think they do, and there is evidence that they have done so.

I will mention one of the clearest examples of that in the brief time I have left. In the 2020 US presidential election, the platforms agreed, under pressure, to give far greater prominence to trusted news sources in their newsfeeds, so that people were far more likely to see content from a variety of different broadcasters. It was not necessarily all from CNN or Fox News—there could be a variety—but it was from known and legitimate news sources as a first preference. The platforms downgraded what they call civic groups, which are the friends and family groups that are often the breeding ground for conspiracy theories. One reason why they often spread so quickly is that people push them on their friends, who look at such content because it has come from someone they know and trust. However, when the platforms changed the ranking and promotion factor, it had a big impact: it dampened down disinformation and promoted trusted news sources, but it also reduced engagement with the platform. After the election, Facebook reversed the change and the conspiracy theorists were allowed to run riot again, which was a contributing factor in the insurrection we saw in Washington in January 2021.

Companies have the ability to make sure that fair and trusted news gets a better crack, which is absolutely essential in this digital age. They should be very wary about allowing AI to use content and articles from legitimate news organisations as training data to create what would effectively become generic copies to sell advertising against, steering people away from journalism that people have to pay for and towards free content that looks very similar but is far less likely to be trustworthy. We need to get the news bargaining code right so that proper news organisations do not see their content being distributed for free, and ads sold against it by other people, without getting fair remuneration. These are things we can do to protect our news ecosystem, and the Online Safety Act is essential for making sure that Ofcom holds companies to account for actively promoting known sources of conspiracy theories and disinformation. It is important to tackle the big threat to democracy, just as it is important to combat fraud and protect citizens from financial harm.

Several hon. Members rose—

Sir Mark Hendrick (in the Chair)

I am conscious of the time, so I will limit Back-Bench contributions to six minutes each.

15:18
Jim Shannon (Strangford) (DUP)

I thank the hon. Member for Weston-super-Mare (John Penrose) for securing the debate and for giving us all an opportunity to participate. I am not technically minded, but he mentioned TikTok. In all honesty, I have no idea how it works, but my staff do, so I let them look after all the correspondence and contacts. I thank other Members for their significant contributions to the debate, and for the contributions that will follow.

We are here to discuss a critical issue that affects the very fabric of our society. In an era dominated by digital connectivity, the internet has become an indispensable tool for information transmission and exchange. It has also given rise to filter bubbles and echo chambers that reinforce our existing beliefs and shield us from alternative perspectives. I am very fortunate to have my office staff, who challenge me every day, so I never get an easy passage, so to speak. If I say something, they will always say, “Look, here’s the other side of that story.” It is good to have that challenge, because it keeps us sharp and focused on the issue, making us better understand the direction we are taking.

We live in a time when misinformation and disinformation can spread like wildfire, influencing public opinion, shaping public discourse, and even undermining the very foundations of our democratic systems. It is imperative that we address this issue head-on and take collective action to prevent the further entrenchment of filter bubbles in our online spaces. I am fortunate to have had a very good friend for some 45 or 46 years. If ever I have a problem or need some advice, it is his wisdom I go to. He never tells me what I want to hear; he tells me what I need to hear. That helps us form our policies, strategies and thoughts for the way forward.

First and foremost, we must acknowledge the role that social media platforms play in shaping our online experiences. These platforms, while providing a valuable means of communication, also contribute to the creation of filter bubbles by tailoring content to suit our preferences. To combat this, we must advocate for transparency and accountability from these platforms. They must disclose how their algorithms work and take responsibility for the unintended consequences of creating echo chambers.

Education is the most powerful tool in the fight against misinformation. We need to equip individuals with the critical thinking skills necessary to evaluate information critically, discern credible sources from unreliable ones, and challenge their preconceived notions. By fostering media and digital literacy, we empower citizens to navigate the vast online landscape using good judgment, balanced with a healthy scepticism. They say that as we grow older, we become more cynical. I would say that, no, we become sceptical. We are shaped by decisions and experiences, by those around us, and perhaps by the realities of life as well.

Collaboration between Government, technology companies and civil society is essential. We must work together to develop and implement policies that promote transparency, accountability and the ethical use of algorithms. Government should invest in initiatives and strategies that promote media literacy, while technology companies should prioritise the ethical design of their algorithms to mitigate any unintentional amplification of misinformation and disinformation. This sounds very technical, but the fact is that we need to be wise, sensible and aware. That is what we are saying. By integrating these strategies into their practices, the IT sector can contribute significantly to the prevention of misinformation and disinformation in online filter bubbles.

There must also be encouragement for our online social media platforms to become more diverse. By engaging with individuals who hold different perspectives, we can burst the filter bubbles that insulate us from alternative viewpoints. This not only makes for a more robust and resilient society; it helps in breaking down the walls that misinformation and disinformation build around us. What steps will the Minister’s Department take to engage with platforms such as Meta and Google in building a more user-educated and factually informed society? We need to be aware of the power of the media and those companies, which influence our young people—my generation, maybe not as much—because of access.

I conclude by suggesting that tackling misinformation and disinformation in online filter bubbles requires a multi-faceted approach that holds technology companies and platforms accountable and promotes education, critical thinking and collaboration. By taking these steps, we can strive towards a digital landscape that promotes the free exchange of diverse ideas, with the benefits of a more informed and connected society. With the help of the Minister, we can do that. I appreciate the contributions of all those who have spoken in the debate today and those who will speak after me. I believe we are all on the same page. We just need to do it better.

15:24
Alex Davies-Jones (Pontypridd) (Lab)

It is a pleasure to speak in this debate under your chairship today, Sir Mark. I thank the hon. Member for Weston-super-Mare (John Penrose) for securing this timely debate on such an important issue.

Let us be clear that the Online Safety Act is an extremely important and very long-overdue piece of legislation. In reality, however, gaps remain in that legislation that need to be addressed. In this debate, we have heard about what are hopefully some positive ways forward.

There is huge cross-party consensus. It is a shame and a frustration that, when cross-party amendments were tabled to the Online Safety Bill, they were not taken seriously enough in Committee and were not carried forward. We raised serious questions about social media platforms, their responsibility to protect users, young and old, and to tackle the rise in disinformation. That was a clear opportunity that has now sadly passed, but the fact remains that this issue needs to be tackled.

This debate is timely, but when the Bill was progressing through Parliament, the debate focused on misleading information around the conflict in Ukraine. We all know that an alarming amount of information has recently been shared online regarding the situation in Israel and the middle east. As the hon. Member for Brigg and Goole (Andrew Percy) mentioned, the horrendous atrocities that occurred on 7 October were livestreamed by Hamas. They wore GoPros and uploaded the footage directly to social media platforms, yet an incredible number of people still did not believe it, saying it was not true or that it was a hoax. How far do we have to go for women in particular to be believed when they report crimes against them, and for those crimes to be taken seriously? I cannot help but think that if the Government had listened to the concerns that I and others raised at that time, then we would be in a much better position to deal with these issues. Sadly, we are where we are.

As colleagues have mentioned, we also need to consider the role that AI plays in relation to misinformation and disinformation, particularly the impact of generative AI. That has the power and the potential to be at the forefront of economic growth in the country but, as others have mentioned, with a huge number of elections happening across the world this year, there has never been a more crucial time to tackle the spread of this misinformation and disinformation and the impact that it could have on our democracy. I would be grateful if the Minister reassured us that the Government have a plan; I would welcome his assurances, specifically in light of whatever discussions he has had with the Electoral Commission regarding plans to tackle this ahead of the next UK general election.

The Minister knows that, despite the passing of the Online Safety Act, many of the provisions in the legislation will not be in place for quite some time. In the meantime, Twitter—now X—has given the green light for Donald Trump’s return. Political misinformation has increased since Twitter became X, and right-wing extremists continue to gain political traction on controversial YouTube accounts and on so-called free speech platform Rumble. Platforms to facilitate the increase in political misinformation and extremist hate are sadly readily available and can be all-encompassing. As colleagues have rightly noted, that is nothing new. We only need to cast our minds back to 2020 to remember the disturbing level of fake news and misinformation that was circulating on social media regarding the covid pandemic. From anti-vaxxers to covid conspiracists, the pandemic brought that issue to the forefront of our attention. Only today, it was announced in the media that the UK is in the grip of a sudden spike in measles. Health officials have had to declare the outbreak a national incident, and the surge has been directly linked to a decline in vaccine uptake as a result of a rise in health disinformation from anti-vax conspiracy theories. That causes real-world harm and it needs to be addressed.

Misinformation causes anxiety and fear among many people, and I fear that the provisions in the Act would not go far enough if we faced circumstances similar to the pandemic. We all know that this is wide-ranging, from conspiracy theories about the safety of 5G to nonsense information about the so-called dangers of 15-minute cities, about which my hon. Friend the Member for Ellesmere Port and Neston (Justin Madders) spoke so ably. Sadly, those conspiracy theories were not just peddled by lone-wolf actors on social media; they were promoted by parliamentarians. We have to take that issue very seriously.

There are dangerous algorithms that seem to feed off popular misinformation and create these echo chambers and filter bubbles online. They have not helped but have amplified the situation. Would the Minister explain why the Government have decided to pursue an Online Safety Act that has failed to consider platforms’ business models and has instead become entirely focused on regulating content?

Moving on, possibly my biggest concern about misinformation and disinformation is the relationship between what is acceptable online and what is acceptable offline. As we all know, the issue of misinformation and disinformation is closely connected to online extremism. Although the Minister may claim that people can avoid harm online simply by choosing not to view content, that is just not realistic. After all, there is sadly no way to avoid abuse and harassment offline if individuals choose to spout it. In fact, only recently, when I dared to raise concerns and make comments here in Parliament about online misogyny and incel culture and ideology, I experienced a significant level of hate and harassment. Other colleagues have had similar experiences, as we have heard today.

This is a worrying trend, because we all know that online extremism can translate into the radicalisation of people in real-life terms, which can then heighten community tensions and make minority groups more susceptible to verbal and physical abuse and discrimination.

Online harm costs the UK economy £1.5 billion a year; we cannot afford to get this wrong. Online harm not only has that real-world impact, but it puts our country at an economic disadvantage. Recent research has shown that when there is a spike in online abuse towards one specific demographic, that translates to real-world abuse and real-world harm two weeks later, as the police themselves have said. There is only a two-week lag before online harm results in real-world attacks on certain individuals. No one should have to fear for their safety, online or offline.

In short, the Online Safety Act 2023 had the potential to be as world-leading as it was once billed to be, but the Minister must know that it is far from a perfect piece of legislation, particularly when it comes to misinformation and disinformation.

I am clearly not alone in expressing these views. I hope that the Minister has heard the concerns raised in this wide-ranging debate, and that he will take some of the proposals seriously and move them forward.

Sir Mark Hendrick (in the Chair)

We now move to the contributions from the Front Benches.

15:30
John Nicolson (Ochil and South Perthshire) (SNP)

I thank the hon. Member for Weston-super-Mare (John Penrose) for securing this debate. He and I sat together recently at a Quaker dinner in London, where we discussed disinformation and the coarsening of public debate, and I think that the small cross-party group present at the event all agreed that social media had been one of the driving factors behind that, if not the only one.

In 2015, as a new MP and a new user of social media, it took me quite some time to adapt. At first, I thought that when people wrote to me on Twitter the rules of normal social intercourse applied—that I might disagree with someone but if I responded courteously and offered facts, a respectful dialogue would then ensue or we could agree to disagree amicably.

Historywoman, a professor from Edinburgh University no less, soon disabused me of that view. The venom was staggering, and apparently it was simply because we disagreed on facts about the constitution; she screamed abuse. Then there was Wings Over Scotland, with more eyeball-bulging, temple-throbbing hate. I had offered some facts about trans people, which he did not like; in fact, he hated them so much that he pounded his keyboard for months in a frenzy.

I got to understand the concept of pile-ons when a sinister organisation called the LGB Alliance decided to reward folk who gave them money by reposting disinformation and abuse about me from their account—a charity account, no less. Finally, when someone called me a “greasy bender” and Twitter moderators judged that comment to be factual and fair comment, I realised that courteous replies did not quite cut it and I became a fan of the block button.

Why are these people so angry, and why do they believe that they can behave online in a way that would be considered certifiable offline? I sit on the Culture, Media and Sport Committee, which has undertaken long and detailed inquiries into disinformation and misinformation and the impact of online filter bubbles. So what are filter bubbles? They are the result of algorithms designed to increase user engagement rather than correct inaccuracies; in other words, they are designed to show people content, again and again, that reflects their existing biases as viewers. For some people, that can be innocent enough—I seem to be directed towards endless posts and films about house restoration options and Timothée Chalamet’s latest outfits—but for others, the filter bubbles are far from benign. Indeed, Facebook itself warned its own staff that its algorithms

“exploit the human brain’s attraction to divisiveness”.

What does that mean in practice? It means that if someone watches one conspiracy video, the chances are 70% or more that another conspiracy video reinforcing their paranoia will be recommended for them to watch immediately afterwards. The result is to drive some users into a frenzy. This is why some people blow up 5G masts, convinced that they are the cause of covid. It is not just the underprivileged and ignorant who fall prey; even graduates of the world’s most elite universities can become victims. Donald Trump thought that injecting bleach could cure covid and we now know from the covid inquiry that Boris Johnson wondered whether blowing a hairdryer up his nostrils might save him from the pandemic.

Filter bubbles pose an enormous threat to our democracy. We know how heavily engaged Vladimir Putin was in encouraging people to vote for Brexit by spreading disinformation online. He believed that Brexit would weaken the European Union and Britain’s economy. He was successful but only half right. In the United States, swept away in a tsunami of ignorance, prejudice and shared disinformation, those who stormed the Capitol believed that the victor had lost and the loser had won. Who knew that one of the world’s great democracies would be so vulnerable?

At the Select Committee, we have heard distressing stories about vulnerable young people fed content persuading them to commit suicide. One father’s testimony was especially harrowing, and I will never forget it. So what responsibility should Members of Parliament take? Surely we should have been much tougher, and dealt much sooner with cynical and unscrupulous social media companies that are driven only by profits and scared only by threats to those profits.

Of course, politicians are directly responsible for the way in which disinformation that they initiate is spread offline and online. All of us—at least almost all—condemned Nigel Farage’s overtly racist Brexit campaign poster, with its image of outsiders supposedly queuing to get into the UK; it had hideous echoes of the 1930s. But what of the much mocked and seemingly more innocuous Tory conference speeches last September? Delegates were told that the UK Government had prevented bans on meat and single-car usage, and had stopped a requirement for us all to have seven household bins. The claims were risible, false and mocked but, strikingly, Cabinet Minister after Cabinet Minister tried to defend them when questioned by journalists. Does it matter? Yes, it does. It has a corrosive effect on voters’ trust. Knowingly spreading disinformation helps only those who would undermine our democratic institutions. Some call it post-truth politics: conditioning voters to believe no one and nothing—to believe that there is no difference between truth and lies, and no difference between “Channel 4 News” and GB News.

Our Committee found that there have been repeated successful attempts by bad-faith actors to insert their talking points into our democratic discourse online. The social media companies have shown little interest in tackling them, and they were disdainful witnesses when we summoned them. Disturbingly, we have also seen our once proudly independent broadcasting sector polluted by the arrival of GB News, which challenges long-standing, universally accepted standards. Its aim is to become as successful as Fox News in the dissemination of on-air propaganda, online and offline. We all hope that the Online Safety Act 2023 will help but, alas, I fear that the evidence hitherto suggests that our woefully passive regulator, Ofcom, will continue to be found wanting.

15:38
Chris Evans (Islwyn) (Lab/Co-op)

I congratulate the hon. Member for Weston-super-Mare (John Penrose) on securing this debate, which has come at a very important time as we face an election year, not only in this country but across the world. It was a theme developed by the hon. Member for Glasgow South (Stewart Malcolm McDonald) when he talked about the vigilance that we all must demonstrate in the coming months and years as, whatever our political stripe, lots of fake news and information will be thrown at us. I was particularly interested to listen to his views on the recent election in Taiwan and the interference of China. It is sad that the pedlar of fake news himself last night won a huge victory in the Iowa caucuses, and I do hope that the America that elected Barack Obama will come to the fore in November.

I was particularly saddened to listen to the hon. Member for Brigg and Goole (Andrew Percy). I have known him since we were both elected in 2010, and he has been a doughty fighter for social justice for those whose voices have not been heard. He has been a very strong advocate for his constituents and he is one of the most patriotic people I have ever met, so when I hear of the accusations he has faced online, it fills me with sadness—not only because I am a Member of Parliament, like him, but because the world we are living in makes it extremely difficult to put any view across. It means that people who come here, especially women and those from an ethnic minority, can sometimes be afraid to speak because of the abuse they will get online from people outside this place, and very often outside this country. As many have said, that is ultimately a danger to our democracy.

My hon. Friend the Member for Ellesmere Port and Neston (Justin Madders) is right to say that the people who write to us with these crazy conspiracy theories actually believe them and nothing can be said to change their minds. I have someone who writes to me every week with increasingly outlandish views about what the next Labour Government will do. As often as I tell him that he is completely wrong, he tells me that I am a liar and he knows better than everybody else. What can we say to these people?

The hon. Member for Folkestone and Hythe (Damian Collins), a former Chair of the Select Committee, talked about the Online Safety Act, which was one of those rare occasions when the entire House comes together. He is right that social media platforms finally need to answer the question of whether they are just platforms or whether they are publishers. They should be held to account, because ultimately they are the mouthpiece for these crazy, odd, eccentric conspiracy theories that have permeated our society.

The hon. Member for Strangford (Jim Shannon), who has probably not missed a Westminster Hall debate since the hon. Member for Brigg and Goole and I were elected in 2010, spoke about ensuring that online platforms can be diverse; he made a great contribution, as always. I must congratulate my hon. Friend the Member for Pontypridd (Alex Davies-Jones), who was the shadow Minister before me and has proven to be quite a hard act to follow—[Interruption.] Who said that? [Laughter.] I pay tribute to her work in her new role on violence against women and girls. I know that she will be a strong advocate for them, as she showed in her passionate speech, and I thank her for all her work in this area; I think the entire House would agree with me. I also listened to a very passionate and powerful speech by the hon. Member for Ochil and South Perthshire (John Nicolson)—I hope I pronounced that correctly, because people very often mispronounce my constituency. He gave many sad examples that were all too true; sadly, what he described has become all too familiar.

The concept of filter bubbles captures how digital platforms personalise information based on an individual’s web history. These personalised digital environments create universes of information tailored to individual preferences, opinions and beliefs. The result is that information is pushed into a person’s feed by the algorithm even if it is not necessarily true; because it aligns with the person’s existing beliefs, it may be taken as fact. In the realm of digital thought bubbles, individuals are primarily exposed to content aligned with their existing views, potentially fuelling polarisation and diminishing mutual understanding. The challenge we face, as highlighted by the Writers’ Institute, is to navigate a society where finding common ground becomes increasingly elusive.

As we have heard today, we MPs are more than familiar with echo chambers. Most of us can see that echo chambers or filter bubbles affect others; accepting that they affect ourselves is a more challenging task. When discussing this topic, we think of Americans with the Fox News logo burned on to their television screens, or our conspiracy theorist uncle sitting there in his tin foil hat, yet we fail to consider that we ourselves are scrolling through Twitter or Instagram, instantly consuming the posts we enjoy.

On a lighter note, you will be pleased to learn, Sir Mark, that through numerous posts I have discovered that Manchester City is the greatest team in the world. I know that you are a long-time supporter, so I am sure you will tell me that that is absolutely correct and reaffirms what you already know to be true. But in the interests of honesty, among our hon. Friends, I think you might concede that the algorithm is feeding us posts that may be biased or that tell us what we would like to hear. Members may think, from my example of football, that these clever algorithms are not particularly harmful, but as many have said, they have negative and dangerous consequences. They limit our freedom of thought and are a danger to the democratic freedoms that we have enjoyed throughout the years in this country and around the world, because within those filter bubbles divisive ideologies can take root and thrive, leading to the erosion of trust in our institutions.

We cannot ignore the fact that these bubbles are a by-product of algorithms designed to maximise user engagement. Although they keep us engaged, they can simultaneously trap us in a feedback loop of our own preconceptions. The danger lies in the fact that citizens become increasingly susceptible to manipulation, as misinformation tailored to their worldview becomes indistinguishable from reality.

Recent research has shown that absorption into these thought bubbles is neither inevitable nor a passive process. As my hon. Friend the Member for Ellesmere Port and Neston said, Oxford University research suggests that filter bubbles affect not the majority but somewhere between 6% and 8% of the UK population. As my hon. Friend said, that might sound like a small figure, but it is millions of people.

What sets that 6% to 8% of people in echo chambers apart from those who are not in them? The primary causal mechanism is self-selection: individuals actively choose to immerse themselves in echo chambers because they prefer news that aligns with and reinforces their existing views. It is not a process of hypnosis by the Twitter algorithm, over which one has no control; it happens through an active dismissal of news sources that do not agree with one’s opinion.

Recent studies have gone as far as to suggest that, for some, passive personalisation results in a greater breadth of sources. That is because passive personalisation is shown to increase the probability of algorithms suggesting additional news content to individuals already immersed in news consumption. For those who are less likely to seek out the news actively, it promotes news in the first place. For people who have no interest in current affairs, these algorithms produce a wider variety of news than they would otherwise see.

As such, the filter bubble theory does not seem to be comprehensive. In many cases, algorithmic selection leads to slightly more diverse news than if no algorithm had been used. It is easy to see how many older people, or those who do not have smartphones, simply consume the news by reading the same paper every day. I must admit publicly that my grandparents were avid readers of the Daily Mail and believed everything it said—imagine the conversations when I became a Member of Parliament.

Many people took their paper’s stance as gospel, as it aligned with their own political and social views. Now we can google a news story and hundreds of different stances are presented to us immediately, along with the ability to discuss and engage with those who do not agree with us. Of course, even if the proportion of people in these thought bubbles is small, that does not mean the issue is not dangerous. We should work towards nobody being in a thought bubble, and I believe that proper education can help, giving people the skills to spot when they are in a thought bubble and arming them with the tools to get out.

In an era dominated by digital connectivity, the ability to navigate the vast sea of information online has become an essential skill. Sir Mark, I can see that I am pushed for time, but I will try to speak on this issue as quickly as I can. One key aspect of cultivating digital literacy is understanding how online platforms curate content and how thought bubbles form. A well-rounded education in digital skills plays a pivotal role in equipping individuals with the tools necessary to avoid entrapment in these echo chambers. An informed understanding of the process is critical, as is educating individuals about algorithms.

As a Welsh MP, I should raise the example of Wales. Welsh schools have introduced a digital competence framework, which teaches children from the age of three how to find and use information on the internet responsibly, encouraging fact-finding and verification. The level of skill taught gradually increases until the age of 16, so that when young people first navigate the wide world of social media, they are best placed to curate their own social media use with nuance.

As I said to someone this morning, by the age of six it is often too late; children already have exposure to social media platforms and devices. At one of my first events as shadow Minister, I saw the amazing example of the Kingston University digital skills campaign, under which every student must pass an exam in a digital skills course. That enables students to be confident in their media literacy and resilient in the face of thought bubbles.

We face something we have never faced before: all that we have known to be true is in danger. It is only through education and debates such as this that we can get to grips with those who seek to bring down our democracy.

15:49
The Parliamentary Under-Secretary of State for Science, Innovation and Technology (Saqib Bhatti)

I am conscious of time and of the broad range of this debate, but I will try to address as many issues as possible. I commend my hon. Friend the Member for Weston-super-Mare (John Penrose) for securing this important debate on preventing misinformation and disinformation in online filter bubbles, and for all his campaigning on the subject throughout the passage of the Online Safety Act. He engaged with me particularly in the run-up to today’s well-informed debate, for which I thank hon. Members across the Chamber.

May I echo the sentiments expressed towards my hon. Friend the Member for Brigg and Goole (Andrew Percy)? I thank him for sharing his reflections. I was not going to say this today, but after the ceasefire vote I myself have faced a number of threats and a lot of abuse, so I have some personal reflections on the issue as well. I put on the record my invitation to Members across the House to share their experiences. I certainly will not hesitate to deal with social media companies where I see that they must do more. I know anecdotally, from speaking to colleagues, that it is so much worse for female Members. Across the House, we will not be intimidated in how we vote and how we behave, but clearly we are ever vigilant of the risk.

Since the crisis began, the Technology Secretary and I have already met the large social media platforms X, TikTok, Meta, Snap and YouTube. My predecessor—my hon. Friend the Member for Sutton and Cheam (Paul Scully)—and the Technology Secretary also held a roundtable with groups from the Jewish community, such as the Antisemitism Policy Trust. They also met Tell MAMA to discuss anti-Muslim hatred, which has been on the rise. I will not hesitate to reconvene those groups; I want to put that clearly on the record.

It is evident that more and more people are getting their news through social media platforms, which use algorithms. Through that technology, platform services can automatically select and promote content for many millions of users, tailored to them individually following automated analysis of their viewing habits. Many contributors to the debate have argued that the practice creates filter bubbles, where social media users’ initial biases are constantly reaffirmed with no counterbalance.

The practice can drive people to adopt extreme and divisive political viewpoints. This is a hugely complex area, not least because the creation of nudge factors in these echo chambers raises not so much the question of truth as the question of how we can protect the free exchange of ideas and the democratisation of speech, of which the internet and social media have often been great drivers. There is clearly a balance to be struck.

I did not know that you are a Man City fan, Sir Mark. I am a Manchester United fan. My hon. Friend the Member for Weston-super-Mare talked about fishing tackle videos; as a tortured Manchester United fan, I get lots of videos from when times were good. I certainly hope that they return.

The Government are committed to preserving freedom of expression, both online and offline. It is vital that users are able to choose what content they want to view or engage with. At the same time, we agree that online platforms must take responsibility for the harmful effects of the design of their services and business models. Platforms need to prioritise user safety when designing their services to ensure that they are not being used for illegal activity and ensure that children are protected. That is the approach that drove our groundbreaking Online Safety Act.

I will move on to radicalisation, a subject that has come up quite a bit today. I commend my hon. Friend the Member for Folkestone and Hythe (Damian Collins) for his eloquent speech and his description of the journey of the Online Safety Act. Engagement-driven algorithms have been designed by tech companies to maximise revenue by serving the content that best elicits user engagement. There is increasing evidence that recommender algorithms amplify extreme material to increase user engagement and de-amplify more moderate speech.

Algorithmic promotion, another piece of online architecture, automatically nudges the user towards certain online choices. Many popular social media platforms use recommender algorithms, of which YouTube’s is a well-known example. Critics argue that they present the user with overly homogeneous content based on interests, ideas and beliefs, creating extremist and terrorist echo chambers or rabbit holes. A multitude of online features intensify and support the creation of those echo chambers, from closed or selective chat groups to unmoderated forums.

Research shows that individuals convicted of terrorist offences rarely seek out opposing information that challenges their beliefs. Without diverse views, online discussion groups grow increasingly partisan, personalised and compartmentalised. The polarisation of online debates can lead to an environment that is much more permissive of extremist views. That is why the Online Safety Act, which received Royal Assent at the end of October, focuses on safety by design. We are in the implementation phase, which comes under my remit; we await further evidence from the data that implementation produces.

Under the new regulation, social media platforms will need to assess the risk of their services facilitating illegal content and activity such as illegal abuse, harassment or stirring up hatred. They will also need to assess the risk of children being harmed on their services by content that does not cross the threshold of illegality but is harmful to them, such as content that promotes suicide, self-harm or eating disorders.

Platforms will then need to take steps to mitigate the identified risks. Ofcom, the new online safety regulator, will set out in codes of practice the steps that providers can take to mitigate particular risks. The new safety duties apply across all areas of a service, including the way in which it is designed, used and operated. If aspects of a service’s design, such as the use of algorithms, exacerbate the risk that users will carry out illegal activity such as illegal abuse or harassment, the new duties could apply. Ofcom will set out the steps that providers can take to make their algorithms safer.

I am conscious of time, so I will move on to the responsibility around extremism. Beyond the duties to make their services safe by design and reduce risk in that way, the new regulation gives providers duties to implement systems and processes for filtering out and moderating content that could drive extremism. For example, under their illegal content duty, social media providers will need to put systems in place to seek out and remove content that encourages terrorism. They will need to do the same for abusive content that could incite hatred on the basis of characteristics such as race, religion or sexual orientation. They will also need to remove content in the form of state-sponsored or state-linked disinformation aimed at interfering with matters such as UK elections and political decision making, or other false information that is intended to cause harm.

Elections have come up quite a bit in this debate. The defending democracy taskforce, which has been instituted to protect our democracy, meets regularly; it is cross-nation and cross-Government, and we certainly hope to share more information in the coming months. We absolutely recognise the Government’s responsibility to deal with the issue and with the risks that arise from misinformation around elections. We are not shying away from this; we are leading on it across Government.

The idea put forward by my hon. Friend the Member for Weston-super-Mare has certainly been debated. He has spoken to me about it before, and I welcome the opportunity to have this debate. He was right to say that this is the start of the conversation—I accept that—and right to say that he may not yet have the right answer, but I am certainly open to further discussions with him to see whether there are avenues that we could look at.

I am very confident that the Online Safety Act will be a vital factor, through its insistence that social media companies deal with the issue and are held to account against their own terms and conditions. My focus will be squarely on the implementation of the Act, because we know that it will go quite a long way.

We have given Ofcom, the new independent regulator, the power to require providers to change their algorithms and their service design where necessary to reduce the risk of users carrying out illegal activity or the risk of children being harmed. In overseeing the new framework, Ofcom will need to carry out its duties in a way that protects freedom of expression. We have also created a range of new transparency and freedom-of-expression duties for the major social media platforms; these will safeguard pluralism in public debate and give users more certainty about what they can expect online. As I have said, the Government take the issue incredibly seriously and will not hesitate to hold social media companies to account.

Sir Mark Hendrick (in the Chair)

John Penrose has 30 seconds to wind up.

15:59
John Penrose

I thank everybody who has contributed: it shows that there is a great cross-party consensus on the need to do more. I urge the Minister to understand that, because although the Online Safety Act is good and important and does vital things, I do not think that it will be enough in this area. It used to be said, and is still true, that a lie is halfway around the world before the truth has got its boots on.

Motion lapsed (Standing Order No. 10(6)).