Online Filter Bubbles: Misinformation and Disinformation

Debate between Saqib Bhatti and Mark Hendrick
Tuesday 16th January 2024

Westminster Hall

The Parliamentary Under-Secretary of State for Science, Innovation and Technology (Saqib Bhatti)

I am conscious of time and of the broad range of this debate, but I will try to address as many issues as possible. I commend my hon. Friend the Member for Weston-super-Mare (John Penrose) for securing this important debate on preventing misinformation and disinformation in online filter bubbles, and for all his campaigning on the subject throughout the passage of the Online Safety Act. He has engaged with me particularly in the run-up to today's well-informed debate, for which I thank hon. Members across the Chamber.

May I echo the sentiments expressed towards my hon. Friend the Member for Brigg and Goole (Andrew Percy)? I thank him for sharing his reflections. I was not going to say this today, but after the ceasefire vote I myself have faced a number of threats and a lot of abuse, so I have some personal reflections on the issue as well. I put on the record my invitation to Members across the House to share their experiences. I certainly will not hesitate to deal with social media companies where I see that they must do more. I know anecdotally, from speaking to colleagues, that it is so much worse for female Members. Across the House, we will not be intimidated in how we vote and how we behave, but clearly we are ever vigilant of the risk.

Since the crisis began, the Technology Secretary and I have already met the large social media platforms X, TikTok, Meta, Snap and YouTube. My predecessor, my hon. Friend the Member for Sutton and Cheam (Paul Scully), and the Technology Secretary also held a roundtable with groups from the Jewish community such as the Antisemitism Policy Trust. They also met Tell MAMA to discuss anti-Muslim hatred, which has been on the rise. I will not hesitate to reconvene those groups; I want to put that clearly on the record.

It is evident that more and more people are getting their news through social media platforms, which use algorithms. Through that technology, platform services can automatically select and promote content for many millions of users, tailored to them individually following automated analysis of their viewing habits. Many contributors to the debate have argued that the practice creates filter bubbles, where social media users’ initial biases are constantly reaffirmed with no counterbalance.

The practice can drive people to adopt extreme and divisive political viewpoints. This is a hugely complex area, not least because the nudge factors at work in these echo chambers raise a question less of truth than of how we can protect the free exchange of ideas and the democratisation of speech, of which the internet and social media have often been great drivers. There is obviously a balance to be achieved.

I did not know that you were a Man City fan, Sir Mark. I am a Manchester United fan. My hon. Friend the Member for Weston-super-Mare talked about fishing tackle videos; as a tortured Manchester United fan, I get lots of videos from when times were good. I certainly hope that they return.

The Government are committed to preserving freedom of expression, both online and offline. It is vital that users are able to choose what content they want to view or engage with. At the same time, we agree that online platforms must take responsibility for the harmful effects of the design of their services and business models. Platforms need to prioritise user safety when designing their services, to ensure that they are not used for illegal activity and that children are protected. That is the approach that drove our groundbreaking Online Safety Act.

I will move on to radicalisation, a subject that has come up quite a bit today. I commend my hon. Friend the Member for Folkestone and Hythe (Damian Collins) for his eloquent speech and his description of the journey of the Online Safety Act. Engagement-driven algorithms have been designed by tech companies to maximise revenue by serving the content that will best elicit user engagement. There is increasing evidence that recommender algorithms amplify extreme material to increase user engagement and de-amplify more moderate speech.

Algorithmic promotion, another piece of online architecture, automatically nudges the user towards certain online choices. Many popular social media platforms use recommender algorithms, of which YouTube's is perhaps the best-known example. Critics argue that they present the user with overly homogeneous content based on interests, ideas and beliefs, creating extremist and terrorist echo chambers or rabbit holes. There are a multitude of features online that intensify and support the creation of those echo chambers, from closed or selective chat groups to unmoderated forums.

Research shows that individuals convicted of terrorism offences rarely seek opposing information that challenges their beliefs. Without diverse views, online discussion groups grow increasingly partisan, personalised and compartmentalised. The polarisation of online debates can lead to an environment that is much more permissive of extremist views. That is why the Online Safety Act, which received Royal Assent at the end of October, focuses on safety by design. We are in the implementation phase, which comes under my remit; we await further evidence from the data that implementation will produce.

Under the new regulation, social media platforms will need to assess the risk of their services facilitating illegal content and activity such as illegal abuse, harassment or stirring up hatred. They will also need to assess the risk of children being harmed on their services by content that does not cross the threshold of illegality but is harmful to them, such as content that promotes suicide, self-harm or eating disorders.

Platforms will then need to take steps to mitigate the identified risks. Ofcom, the new online safety regulator, will set out in codes of practice the steps that providers can take to mitigate particular risks. The new safety duties apply across all areas of a service, including the way in which it is designed, used and operated. If aspects of a service’s design, such as the use of algorithms, exacerbate the risk that users will carry out illegal activity such as illegal abuse or harassment, the new duties could apply. Ofcom will set out the steps that providers can take to make their algorithms safer.

I am conscious of time, so I will move on to responsibilities around extremism. Beyond the duties to make their services safe by design, and to reduce risk in that way, the new regulation requires providers to implement systems and processes for filtering out and moderating content that could drive extremism. For example, under their illegal content duty, social media providers will need to put systems in place to seek out and remove content that encourages terrorism. They will need to do the same for abusive content that could incite hatred on the basis of characteristics such as race, religion or sexual orientation. They will also need to remove state-sponsored or state-linked disinformation aimed at interfering with matters such as UK elections and political decision making, as well as other false information that is intended to cause harm.

Elections have come up quite a bit in this debate. The Defending Democracy Taskforce, which was instituted to protect our democracy, meets regularly, and its discussions are cross-nation and cross-Government; we certainly hope to share more information in the coming months. We absolutely recognise the Government's responsibility to deal with the issue and the risks that arise from misinformation around elections. We are not shying away from this; we are leading on it across Government.

The idea put forward by my hon. Friend the Member for Weston-super-Mare has certainly been debated. He has spoken to me about it before, and I welcome the opportunity to have this debate. He was right to say that this is the start of the conversation—I accept that—and right to say that he may not yet have the right answer, but I am certainly open to further discussions with him to see whether there are avenues that we could look at.

I am very confident that the Online Safety Act, through its insistence that social media companies deal with the issue and are held to account against their own terms and conditions, will be a vital factor. My focus will absolutely be on the implementation of the Act, because we know that it will go quite a long way.

We have given Ofcom, the new independent regulator, the power to require providers to change their algorithms and their service design where necessary to reduce the risk of users carrying out illegal activity or the risk of children being harmed. In overseeing the new framework, Ofcom will need to carry out its duties in a way that protects freedom of expression. We have also created a range of new transparency and freedom-of-expression duties for the major social media platforms; these will safeguard pluralism in public debate and give users more certainty about what they can expect online. As I have said, the Government take the issue incredibly seriously and will not hesitate to hold social media companies to account.

Sir Mark Hendrick (in the Chair)

John Penrose has 30 seconds to wind up.