To ask His Majesty’s Government what steps their Counter-Disinformation Unit is taking to identify and combat disinformation on social media in respect of the conflict in Israel and Palestine.
The Department for Science, Innovation and Technology takes the threat posed by disinformation in relation to the conflict extremely seriously. We are taking a three-pronged approach, working in lockstep with communities, with technology companies and across government. The Government are working to identify fake accounts, known as bots, and working closely with social media companies to ensure the removal of illegal content and content in breach of their terms of service.
Given the second Question today, perhaps the Minister will confirm that much of the work of the unit is outsourced to an artificial intelligence company, logically.ai, which I understand is based in Yorkshire. I am interested in exactly how the output of the unit is conveyed to others. The Minister has confirmed that there is active interaction with social media companies, and there is an effort to identify the sources of this misinformation, many of which are state actors. However, some are individuals in this country and elsewhere. What happens when those sources have been identified? Who takes the action further?
I will have to write to the noble Lord to confirm the Counter-Disinformation Unit’s use of logically.ai. Where the unit identifies disinformation being deployed at scale, it would first engage with the relevant ministry to allow it to respond. On occasion, it will engage directly with social media companies, if the content it is seeing either is illegal or runs contrary to the terms of service declared by that company.
My Lords, there has been a huge rise in anti-Semitism and Islamophobia on social media, much of it due to disinformation. What steps are His Majesty’s Government taking to educate the public to spot disinformation and stop them forwarding and repeating it?
My noble friend makes an important point. In the escalating battle between those pushing disinformation at us and our attempts to limit it, media literacy is key. Under the terms of the Online Safety Bill, which is due to become law in just a few days, Ofcom is obliged to produce a media literacy strategy to generate awareness of and resilience to misinformation and disinformation. It is also obliged to create an expert advisory committee on misinformation and disinformation online. In addition, there is now a media literacy programme fund that awards up to £700,000 of grant funding for media literacy programmes. All this depends on platforms setting out their terms of service clearly, so that users can use them in full knowledge of the kind of content they can expect to see.
My Lords, the EU Commission has formally opened an investigation into X, the platform previously known as Twitter, to ensure that it complies with the Digital Services Act following the outbreak of the current conflict in Israel and Gaza, Palestine. Could the Minister outline what discussions and engagement have taken place with the European Commission in relation to the Commission's and the UK's respective investigations?
On 11 October, shortly after the commencement of hostilities, the Secretary of State for DSIT convened a meeting of social media platforms. These included Google, YouTube, Meta, X, Snap and TikTok. She made her expectation very clear that not only would illegal content be removed rapidly and urgently but that authoritative content would be promoted, to create more clarity around what is accurate in this fast-moving and difficult situation. Those meetings are ongoing daily at official level and are accompanied by detailed correspondence on the actions of those platforms.
My Lords, it is good to hear that the Government are engaging with the social media platforms on this incredibly serious issue. Twitter has most aptly been renamed X, but without irony: one goes into this area with great caution, as it is distressing and nasty. I am told that X is currently laying off people whose job it is to monitor and remove posts that contain disinformation. Given this, and given the progress we have made in looking at social media platforms, what are we doing to require them to monitor and remove disinformation, rather than simply engaging with them?
I absolutely agree with the most reverend Primate about the seriousness and horror of the situation. On requiring social media companies to act, the Online Safety Bill will become law in a matter of days. It places much more rigorous requirements on the social media companies to remove content that is illegal or harmful to children and to carry only content that is consistent with their published terms of service.
My Lords, what are Ministers doing to engage with the leaders of the relevant religious communities to persuade their followers to avoid inflammatory actions and words, which are causing such trouble and intercommunity tension?
That is an important part of the Government’s approach to this very difficult, nasty situation. Last week, the Secretary of State met leaders of Jewish communities, and ongoing meetings are similarly being convened by DLUHC with all communities. We are establishing bridges between these communities and the social media platforms. One advantage they have in that dialogue is that they are accorded trusted flagger status, which greatly reduces the amount of time it takes to raise content of concern.
My Lords, the House has previously debated the role and work of the Counter-Disinformation Unit. I do not think anybody was particularly convinced by the assurances which the Minister gave back in July. These issues have been brought into sharp focus by recent events. At the time of that last debate, we were promised a meeting. Unless our Front Bench was left off the invite list, I am not aware of that follow-up meeting having taken place. Given some of the Minister’s responses today, that meeting is now more urgent than ever. Can the Minister commit to meet with those of us who are deeply concerned about this issue?
I remember the July debate very well. I made a commitment then to meet with concerned Members, which I am happy to repeat. Again, I ask that concerned Members write to me to indicate that they would like to meet. Those who have written to me have met with me.
My Lords, the Minister mentioned that the Online Safety Bill will come into law very shortly. Will he commit to setting up the advisory committee on disinformation and misinformation as soon as possible after this? The current situation clearly demonstrates both the need for it and the need for it to come to swift conclusions.
I very much share the noble Lord’s analysis of the need for this group to come rapidly into existence. It is, of course, the role of Ofcom to create it. I will undertake to liaise with it to make sure that that is speeded up.
My Lords, it was reported that a hospital had been hit. Immediately, sadly, as is the way in this modern day, the media like to break news rather than check how accurate it is. By the time we find out exactly what did happen, the damage has already been done, because the report has gone out to billions of people who wanted to believe that the Israelis did it.
That was very concerning. I am unable to comment specifically on the BBC's reporting of it. Combined with other sources of misinformation and disinformation online, that reporting greatly amplified the damage that was done. We continue to work with the social media companies to ensure that they promote authoritative versions of the truth based on their use of fact-checkers, whether independent third parties or part of their own organisations.