Tom Gordon
Tom Gordon (Liberal Democrat - Harrogate and Knaresborough)
Commons Chamber

I thank my hon. Friend and constituency neighbour for highlighting this incredibly important issue and for the work that she does in this area. She is absolutely right to say that. The Committee heard very moving and distressing evidence that suicide and self-harm content can be, and has been, amplified by social media algorithms, and that that can play a role in suicide and self-harm, including by young people. Promoting suicide is illegal, and the Online Safety Act introduced an offence of promoting self-harm, but it does not do enough to tackle legal content that promotes suicide or self-harm, as with the rest of legal but harmful content, such as misinformation. The Committee’s recommendation that platforms should be held accountable for the algorithmic amplification of misinformation would address part of what my hon. Friend is concerned about. We hope that, in implementing those recommendations, the Government will set out how they will fully address her concerns.
I am a member of the Select Committee over which the hon. Lady presides as Chair, and I thank her and all the staff who helped with this report and inquiry. I know that many of us in this place wear not one, but multiple hats; I also sit on the Joint Committee on Human Rights, and we are doing an inquiry into AI and human rights. Some of the work from this report will be helpful and inform us in looking at some of the key issues, so, with another hat on, I thank her for that.
One of the evidence sessions that has stuck with me from working on the report is when we had social media company bosses in front of us. They talked about how they removed most of the content within 10 minutes in 90% of cases, but they did not accept responsibility for the proliferation of that data, information and content outside their own spheres. What worries me is that, at a time of rapidly advancing technology and a great pace of change, when regulation needs to keep up at a breakneck pace, companies—particularly social media companies—are unravelling and unpicking their community notes, their monitoring, and how they look for and remove information that might be harmful. Does the Chair of the Science, Innovation and Technology Committee agree that we need to do more to regulate current approaches and ensure that the companies do not backslide on their obligations?
I thank the hon. Gentleman, my fellow Committee member, for that question, as well as for his contribution to this report and the work of the Committee, which has been exceedingly valuable. He has raised a really important point; we heard evidence from Meta about Facebook content checking, and how outside the UK and the US it was moving from fact-checking to community notes, which X has also done. The Committee has recommended adopting the principle that platforms are held accountable, which must go hand in hand with those platforms setting out how they can demonstrate that accountability. The report also recommends that the Government undertake research into what effective fact-checking looks like and how misinformation is spread, because one of the things that the Committee—which is a scientific Committee—observed was the lack of real evidence in that area. That is partly because the algorithms and platforms are so opaque and secretive about how they operate.