Sub-Committee on Disinformation Debate
Lisa Cameron (East Kilbride, Strathaven and Lesmahagow)
Debate with the Department for Digital, Culture, Media & Sport
Westminster Hall
This information is provided by Parallel Parliament and does not comprise part of the official record.
The hon. Gentleman, who is a member of the Select Committee, makes an important point. He will know that we discussed the role of groups with Facebook during our investigation. We believe they play a significant role in spreading disinformation; it is not just through targeted advertising that someone can drive content through a platform such as that. Indeed, as he knows, the Committee’s final report on disinformation touched on how far-right organisations are using closed Facebook groups with hundreds of thousands of members to spread content very quickly through the web. Content posted into the group by a group administrator goes immediately to the top of the news feed of members, who may in turn share it.
These closed groups may be closed to the public, but Facebook can tell what is going on in them, and it should act where closed groups are behaving irresponsibly or maliciously in spreading lies and disinformation about people. It can see who the administrators are and who is doing that.
As a consequence of the attacks in Christchurch in particular, we should do an audit of the sorts of groups and organisations that were sharing and promoting the vile content involved; having an independent regulator with the power to go into the tech companies to see what is going on would facilitate this. That could provide a really important map of the way in which these far-right groups, in particular, co-ordinate online and spread disinformation.
The hon. Gentleman is quite right that this is not just about global news stories such as the Christchurch attacks; disinformation is also taking place in individual communities. We should be able to report such things to Facebook and know that it will investigate and take action against groups, including closing down the group or its administrator if necessary.
I thank the hon. Gentleman and all members of the Committee for a very important report. I know that the Minister is working extremely hard on these issues.
My question is about making it easier or more streamlined for the police to investigate closed Facebook pages. At present, it seems to be very difficult for the police to access information even when they have suspicions about what is being posted. The fact that individuals can post anonymously, without giving their own details, exacerbates the sense that they can post whatever they like without any responsibility.
The hon. Lady raises a number of very important issues. Co-operation with the authorities is important. We have seen too many cases where different social media companies have been criticised for not readily sharing information with the police as part of an investigation. Often the companies have very narrow terms of reference for when they will do that: they may act if there is an immediate threat to life, or if information might be related to a potential terror attack. However, we see hideous crimes that affect families in a grievous way; those families want the crimes to be investigated efficiently and speedily, and the police to get access to any relevant information. I think we would have to say that the current system is not working effectively enough and that more should be done.
There should be more of an obligation on the companies to share proactively with the authorities information that they have observed. They might not have been asked for it yet, but it could be important or relevant to a police investigation. Part of, if you like, the duty of care of the tech companies should be to alert the relevant authorities to a problem when they see it and not wait to be asked as part of a formal investigation. Again, that sort of proactive intervention would be necessary.
I also share a general concern, in that I believe tech companies could do more to observe behaviour on their platforms that could lead to harm. That includes a vulnerable person accessing content that might lead them towards a pattern of self-harm. Indeed, one of the particular concerns that emerged from the Molly Russell case was the content she was engaging with on Instagram.
The companies should take a more proactive responsibility to identify people who share content that may lead to the radicalisation of individuals or encourage them to commit harmful acts against other citizens. I think the companies have the power to identify that sort of behaviour online, and there should be more of an obligation on them to share their knowledge of that content when they see it.