(2 years, 5 months ago)
Public Bill Committees
I want to briefly agree with the sentiments of the Opposition Front Bench, especially about the strength of the committee and the lack of teeth that it currently has. Given that the Government have been clear that they are very concerned about misinformation and disinformation, it seems odd that they are covered in the Bill in such a wishy-washy way.
The reduction of the time from 18 months to six months would also make sense. We would expect the initial report the committee publishes after six months not to be as full as those it publishes subsequently. I do not see any issue with it being required to produce a report as soon as possible to assess how the Act is bedding in and beginning to work, rather than having to wait until the Act is potentially already working properly before any assessment takes place. We want to be able to pick up any teething problems that the Act might have.
We want the committee to be able to say, “Actually, this is not working quite as we expected. We suggest that Ofcom operates in a slightly different way or that the interaction with providers happens in a slightly different way.” I would rather that problems with the Act were tackled as early as possible, but we will not know about them, because there is no proper review mechanism. There is no agreement on the committee, for example, to look at how the Act is operating. This is one of the few parts of the Bill where we have agreement to a review, and it makes sense for that review to happen as early as possible.
We agree that misinformation and disinformation are very important matters that really need to be tackled, but there is just not enough clout in the Bill to allow Ofcom to properly tackle these issues that are causing untold harm.
When I spoke at the very beginning of the Committee’s proceedings, I said that the legislation was necessary, that it was a starting point and that it would no doubt change and develop over time. However, I have been surprised at how little, considering all of the rhetoric we have heard from the Secretary of State and other Ministers, the Bill actually deals with the general societal harm that comes from the internet. This is perhaps the only place in the Bill where it is covered.
I am thinking of the echo chambers that are created around disinformation and the algorithms that companies use. I really want to hear from the Minister where he sees this developing and why it is so weak and wishy-washy. While I welcome that much of the Bill seeks to deal with the criminality of individuals and the harm and abuse that can be carried out over the internet, overall it misses a great opportunity to deal with the harmful impact the internet can have on society.
(2 years, 5 months ago)
I want to talk about a few different things relating to the amendments. Speaking from the Opposition Front Bench, the hon. Member for Pontypridd covered in depth amendment 20, which relates to being directed to other content. Although this seems like a small amendment, it would apply in a significant number of different situations. Particular mention was made of Discord for gaming, but also of things such as moving from Facebook to Messenger—all the different directions that can happen. A huge number of those directions matter to those who would seek to abuse children online, by moving from services with higher regulation or more foot traffic to areas with perhaps less moderation, so as to attack children in more extreme ways.
I grew up on the internet and spent a huge amount of time speaking to people, so I am well aware that people can be anyone they want to be on the internet, and people do pretend to be lots of different people. If someone tells us their age on the internet, we cannot assume that that is in any way accurate. I am doing what I can to imprint that knowledge on my children in relation to any actions they are taking online. In terms of media literacy, which we will come on to discuss in more depth later, I hope that one of the key things that is being told to both children and adults is that it does not matter if people have pictures on their profile—they can be anybody that they want to online and could have taken those pictures from wherever.
In relation to amendment 21 on collaboration, the only reasonable concern that I have heard is about an action that was taken by Facebook in employing an outside company in the US. It employed an outside company that placed stories in local newspapers on concerns about vile things that were happening on TikTok. Those stories were invented—they were made up—specifically to harm TikTok’s reputation. I am not saying for a second that collaboration is bad, but I think the argument that some companies may make that it is bad because it causes them problems and their opponents may use it against them proves the need to have a regulator. The point of having a regulator is to ensure that any information or collaboration that is required is done in a way that, should a company decide to use it with malicious intent, the regulator can come down on them. The regulator ensures that the collaboration that we need to happen in order for emergent issues to be dealt with as quickly as possible is done in a way that does not harm people. If it does harm people, the regulator is there to take action.
I want to talk about amendments 25 and 30 on the production of images and child sexual abuse content. Amendment 30 should potentially have an “or” at the end rather than an “and”. However, I am very keen to support both of those amendments, and all the amendments relating to the production of child sexual abuse content. On the issues raised by the Opposition about livestreaming, for example, we heard evidence two weeks ago that 75% of child sexual abuse content is self-generated. That is absolutely huge.
If the Bill does not adequately cover production of the content, whether it is by children and young people who have been coerced into producing the content and using their cameras in that way, or whether it is in some other way, then the Bill fails to adequately protect our children. Purely on the basis of that 75% stat, which is so incredibly stark, it is completely reasonable that production is included. I would be happy to support the amendments in that regard; I think they are eminently sensible. Potentially, when the Bill was first written, production was not nearly so much of an issue. However, as it has moved on, it has become a huge issue and something that needs tackling. Like Opposition Members, I do not feel like the Bill covers production in as much detail as it should, in order to provide protection for children.
Amendment 10 would create a duty to publish the illegal content risk assessment and proactively supply it to Ofcom. This is new legislation that is really a trial that will set international precedent, and a lot of the more prescriptive elements—which are necessary—are perhaps the most challenging parts of the Bill. The Minister has been very thoughtful on some of the issues, so I want to ask him, when we look at the landscape of how we regulate companies, where does he stand on transparency and accountability? How far is he willing to go, and how far does the Bill go, on issues of transparency? It is my feeling that the more companies are forced to publish and open up, the better. As we saw with the case of the Facebook whistleblower Frances Haugen, there is a lot to uncover. I therefore take this opportunity to ask the Minister how far the Bill goes on transparency and what his thoughts are on that.