Thursday 13th January 2022

Commons Chamber
Dame Caroline Dinenage (Gosport) (Con)

I echo the words of thanks to the Joint Committee and its Clerks, under the excellent chairmanship of my hon. Friend the Member for Folkestone and Hythe (Damian Collins), for this thorough and weighty report. The Committee includes some of the big beasts in the world of online safety, and that is important, because this is one of the single most important pieces of legislation of our time. It is absolutely groundbreaking. It is vast; it is almost five Bills in one, and no country has attempted to regulate the internet so comprehensively.

The pressure is on, not least because we have got into the bad habit of describing this Bill as the cavalry coming over the hill for the online world and all the ills it contains. Someone compared it with the motor car, and it has taken decades of legislation to address the safety issues of that evolving technology, so we will never do this in a oner, but this Bill needs to be the very best possible starting point—the foundations to face the current threats, but also the challenges of the future.

I lived with this Bill for 20 months; I can talk about it forever, but I will not. Let us start at the beginning with the algorithms. We have all seen them. We start watching cute videos—in my case, it is usually of babies falling asleep in their own food, but that is probably just me—and immediately we are swept into this rabbit hole of suggested content, and it is designed to keep us engaged as long as possible, because that is where the money is: to capture our attention and sell it to the highest bidder. Do not forget, if we are not paying for the product, it is most likely we are the product.

It gets more sinister than that, though, because that same algorithm that is sending me those cute babies is recommending self-harm to a vulnerable teenager or spreading wildly dangerous disinformation about covid. Algorithms are echo chambers. They take our fears and paranoia and surround us with unhealthy voices that reinforce them, however dangerous and hateful.

According to the 2020 Netflix documentary “The Social Dilemma”, former employees of the largest social media companies who were integral to the early development of those algorithms say that addiction is built into the design. Many of them say that the platforms are so unhealthy, they would not let their own kids anywhere near them. As the report says, tackling these design risks at source is more effective than just trying to take down individual pieces of content. We saw that with the outbreak of covid, when 5G masts were burned to the ground because of some wild conspiracy theory that suggested they were the root cause of covid. 3G and 4G masts were also destroyed, because it turns out these people are not wildly bright and cannot tell the difference. We cannot censor this stuff, because that just fuels the sense of a state conspiracy. We need to stop it before it is force-fed into people and ensure that there is balance. Platforms must tackle the design features that exacerbate the risk of harm, and the legislation should include a specific responsibility for them to do it and for the regulator to enforce it.

I want to talk quickly about a couple of the specifics. There cannot be a Member of the House who has not supported a constituent devastated by online fraud. It is growing exponentially, and there is almost universal agreement that the legislation should address it. That is why we changed things from the White Paper to the draft Bill, but I agree with the Committee that the measure should be strengthened, and it should also be extended to cover paid-for advertising.

Child protection has always been a cornerstone of the Bill. I have no doubt that social media platforms are where the volume is, in terms of both content and people, and in practice it is where very young children are most likely to stumble over really unpleasant content. However, it is not enough to include only user-generated content. The Bill’s credibility will be undermined overnight if the largest commercial pornography providers can keep hosting extreme content and putting children at risk. I would therefore like to see the Bill extended, in line with the age appropriate design code. That would be a really good way of dealing with that, as the report suggests.

On categorisation, no doubt big platforms and search engines are where the volume is, but the digital world changes at lightning pace and trends go viral overnight. Platforms should not be judged on size—they must be judged on risk. Emerging platforms can be hotbeds of extremism and really unpleasant content, and they must be appropriately regulated.

A final quick note of caution about the report’s all-or-nothing tone. It makes great suggestions that would strengthen the Bill, but the Bill has been years in the making. I did a tiny bit of it. It has involved many Ministers and a team of fantastic officials, many of whom have worked on it from the beginning. The Bill is like a huge, complicated tapestry: you pull one thread and others can unravel further down the line. The online world is so fast moving—it is evolving at a rate of knots. We have to think carefully about how we change the Bill. Otherwise, it will be obsolete before the ink is dry.