This has been an interesting debate on a Bill I have followed closely. I have been particularly struck by some of the arguments that claim the Bill is an attack on freedom of speech. I always listen intently to my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) and to the hon. and learned Member for Edinburgh South West (Joanna Cherry), but I think they are wrong in the conclusions they have reached about legal but harmful content. Indeed, many of the criticisms that the hon. and learned Member for Edinburgh South West made of the various platforms were criticisms of the present situation, and that is exactly why I think this legislation will improve the position. However, those Members raised important points that I am sure will be responded to. I have also been a strong advocate of the inclusion of small but high-harm platforms, as the Minister and the shadow Minister, the hon. Member for Pontypridd (Alex Davies-Jones), both know—we have all had those discussions.
In the time I have, I want to focus principally on the issue of search and on new clauses 9 and 10, which stand in my name. As the shadow Minister has highlighted, last week we were—like many people in this place, perhaps—sent the most remarkable online prompt, which was simply to search Google for the words “desk ornament”. The top images displayed in response to that very mundane and boring search were of swastikas, SS bolts and other Nazi memorabilia presented as desk ornaments. Despite awareness of that fact since, I believe, the previous weekend, and even though Google makes millions of pounds from advertising in a matter of seconds, images promoting Nazism were still available for all to see as a result of those searches.
When he gave evidence to the Bill Committee recently, Danny Stone, the Antisemitism Policy Trust’s very capable chief executive, pointed out that Amazon’s Alexa had used just one comment posted by one individual on Amazon’s website to inform potentially millions of users who cared to ask that George Soros was responsible for all of the world’s evils, and that Alexa had used a comment from another website to inform those who searched for it that the humanitarian group the White Helmets was an illicit operation founded by a British spy.
As we have seen throughout the covid pandemic, similar results come up in response to other searches, such as those around vaccines and covid. The Antisemitism Policy Trust has previously demonstrated that Microsoft Bing, the platform that lies behind Alexa, was directing users to hateful searches such as “Jews are bastards” through autocompletes, as well as pointing people to homophobic stories. We even had the sickening situation of Google’s image carousel highlighting portable barbecues in response to people searching for Jewish baby strollers.
Our own Alexa searches highlighted the issue some time ago. Users who asked Alexa, “Do Jews control the media?” received in response a quote from a website called Jew Watch—that should tell Members all they need to know about the nature of that site—saying that Jews control not only the media, but the financial system too. The same problem manifests itself across search platforms in other languages, as we highlighted not so long ago with Siri in Spanish. When asked, “Do the Jews control the media?”, Siri responds with an article stating that Jews do indeed control international media. This goes on and on, irrespective of whether the search is voice or text based.
The largest search companies in the world are falling at the first hurdle when it comes to risk assessing for harms on their platforms. That is the key point when we ask for lawful but harmful content to be addressed. It is about risk assessment—requiring companies that do not respect borders, operate globally and are in many ways more powerful than Governments to risk assess and warn about lawful but deeply harmful content that all of us in the House would be disgusted by.
At present, large traditional search services including Google and Microsoft Bing, and voice search assistants including Alexa and Siri, will be exempted from having to risk assess their systems and address harm to adults, despite the fact that other large user-to-user services will have to do so. How can it be possible that Google does not have to act, when Meta—Facebook—and Twitter do? That does not seem consistent with the aims of the Bill.
There is a lot more that I would like to have said on the Bill. I welcome the written ministerial statement last week in relation to small but high-harm platforms. I hope that as the Bill progresses to the other place, we can look again at search. Some of the content generated is truly appalling, even though it may very well be considered lawful.
I join everyone else in the House in welcoming the Minister to his place.
I rise to speak in support of amendments 15 and 16. At the core of this issue is the first duty of any Government: to keep people safe. Too often in debates, which can become highly technical, we lose sight of that fact. We are not just talking about technology and regulation; we are talking about real lives and real people. It is therefore incumbent on all of us in this place to have that at the forefront of our minds when discussing such legislation.
Labelling social media as the wild west of today is hardly controversial—that is plain and obvious for all to see. There has been a total failure on the part of social media companies to make their platforms safe for everyone to use, and that needs to change. Regulation is not a dirty word, but a crucial part of ensuring that as the internet plays a bigger role in every generation’s lives, it meets the key duty of keeping people safe. It has been a decade since we first heard of this Bill, and almost four years since the Government committed to it, so I am afraid that there is nothing even slightly groundbreaking about the Bill as it is today. We have seen progress being made in this area around the world, and the UK is falling further and further behind.
Of particular concern to me is the impact on children and young people. As a mother, I worry for the world that my young daughter will grow up in, and I will do all I can in this place to ensure that children’s welfare is at the absolute forefront. I can see no other system or institution that children are allowed to engage with that has such a glaring lack of safeguards and regulation. If there were a faulty slide in a playground, it would be closed off and fixed. If a sports field were covered with glass or litter, that would be reported and dealt with. Whether we like it or not, social media has become the streets our children hang out in, the world they grow up in and the playground they use. It is about time we started treating it with the same care and attention.
There are far too many holes in the Bill that allow for the continued exploitation of children. Labour’s amendments 15 and 16 tackle the deeply troubling issue of “breadcrumbing”. That is where child abusers use social networks to lay trails to illegal content elsewhere online and share videos of abuse edited to fall within content moderation guidelines. The amendments would give the regulators powers to tackle that disgusting practice and ensure that there is a proactive response to it. They would bring into regulatory scope the millions of interactions with accounts that actively enable child abuse. Perhaps most importantly, they would ensure that social media companies tackled child abuse at the earliest possible stage.
In its current form, even with Government amendment 14, the Bill merely reinforces companies’ current focus only on material that explicitly reaches the criminal threshold. That is simply not good enough. Rather than acknowledging that issue, Government amendments 71 and 72 let social media companies off the hook. They remove the requirement for companies to apply their terms and conditions “consistently”. That was addressed very eloquently by the hon. Member for Croydon South (Chris Philp) and the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright), who highlighted that Government amendment 14 simply does not go far enough.