
Online Harms

Debate between Chris Elmore and Holly Lynch
Wednesday 7th October 2020


Westminster Hall


Holly Lynch

The right hon. Lady makes an important point. I am about to come on to some of the different ways that we need to extend the regulation that is already there. She makes the point that that information was going straight into homes; information online is coming straight into somebody’s hand in front of their face, so why do we not extend the same types of regulation to it? I will come on to that in more detail, but I thank her for that point.

As I said, 99% of 12 to 15-year-olds are online, and seven in 10 young people have experienced cyber-bullying, with nearly 40% of young people saying they experienced cyber-bullying on a high-frequency basis, according to the Royal Society for Public Health’s “#StatusofMind” report. Those of us in this Chamber know better than anyone the impact that social media is having on public discourse and on the ability to have safe spaces for the exchange of different opinions, which are vital in any democracy.

One of the reasons the Yorkshire Evening Post was so motivated to launch the Call It Out campaign was realising the impact of the barrage of online abuse directed predominantly, but not exclusively, towards its female journalists. Editor Laura Collins, who I commend for her leadership on this issue, told me this week that the sentiment of one comment on Facebook responding to an article about the local restrictions in Leeds was not uncommon: it said, “Whoever is publishing these articles needs executing by firing squad”. The newspaper reported it to Facebook on 28 September and, nine days later, has yet to receive a response.

Our “Clean Up The Internet” initiative, somewhat underwhelmed by the White Paper, feared that the Government did not have the will to truly transform the way the internet is used, so we considered what else would need to happen. Online social media platforms have said far too often that they just provide the platform and can only do so much to oversee the content shared on them, but that holds no water at all where paid ads are concerned. It is a glaring omission that the White Paper does not consider misinformation and disinformation, which can be not only shared widely for free, but promoted through online advertising.

As we have heard, advertising in print or on broadcast platforms is regulated through Ofcom and the Advertising Standards Authority, and it must be pre-approved by a number of relevant bodies. There are clear rules, powers and consequences. The internet, however, to quote the NSPCC campaign, is the “wild west”. We must therefore extend that regulation to online advertising as a matter of urgency.

The urgency is twofold. The spread of misinformation and disinformation relating to the pandemic, whether it is conspiracy theories about its origins or even its existence, fake cures or the promotion of personal protective equipment sold by bogus companies, can have fatal consequences when we are trying to combat a virus. So-called clickbait advertising and the monetisation of items dressed up as news, with the most outrageous and sensational teasers inevitably receiving the most clicks and generating the most income, mean that credible news from real journalists with integrity to both their conduct and their content, like those at the Yorkshire Post and the Yorkshire Evening Post, is being driven out of that space. The online business model does not work for those who play by the rules, because there simply are not any.

Let us move on to what else would make a difference. I hope that the Minister will be able to answer a number of questions today about the progress of legislation and regulation. We have had the initial response to the White Paper, but when can we expect to see the Bill published? If we consider that the process began when the Green Paper was published in October 2017 and that the Government have suggested it may be 2023 before new legislation comes into effect, that will be six years, which is an incredibly long time in the life of a child—almost an entire generation.

Opportunities to strengthen protections for children online have been continually missed. During lockdown, large numbers of children have been harmed by entirely avoidable online experiences. If the Government had acted sooner, those consequences may not have been as severe or widespread.

Chris Elmore (Ogmore) (Lab)

I congratulate my hon. Friend on securing the debate and thank her for her tribute to me. I pay tribute to her for the work she does in her constituency and across Yorkshire on this issue.

In terms of protection of children, one of the most concerning things I have seen during the pandemic relates to the Internet Watch Foundation, which is Government-funded and reports to the police and central Government on the number of URLs hosting paedophilia and child exploitation images. Takedowns have reduced by some 80% since the pandemic started. I have raised that with Ministers in the Cabinet Office and the Department for Digital, Culture, Media and Sport. Does she agree that the Government need to take that far more seriously and put funding in place to ensure such things can be taken down and that children are protected from the most extreme online harms?

Holly Lynch

My hon. Friend, who has vast experience in this area, references some of the most extreme and harrowing online experiences, which our children are now becoming exposed to on a regular basis. We absolutely must re-resource this area to get a grip of it and prevent children from becoming victims, which happens every day that we do not tighten up the rules and regulations surrounding the use of the internet.

I also ask the Minister whether the legislation will include, as it should, regulation of, or rather the removal of, misinformation and disinformation online. Will it seek to regulate much more of what is harmful and hateful but not necessarily criminal, from a public health perspective if nothing else? Will the proposed duty of care be properly underpinned by a statutory framework? Just how significant will the consequences be for those who do not adhere to it?

The Government announced the suspension of the implementation of an age-verification regime for commercial pornography sites on 16 October 2019, despite the fact that it needed only a commencement date. It is not at all clear why that was done or when the regime will be reintroduced. I hope that the Minister can enlighten us about when it will come into effect.

The Local Government Association has raised important concerns. Local authorities have statutory safeguarding responsibilities on issues such as child exploitation, as we have just heard, suicide prevention and tackling addiction, all of which become incredibly difficult when a child or young person—or an adult, for that matter—goes online. The LGA had to produce the “Councillors’ guide to handling intimidation”, which recognises the growing need among councillors for support in dealing with predominantly online intimidation. That is another damning indication of just how bad things have become.

I have worked with these groups on this issue and have been overwhelmed with suggestions for what more could be done. First, no one should be able to set up an entirely anonymous profile on social media platforms. The rise of bots and of people hiding behind anonymous profiles to push hate and abuse should simply no longer be allowed. People would not necessarily have to put all their information in the public domain, but they would need to provide accurate information in order to be able to set up an account or a profile. That approach is explicitly called for in two of the public petitions attached to the debate, demonstrating that it has public support. It would allow us to hold both the platform and the individuals responsible to account for any breaches in conduct.

Imagine if being held to account for posting content that the online harms Bill has already determined to be abusive, such as hateful antisemitic content, meant that an appropriate agency, be it Ofcom, the police or the enforcement arm of a new regulator, could effectively issue on-the-spot fines to the perpetrator. If we can identify the perpetrator, we can also work with the police to determine whether a hate crime has occurred and bring charges wherever possible. The increased resources that are necessary for such an approach would be covered by the revenue generated by those fines. That type of approach would be transformative. Can the Minister respond to that point—not necessarily to me, but to all those who have signed the petitions before us, which ask for that kind of thinking?

Fearing that the Government lack the will to adopt the radical approach that is required, the working group that I spoke about will look to get more and more advertisers on board that are prepared to pull their advertising from social media platforms if the sorts of transformations that we are calling for are not forthcoming. I put everyone on notice that that work is well under way.

On securing the debate, I was approached by colleagues from all parties, and I am pleased that so many are able to take part. Given just how broad this topic is, I have not said anything about extremist and radical content online, gang violence, cyber-bullying, self-harm, explicit and extreme content, sexual content, grooming, gaming and gambling, and the promotion of eating disorders. I am sure others will say more about such things, but I fear the Government will say that there is so much to regulate that they are struggling to see the way forward. There is so much there that it is a dereliction of duty every day that we fail to regulate this space and keep damaging content from our young people and adults alike.

We know that this is an international issue, and Plan International has just released the results of its largest ever global survey on online violence after speaking to 14,000 girls aged 15 to 25 across 22 countries. The data reveal that nearly 60% have been harassed or abused online, and that one in five girls have left a social media platform or significantly reduced their use of it after being harassed. This plea goes to the social media companies as well: if they want to have users in the future who can enjoy what they provide, they must create a safe space. Currently, they simply do not. It is an international issue, but we are the mother of Parliaments, are we not?

The Government seem so overwhelmed by the prospect of doing everything that they are not doing anything. I urge the Minister to start that process. Take those first steps, because each one will make some difference in bringing about the change that we have a moral obligation to deliver.