All 3 Debates between John Nicolson and Alex Davies-Jones


Online Safety Bill (Ninth sitting)

Debate between John Nicolson and Alex Davies-Jones
John Nicolson

No, I will let that particular weed die in the bed. I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones

Briefly, as with earlier clauses, the Labour party recognises the challenge in finding the balance between freedom of expression and keeping people safe online. Our debate on the amendment has illustrated powerfully that the exemptions as they stand in the Bill are hugely flawed.

First, the exemption is open to abuse. Almost any organisation could develop a standards code and complaints process to define itself as a news publisher and benefit from the exemption. Under those rules, as outlined eloquently by my hon. Friend the Member for Batley and Spen, Russia Today already qualifies, and various extremist publishers could easily join it. Organisations will be able to spread seriously harmful content with impunity—I referred to many in my earlier contributions, and I have paid for that online.

Secondly, the exemption is unjustified, as we heard loud and clear during the oral evidence sessions; I recall that Kyle Taylor from Fair Vote UK made the point particularly clearly. There are already rigorous safeguards in the Bill to protect freedom of expression. The fact that content is posted by a news provider should not in itself be sufficient reason to treat such content differently from content posted by private citizens.

Furthermore, quality publications with high standards stand to miss out on the exemption. The Minister must also see the lack of parity in the broadcast media space. In order for broadcast media to benefit from the exemption, they must be regulated by Ofcom, yet there is no parallel stipulation that non-broadcast media be regulated in order to benefit. How is that fair? For broadcast media, the requirement to be regulated by Ofcom is simple, but for non-broadcast media the series of requirements is not rational, excludes many independent publishers and leaves room for ambiguity.

Online Safety Bill (Tenth sitting)

Debate between John Nicolson and Alex Davies-Jones
Committee stage
Tuesday 14th June 2022


Public Bill Committees
John Nicolson

I do, of course, agree. As anyone who has lost a family member to suicide knows, it has a lifelong effect on the family. This is yet another amendment on which I feel we should depart from the pantomime of so much parliamentary procedure, in which both sides fundamentally agree but Ministers go through the tortuous process of trying to tell us that every single amendment that any outside body or any Opposition Member, whether from the SNP or the Labour party, comes up with has already been considered by the ministerial team and is incorporated or covered by the Bill. They would not be human if that were the case. Would it not be refreshing if there were a slight change of tactic, and just occasionally the Minister said, “Do you know what? That is a very good point. I think I will incorporate it into the Bill”?

None of us on the Opposition Benches seeks to make political capital out of any of the things we propose. All of us, on both sides of the House, are here with the best of intentions, to try to ensure that we get the best possible Bill. We all want to be able to vote for the Bill at the end of the day. Indeed, as I said, I have worked with two friends on the Conservative Benches—with the hon. Member for Watford on the Joint Committee on the draft Bill and with the hon. Member for Wolverhampton North East on the Select Committee on Digital, Culture, Media and Sport—and, as we know, they have both voted for various proposals. It is perhaps part of the frustration of the party system here that people are forced to go through the hoops and pretend that they do not really agree with things that they actually do agree with.

Let us try to move on with this, in a way that we have not done hitherto, and see if we can agree on amendments. We will withdraw amendments if we are genuinely convinced that they have already been considered by the Government. On the Government side, let them try to accept some of our amendments—just begin to accept some—if, as with this one, they think they have some merit.

I was talking about Samaritans, and exactly what it wants from the Bill. It is concerned about the harmful content that will remain accessible after the Bill is passed. This feeds into potentially the most important flaw in the Bill: it does not mandate risk assessments based exclusively on risk. By adding the qualifications of size and scope, the Bill wilfully lets some of the most harmful content slip through its fingers—wilfully, but I am sure not deliberately. Categorisation will be covered by a later amendment, tabled by my hon. Friend the Member for Aberdeen North, so I shall not dwell on it now.

In July 2021, the Law Commission for England and Wales recommended the creation of a new narrow offence of the “encouragement or assistance” of serious self-harm with “malicious intent”. The commission identified that there is

“currently no offence that adequately addresses the encouragement of serious self-harm.”

The recommendation followed acknowledgement that

“self-harm content online is a worrying phenomenon”

and should have a

“robust fault element that targets deliberate encouragement of serious self-harm”.

Currently, there are no provisions in the Bill to create a new offence of assisting or encouraging self-harm.

In conclusion, I urge the Minister to listen not just to us but to the expert charities, including Samaritans, and to the people with lived experience of self-harm and suicide who are calling for regulation of these dangerous sites.

Alex Davies-Jones (Pontypridd) (Lab)

Good afternoon, Sir Roger; it is a pleasure, as ever, to serve under your chairship. I rise to speak to new clause 36, which has been grouped with amendment 142 and is tabled in the names of the hon. Members for Ochil and South Perthshire and for Aberdeen North.

I, too, pay tribute to Samaritans for all the work it has done in supporting the Bill and these amendments to it. As colleagues will be aware, new clause 36 follows a recommendation from the Law Commission dating back to July 2021. The commission recommended the creation of a new, narrow offence of the “encouragement or assistance” of serious self-harm with “malicious intent”. It identified that there is

“currently no offence that adequately addresses the encouragement of serious self-harm.”

The recommendation followed acknowledgement that

“self-harm content online is a worrying phenomenon”

and should have a

“robust fault element that targets deliberate encouragement of serious self-harm”.

Currently, there are no provisions in the Bill to create a new offence of assisting or encouraging self-harm, despite the fact that other recommendations from the Law Commission report have been brought into the Bill, such as creating a new offence of cyber-flashing and prioritising tackling illegal suicide content.

We all know that harmful suicide and self-harm content is material that has the potential to cause or exacerbate self-harm and suicidal behaviours. Content relating to suicide and self-harm falls into both categories in the Bill—illegal content and legal but harmful content. Encouraging or assisting suicide is also currently a criminal offence in England and Wales under the Suicide Act 1961, as amended by the Coroners and Justice Act 2009.

Content encouraging or assisting someone to take their own life is illegal and has been included as priority illegal content in the Bill, meaning that platforms will be required to prevent individuals from encountering it, both proactively and reactively, and search engines will need to structure their services to minimise the risk of individuals encountering the content. Other content, including content that positions suicide as a suitable way of overcoming adversity or that describes suicidal methods, is legal but harmful.

The Labour party’s Front-Bench team recognises that not all content falls neatly into the legal but harmful category. What can be helpful to one user can be extremely distressing to others. Someone may find it extremely helpful to share their personal experience of suicide, for example, and that may also be helpful to other users. However, the same material could heighten suicidal feelings and levels of distress in someone else. We recognise the complexity of delineating between harmful and helpful content relating to suicide and self-harm, but that difficulty should not detract from tackling legal but clearly harmful content.

In its current form, the Bill will continue to allow legal but clearly harmful suicide and self-harm content to be accessed by over-18s. Category 1 platforms, which have the highest reach and functionality, will be required to carry out risk assessments of, and set out in their terms and conditions their approach to, legal but harmful content in relation to over-18s. As the hon. Member for Ochil and South Perthshire outlined, however, the Bill’s impact assessment states that “less than 0.001%” of in-scope platforms

“are estimated to meet the Category 1 and 2A thresholds”,

and estimates that only 20 platforms will be required to fulfil category 1 obligations. There is no requirement on the smaller platforms, including those that actively encourage suicide, to do anything at all to protect over-18s. That simply is not good enough. That is why the Labour party supports new clause 36, and we urge the Minister to do the right thing by joining us.

--- Later in debate ---
John Nicolson

It is an interesting question. Alas, I long ago stopped trying to put myself into the minds of Conservative Ministers—a scary place for any of us to be.

We understand that it is difficult to regulate human trafficking on platforms. It requires work across borders and platforms, with moderators speaking different languages. On the Joint Committee on the draft Bill, we established that Facebook does not have moderators who speak those different languages; indeed, we discovered that it does not moderate content even in English to any adequate degree. Just look at the other languages around the world—do we think Facebook has moderators who work in Turkish, Finnish, Swedish, Icelandic or a plethora of other languages? It certainly does not. The only language that Facebook tries to moderate—deeply inadequately, as we know—is English. We know how bad the moderation is in English, so can the Committee imagine what it is like in some of the world’s other languages? The most terrifying things are allowed to happen without moderation.

Regulating in respect of human trafficking on platforms is not cheap or easy, but it is utterly essential. The social media companies make enormous amounts of money, so let us shed no tears for them and the costs that will be entailed. If human trafficking is not designated a priority harm, I fear it will fall by the wayside, so I must ask the Minister: is human trafficking covered by another provision on priority illegal content? Like my hon. Friend the Member for Aberdeen North, I cannot see where in the Bill that lies. If the answer is yes, why are the human rights groups not satisfied with the explanation? What reassurance can the Minister give to the experts in the field? Why not add a direct reference to the Modern Slavery Act, as in the amendment?

If the answer to my question is no, I imagine the Minister will inform us that the Bill requires platforms to consider all illegal content. In what world is human trafficking that is facilitated online not a priority? Platforms must be forced to be proactive on this issue; if not, I fear that human trafficking, like so much that is non-priority illegal content, will not receive the attention it deserves.

Alex Davies-Jones

Schedule 7 sets out the list of criminal content that in-scope firms will be required to remove as a priority. Labour was pleased to see new additions to the most recent iteration, including criminal content relating to online drug and weapons dealing, people smuggling, revenge porn, fraud, promoting suicide and inciting or controlling prostitution for gain. The Government’s consultation response suggests that the systems and processes that services may use to minimise illegal or harmful content could include user tools, content moderation and recommendation procedures.

More widely, although we appreciate that the establishment of priority offences online is the route the Government have chosen to go down with the Bill, we believe it remains weak in addressing harms to adults and wider societal harms. It has seemingly missed a number of known harms to both adults and children, which is a serious omission. Three years on from the White Paper, the Government know where the gaps are, yet they have failed to address them. That is why we are pleased to support the amendment tabled by the hon. Members for Ochil and South Perthshire and for Aberdeen North.

Human trafficking offences are a serious omission from schedule 7 that must urgently be rectified. As we all know from whistleblower Frances Haugen’s revelations, Facebook stands accused, among a vast array of other social problems, of profiting from the trade and sale of human beings, often for domestic servitude, by human traffickers. We also know that, according to internal documents, the company has been aware of the problem since at least 2018. As the hon. Member for Ochil and South Perthshire said, a year later, on the heels of a BBC report that documented the practice, the problem was said to be so severe that Apple threatened to pull Facebook and Instagram from its app store. Only then did Facebook rush to remove content related to human trafficking and make emergency internal policy changes to avoid commercial consequences that the company itself described as “potentially severe”. However, an internal company report detailed that the company did not take action prior to the public disclosure and threats from Apple. It put profit over people.

In a complaint to the US Securities and Exchange Commission first reported by The Wall Street Journal, whistleblower Haugen wrote:

“Investors would have been very interested to learn the truth about Facebook almost losing access to the Apple App Store because of its failure to stop human trafficking on its products.”

I cannot believe that the Government have failed to commit to doing more to tackle such abhorrent practices, which are happening every day. I therefore urge the Minister to do the right thing and support amendment 90.

Online Safety Bill (Eighth sitting)

Debate between John Nicolson and Alex Davies-Jones
Committee stage
Thursday 9th June 2022


Public Bill Committees
Alex Davies-Jones

It is, as ever, a pleasure to serve under your chairship, Ms Rees. Amendment 65 would add organisations campaigning for the removal of animal abuse content to the list of bodies that Ofcom must consult. As we all know, Ofcom must produce codes of practice that offer guidance on how regulated services can comply with their duties. Later in the Bill, clause 45 makes clear that if a company complies with the codes of practice, it will be deemed to have complied with the Bill in general. In addition, the duties on regulated services come into force at the same time as the codes of practice. All of that makes what the codes say extremely important.

The absence of protections relating to animal abuse content is a real omission from the Bill. Colleagues will have seen the written evidence from Action for Primates, which neatly summarised the key issues on which Labour hopes to see agreement from the Government. Given this omission, it is clear that the current draft of the Bill is not fit for tackling the animal abuse, cruelty and violence that are all too common online.

There are no explicit references in the Bill to such content, which can be disturbing and distressing to those who view it, both children and adults. We now know that most animal cruelty content is produced specifically for sharing on social media, often for profit through the monetisation schemes offered by platforms such as YouTube. Examples include animals being beaten, set on fire, crushed or partially drowned; the mutilation and live burial of infant monkeys; a kitten intentionally being set on by a dog, and another being stepped on and crushed to death; live and conscious octopuses being eaten; and animals being pitted against each other in staged fights.

Content in which animals are deliberately placed in frightening or dangerous situations from which they cannot escape, or are harmed, before being “rescued” on camera is also becoming increasingly popular on social media. For example, kittens and puppies are “rescued” from the clutches of a python. Such fake rescues not only cause immense suffering to animals but are fraudulent, because viewers are asked to donate towards the rescue and care of the animals. This cannot be allowed to continue.

Indeed, as part of its Cancel Out Cruelty campaign, the Royal Society for the Prevention of Cruelty to Animals (RSPCA) conducted research which found that in 2020 there were nearly 500 reports of animal cruelty on social media, more than twice the figure reported for 2019. The majority of those incidents appeared on Facebook. David Allen, head of prevention and education at the RSPCA, has spoken publicly about the issue, saying:

“Sadly, we have seen an increase in recent years in the number of incidents of animal cruelty being posted and shared on social media such as Facebook, Instagram, TikTok and Snapchat.”

John Nicolson

I totally agree with the points that the hon. Lady is making. Does she agree that the way the Bill is structured means that illegal acts not designated as “priority illegal” will likely be put at the very end of companies’ to-do lists, and that companies will focus considerably more effort on what is designated “priority illegal” content?

Alex Davies-Jones

I completely agree with, and welcome, the hon. Gentleman’s contribution. It is a valid point, and one that we will explore further. It shows the necessity of classing this harm as a priority harm, so that we protect animals as well as people.

David Allen continued:

“We’re very concerned that the use of social media has changed the landscape of abuse with videos of animal cruelty being shared for likes and kudos with this sort of content normalising—and even making light of—animal cruelty. What’s even more worrying is the level of cruelty that can be seen in these videos, particularly as so many young people are being exposed to graphic footage of animals being beaten or killed which they otherwise would never have seen.”

Although the Bill has a clear focus on protecting children, we must remember that the prevalence of animal cruelty online can have a hugely negative impact on the children who may inadvertently see that content through everyday social media channels.