Online Safety Bill (Eighth sitting)

Committee stage
Thursday 9 June 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 9 June 2022
Alex Davies-Jones

I completely agree with and welcome the hon. Gentleman’s contribution. It is a very valid point, and one that we will explore further. It shows why this harm must be classed as a priority harm, so that we protect animals as well as people.

David Allen continued:

“We’re very concerned that the use of social media has changed the landscape of abuse with videos of animal cruelty being shared for likes and kudos with this sort of content normalising—and even making light of—animal cruelty. What’s even more worrying is the level of cruelty that can be seen in these videos, particularly as so many young people are being exposed to graphic footage of animals being beaten or killed which they otherwise would never have seen.”

Although the Bill has a clear focus on protecting children, we must remember that the prevalence of cruelty to animals online has the potential to have a hugely negative impact on children who may be inadvertently seeing that content through everyday social media channels.

Jane Stevenson (Wolverhampton North East) (Con)

The hon. Lady knows that I am a great animal lover, and I obviously have concerns about children being exposed to these images. I wonder how she would differentiate between abusive images and images posted to raise awareness of the situations that animals are in. I have seen many distressing posts about the Yulin dog meat festival and about beagles being used in laboratory experiments. How would she differentiate between images that are there to raise awareness of the plight of animals and the abusive ones?

Alex Davies-Jones

I thank the hon. Lady for her contribution. Like me, she is a passionate campaigner for animal welfare. It was a pleasure to serve on the Committee that considered her Glue Traps (Offences) Act 2022, which I know the whole House was pleased to pass. She raises a very important point, and one that the Bill explores later with regard to other types of content, such as antisemitic and racist content used for education, history and fact. The Bill deals with that specifically later on, and this content would be dealt with in the same way: content used as an educational or awareness-raising tool would be distinguished from images and videos of direct abuse.

To give hon. Members a real sense of the extent of the issue, I would like to share some findings from a recent survey of the RSPCA’s frontline officers. These are pretty shocking statistics, as I am sure Members will agree. Eighty-one per cent. of RSPCA frontline officers think that more abuse is being caught on camera, and nearly half think that more cases are appearing on social media. One in five officers said that one of the main causes of cruelty to animals is people hurting animals to make themselves more popular on social media. Recent cruelty videos posted on social media include a video, posted on Instagram in June 2021, of a magpie being thrown across a road; a video, posted on TikTok in March 2021, of a woman kicking her dog; footage, shared on WhatsApp in May 2021, of a teenager kicking a dog; and videos, posted on Instagram in March 2021, of cockerels being forced to fight.

I am sure that colleagues will be aware of the most recent high-profile case, which was when disturbing footage was posted online of footballer Kurt Zouma attacking his cat. There was, quite rightly, an outpouring of public anger and demands for justice. Footage uploaded to Snapchat on 6 February showed Zouma kicking his Bengal cat across a kitchen floor in front of his seven-year-old son. Zouma also threw a pair of shoes at his pet cat and slapped its head. In another video, he was heard saying:

“I swear I’ll kill it.”

In sentencing him following his guilty plea to two offences under the Animal Welfare Act 2006, District Judge Susan Holdham described the incident as “disgraceful and reprehensible”. She added:

“You must be aware that others look up to you and many young people aspire to emulate you.”

What makes that case even more sad is the way in which the video was filmed and shared, making light of such cruelty. I am pleased that the case has now resulted in tougher penalties for filming animal abuse and posting it on social media, thanks to new guidelines from the Sentencing Council. The prosecutor in the Zouma case, Hazel Stevens, told the court:

“Since this footage was put in the public domain there has been a spate of people hitting cats and posting it on various social media sites.”

There have been many other such instances. Just a few months ago, the most abhorrent trend was occurring on TikTok: people were abusing cats, dogs and other animals to music and encouraging others to do the same. Police officers discovered a shocking 182 videos containing graphic animal cruelty on mobile phones seized during an investigation. This sickening phenomenon is on the rise on social media platforms and is glamorising the behaviour. The videos uncovered during the investigation showed dogs prompted to attack other animals such as cats, or used to hunt badgers, deer, rabbits and birds. Lancashire police began the investigation after someone witnessed two teenagers encouraging a dog to attack a cat on an estate in Burnley in March last year. The cat, a pet named Gatsby, was rushed to the vet by its owners once they discovered what had happened, but it was too late: Gatsby’s injuries were fatal. The photos and videos found on the boys’ phones led the police to discover more teenagers in the area who were involved in such cruel activities. The views and interactions that the graphic footage attracted made it even more visible, because the platform boosted the content and increased its traffic as it received attention.

It should not have taken such a high-profile case of a professional footballer with a viral video to get this action taken. There are countless similar instances occurring day in, day out, and yet the platforms and authorities are not taking the necessary action to protect animals and people from harm, or to protect the young people who seek to emulate this behaviour.

I pay tribute to the hard work of campaigning groups such as the RSPCA, Action for Primates, Asia for Animals Coalition and many more, because they are the ones who have fought to keep animal rights at the forefront. The amendment seeks to ensure that such groups are given a voice at the table when Ofcom consults on its all-important codes of practice. That would be a small step towards reducing animal abuse content online, and I hope the Minister can see the merits in joining the cause.

I turn to amendment 60, which would bring offences to which animals are subject within the definition of illegal content, a point raised by the hon. Member for Ochil and South Perthshire. The Minister will recall the Animal Welfare (Sentencing) Act 2021, which received Royal Assent last year. Labour was pleased to see the Government finally taking action against those who commit animal cruelty offences offline. The maximum prison sentence for animal cruelty was increased from six months to five years, and the Government billed that move as taking a firmer approach to cases such as dog fighting, the abuse of puppies and kittens, the illegal cropping of dogs’ ears and the gross neglect of farm animals. Why, then, have the Government failed to include offences against animals within the scope of illegal content online? We want parity between the online and offline space, and this seems a glaring omission from the Bill.

Placing obligations on service providers to remove animal cruelty content should fall within both the spirit and the scope of the Bill. We all know that the Bill’s purpose is to place duties on service providers to remove illegal and harmful content, with particular emphasis on protecting children from exposure to it. Animal cruelty content depicts illegality and also causes significant harm to children and adults.

If my inbox is anything to go by, all of us here today know how strongly so many of our constituents up and down the country feel about animal abuse. It is one of the topics that constituents contact me about most. Today, the Minister has a choice to make about his Government’s commitment to preventing animal cruelty and keeping us all safe online. I hope he will see the merit in acknowledging the seriousness of animal abuse online.

Amendment 66 would ensure that groups were able to make complaints about animal abuse videos. Labour welcomes clause 140, as the ability to make super-complaints is a vital part of our democracy. However, as my hon. Friend the Member for Worsley and Eccles South and other Members have mentioned, the current definition of an “eligible entity” is far too loose. I have set out the reasons why the Government must go further to limit and prevent animal abuse content online. Amendment 66 would ensure that dangerous animal abuse content is a reasonable cause for a super-complaint to be pursued.

--- Later in debate ---
Kirsty Blackman

I absolutely agree with the points that have been made about the violence against women code of conduct. It is vital, and it would be a really important addition to the Bill. I associate myself with the shadow Minister’s comments, and am happy to stand alongside her.

I want to make a few comments about new clause 20 and some of the issues it raises. The new clause is incredibly important, and we need to take seriously the concerns raised with us by the groups that advocate on behalf of children. They would not raise those concerns if they did not think the Bill was deficient in this area. They do not have spare staff and cannot spend time on unnecessary work, so when they raise concerns, those concerns matter, and addressing them will make a big difference.

I want to go a little further than the new clause and ask the Minister about future-proofing the Bill and ensuring that technologies can be used as they evolve. I am pretty sure that everybody agrees that there should be no space in which it is safe to share child sexual exploitation and abuse, whether a physical space or an online one, private messaging or a more open forum. None of those places should be safe or legal; none should enable that to happen.

My particular thought on future-proofing concerns the development of technologies that can recognise self-generated pictures, videos, livestreams and so on that have not already been categorised, do not have a hash and are not easy for the current technologies to find. There are lots of people out there working hard to stamp out these images and videos online, and I have faith that they are developing new technologies that can recognise images, videos, messages and oral communications that cannot currently be recognised.
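
For readers unfamiliar with the technology being referred to, the distinction is between matching content against a database of hashes of known, already-categorised material and using a trained classifier to recognise novel, self-generated material. The sketch below is a minimal illustration of that two-stage idea; the hash value, threshold and classifier are hypothetical placeholders, not a description of any real platform’s system.

```python
import hashlib

# Hypothetical database of hashes of known, already-categorised material.
# Real systems use perceptual hashes, which survive resizing and
# re-encoding; a cryptographic hash is used here purely for brevity.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def is_known_material(image_bytes: bytes) -> bool:
    """Stage 1: match against previously categorised content."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

def novel_material_score(image_bytes: bytes) -> float:
    """Stage 2 (hypothetical): a trained classifier scores content that
    has no hash on record - the newer technology described above."""
    return 0.0  # placeholder for a machine-learning model

def should_flag(image_bytes: bytes) -> bool:
    # Known material is flagged immediately; novel material is flagged
    # only when the classifier is highly confident.
    return is_known_material(image_bytes) or novel_material_score(image_bytes) > 0.99
```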

I agree wholeheartedly with the new clause: it is important that a report be produced within six months of the Bill being passed. It would be great if the Minister would commit to thinking about whether Ofcom will be able to require companies to implement new technologies that are developed, as well as the technologies that are currently available. I am not just talking about child sexual abuse images, material or videos; I am also talking about private messaging where grooming is happening. That is a separate thing that needs to be scanned for, but it is incredibly important.

Some of the stories relayed by the shadow Minister relate to conversations and grooming that happened before the self-generated material was created. If the companies on whose platforms the direct messaging took place had proactively scanned for grooming behaviour, those young people would potentially have been safer, because the abuse could have been stopped before the self-generated material was ever created. Surely that should be the aim. It is good that we can tackle this after the event—it is good that we have something—but tackling it before it happens would be incredibly important.

Jane Stevenson

Online sexual exploitation is a horrific crime, and we all want to see it ended for good. I have concerns about whether new clause 20 means opening up all messaging: where is the consideration of privacy while the scanning is taking place? Forgive me, I do not know much about the technology available to scan for that content, but I am concerned that responsible users will suffer an infringement of their privacy even when they are doing nothing of concern.

Kirsty Blackman

I do not know whether everybody draws the same distinction as I do. For me, the distinction is that, because it will be done with proactive technology—technological means will be scanning those messages, rather than humans—nobody will see the messages. Software will scan the messages, and should there be anything illegal—should there be child sexual abuse material—that is what will be flagged, and further action will be taken.

Kirsty Blackman

I thank the shadow Minister for her assistance with that intervention, which was incredibly helpful. I do not have concerns that anybody will be able to access that data. The only data that becomes accessible is what the proactive technology identifies as illegal, so nobody sees any of the messages except the artificial intelligence. When the AI recognises that something is abuse material, the Bill specifies that, where it relates to child abuse images, it will go to the National Crime Agency.
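
The flow the hon. Member describes can be pictured roughly as follows. This is a minimal sketch under the assumptions stated in the comments; the function names, the category label and the routing are illustrative, not anything specified in the Bill.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ScanResult:
    flagged: bool
    category: Optional[str] = None  # e.g. "child_sexual_abuse_material"

def automated_scan(message: bytes) -> ScanResult:
    """Stand-in for the proactive technology: no human reads the message
    at this stage. A real system might combine hash matching for known
    material with a classifier for novel material, as sketched earlier."""
    return ScanResult(flagged=False)  # placeholder decision

def report_to_national_crime_agency(result: ScanResult) -> None:
    """Only flagged material ever leaves the automated pipeline."""
    print(f"Report filed: {result.category}")

def handle_message(message: bytes) -> None:
    result = automated_scan(message)
    if not result.flagged:
        return  # unflagged messages are never exposed to anyone
    if result.category == "child_sexual_abuse_material":
        report_to_national_crime_agency(result)
```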

Jane Stevenson

My concern is that, at the point at which the data is sent to the National Crime Agency, it becomes visible to human decision makers. I wonder whether that will stop parents sharing pictures of their babies in the bath. There are instances in which people doing something entirely innocent could be caught up in a situation that is deemed more sinister by AI. However, I will take the advice of the hon. Member for Pontypridd and look into the technology.

Kirsty Blackman

In terms of the secondary processes that kick in after the AI has scanned the data, I assume it will be up to Ofcom and the provider to discuss what happens then. Once the AI identifies something, does it automatically get sent to the National Crime Agency, or does it go through a process of checking to ensure that the AI has identified it correctly? I agree with what the Minister has reiterated on a number of occasions: if it is child sexual abuse material, I have no problem with somebody’s privacy being invaded so that the material can be taken to the relevant authorities and acted on.
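
The open design question here, automatic referral versus a checking step first, can be sketched as below. The human-review function is a hypothetical assumption for illustration; neither arrangement is prescribed by the Bill or the new clause.

```python
def human_review_confirms(flagged_item: bytes) -> bool:
    """Hypothetical secondary check: a trained reviewer verifies the
    AI's identification before anything is referred onward."""
    return True  # placeholder decision

def handle_flagged(flagged_item: bytes, verify_first: bool) -> str:
    """Two possible designs: refer automatically, or refer only after a
    verification step intended to weed out false positives."""
    if not verify_first or human_review_confirms(flagged_item):
        return "referred to the National Crime Agency"
    return "false positive: discarded, no further action"
```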

I want to make one last point. New clause 20 asks for a report on those proactive technologies; it would require Ofcom to set out and justify their use. To give the hon. Member for Wolverhampton North East some reassurance, it is not saying, “This will definitely happen.” I assume that Ofcom will be able to make the case—I am certain it will—but it will have to justify that use in order to require companies to adopt those technologies.

My key point is about future-proofing: ensuring that this is not just a one-off and that, if Ofcom makes a designation about the use of proactive technologies, it can make a re-designation or a future designation should new proactive technologies come through, so that we can require those new technologies to be used to identify things that we cannot identify with the current ones.