Online Safety Bill (Seventh sitting) Debate
Jane Stevenson (Conservative - Wolverhampton North East)
(2 years, 5 months ago)
Public Bill Committees

Good morning, Ms Rees.
It is important that users of online services are empowered to report harmful content, so that it can be removed. It is also important for users to have access to complaints procedures when wrong moderation decisions have been made. Reporting and complaint mechanisms are integral to ensuring that users are safe and that free speech is upheld, and we support these provisions in the Bill.
Clauses 17 and 18, and clauses 27 and 28, are two parts of the same process: content reporting by individual users, and the handling of content reported as a complaint. However, it is vital that these clauses create a system that works. That is the key point that Labour Members are trying to make, because the wild west system that we have at the moment does not work.
It is welcome that the Government have proposed a system that goes beyond the users of the platform and introduces a duty on companies. However, companies have previously failed to invest enough money in their complaints systems for the scale at which they are operating in the UK. The duties in the Bill are an important reminder to companies that they are part of a wider society that goes beyond their narrow shareholder interest.
One example of why this change is so necessary, and why Labour Members are broadly supportive of the additional duties, is the awful practice of image abuse. With no access to sites on which their intimate photographs are being circulated, victims of image abuse have very few, if any, routes to having the images removed. Again, the practice of image abuse has increased during the pandemic, including through revenge porn, which the Minister referred to. The revenge porn helpline reported that its case load more than doubled between 2019 and 2020.
These clauses should mean that people can easily report content that they consider to be either illegal, or harmful to children, if it is hosted on a site likely to be accessed by children, or, if it is hosted on a category 1 platform, harmful to adults. However, the Minister needs to clarify how these service complaints systems will be judged and what the performance metrics will be. For instance, how will Ofcom enforce against a complaint?
In many sectors of the economy, even with long-standing systems of regulation, companies can have tens of millions of customers reporting content, but that does not mean that any meaningful action will take place. The hon. Member for Aberdeen North has just told us how often she reports content on various platforms, but what action has been taken? Many advocacy groups of people affected by crimes such as revenge porn will want to hear, in clear terms, what will happen to material that has been complained about. I hope the Minister can offer that clarity today.
Transparency in reporting will be vital to analysing trends and emerging types of harm. It is welcome that in schedule 8, which we will come to later, transparency reporting duties apply to the complaints process. It is important that as much information as possible is made public about what is going on in companies’ complaints and reporting systems. As well as the raw number of complaints, reporting should include what is being reported or complained about, as the Joint Committee on the draft Bill recommended last year. Again, what happens to the reported material will be an important metric on which to judge companies.
Finally, I will mention the lack of arrangements for children. We have tabled new clause 3, which has been grouped for discussion with other new clauses at the end of proceedings, but it is relevant to mention it now briefly. The Children’s Commissioner highlighted in her oral evidence to the Committee how children had lost faith in complaints systems. That needs to be changed. The National Society for the Prevention of Cruelty to Children has also warned that complaints mechanisms are not always appropriate for children and that a very low proportion of children have ever reported content. A child specific user advocacy body could represent the interests of child users and support Ofcom’s regulatory decisions. That would represent an important strengthening of protections for users, and I hope the Government will support it when the time comes.
I rise briefly to talk about content reporting. I share the frustrations of the hon. Member for Aberdeen North. The way I read the Bill, it would allow “users and affected persons”, rather than users “or” affected persons, to report content. I hope the Minister can clarify that that means affected persons who might not be users of a platform. That is really important.
Will the Minister also clarify the use of human judgment in these decisions? At the moment, algorithms are failing to take down some harmful content, so I would be grateful if he clarified that platforms will be required to apply genuine human judgment to whether content is harmful.
I want to raise an additional point about content reporting and complaints procedures. I met with representatives of Mencap yesterday, who raised the issue of the accessibility of the procedures that are in place. I appreciate that the Bill talks about procedures being accessible, but will the Minister give us some comfort about Ofcom looking at the reporting procedures that are in place, to ensure that adults with learning disabilities in particular can access those content reporting and complaints procedures, understand them and easily find them on sites?
That is a specific concern that Mencap raised on behalf of its members. A number of its members will be users of sites such as Facebook, but may find it more difficult than others to access and understand the procedures that are in place. I appreciate that, through the Bill, the Minister is making an attempt to ensure that those procedures are accessible, but I want to make sure they are accessible not just for the general public but for children, who may need jargon-free access to content reporting and complaints procedures, and for people with learning disabilities, who may similarly need jargon-free, easy-to-understand and easy-to-find access to those procedures.
I will address the review clause now, since it is relevant. If, in due course, as I hope and expect, the Bill has the desired effect, perhaps that would be the moment to consider the case for an ombudsman. The critical step is to take a systemic approach, which the Bill is doing. That engages the question of new clause 1, which would create a mechanism, probably for the reason the hon. Lady just set out, to review how things are going and to see if, in due course, there is a case for an ombudsman, once we see how the Bill unfolds in practice.
Let me finish the point. It is not a bad idea to review it and see how it is working in practice. Clause 149 already requires a review to take place between two and four years after Royal Assent. For the reasons that have been set out, it is pretty clear from this debate that we would expect the review to include precisely that question. If we had an ombudsman on day one, before the systems and processes had had a chance to have their effect, I fear that the ombudsman would be overwhelmed with millions of individual issues. The solution lies in fixing the problem systemically.
Because we need to give the new systems and processes time to take effect. If the hon. Lady felt so strongly that an ombudsman was required, she was entirely at liberty to table an amendment to introduce one, but she has not done so.
I wonder whether Members would be reassured if companies were required to have a mechanism by which users could register their dissatisfaction, to enable an ombudsman, or perhaps Ofcom, to gauge the volume of dissatisfaction and bring some kind of group claim against the company. Is that a possibility?
Yes. My hon. Friend hits the nail on the head. If there is a systemic problem and a platform fails to act appropriately not just in one case, but in a number of them, we have, as she has just described, the super-complaints process in clauses 140 to 142. Even under the Bill as drafted, without any changes, if a platform turns out to be systemically ignoring reasonable complaints made by the public and particular groups of users, the super-complainants will be able to do exactly as she describes. There is a mechanism to catch this—it operates not at individual level, but at the level of groups of users, via the super-complaint mechanism—so I honestly feel that the issue has been addressed.
When the numbers are so large, I think that the super-complaint mechanism is the right way to push Ofcom if it does not notice. Obviously, the first line of defence is that companies comply with the Bill. The second line of defence is that if they fail to do so, Ofcom will jump on them. The third line of defence is that if Ofcom somehow does not notice, a super-complaint group—such as the NSPCC, acting for children—will make a super-complaint to Ofcom. We have three lines of defence, and I submit to the Committee that they are entirely appropriate.
Public Bill Committees

I completely agree with and welcome the hon. Gentleman’s contribution. It is a very valid point and one that we will explore further. It shows the necessity of this harm being classed as a priority harm in order that we protect animals, as well as people.
David Allen continued:
“We’re very concerned that the use of social media has changed the landscape of abuse with videos of animal cruelty being shared for likes and kudos with this sort of content normalising—and even making light of—animal cruelty. What’s even more worrying is the level of cruelty that can be seen in these videos, particularly as so many young people are being exposed to graphic footage of animals being beaten or killed which they otherwise would never have seen.”
Although the Bill has a clear focus on protecting children, we must remember that the prevalence of cruelty to animals online has the potential to have a hugely negative impact on children who may be inadvertently seeing that content through everyday social media channels.
The hon. Lady knows that I am a great animal lover, and I obviously have concerns about children being exposed to these images. I am just wondering how she would differentiate between abusive images and the images that are there to raise awareness of certain situations that animals are in. I have seen many distressing posts about the Yulin dog meat festival and about beagles being used in laboratory experiments. How would she differentiate between images that are there to raise awareness of the plight of animals and the abusive ones?
I thank the hon. Lady for her contribution. Like me, she is a passionate campaigner for animal welfare. It was a pleasure to serve on the Committee that considered her Glue Traps (Offences) Act 2022, which I know the whole House was pleased to pass. She raises a very important point, and one that the Bill explores later with regard to other types of content, such as antisemitic and racist content, where a distinction is drawn between material shared for education and historical record and material shared as abuse. This content would be dealt with in the same way: we are talking about where content is used as an educational and awareness-raising tool, compared with just images and videos of direct abuse.
To give hon. Members a real sense of the extent of the issue, I would like to share some findings from a recent survey of the RSPCA’s frontline officers. These are pretty shocking statistics, as I am sure Members will all agree. Eighty-one per cent. of RSPCA frontline officers think that more abuse is being caught on camera. Nearly half think that more cases are appearing on social media. One in five officers said that one of the main causes of cruelty to animals is people hurting animals just to make themselves more popular on social media. Some of the recent cruelty videos posted on social media include a video of a magpie being thrown across the road on Instagram in June 2021; a woman captured kicking her dog on TikTok in March 2021; a teenager being filmed kicking a dog, which was shared on WhatsApp in May 2021; and videos posted on Instagram of cockerels being forced to fight in March 2021.
I am sure that colleagues will be aware of the most recent high-profile case, which was when disturbing footage was posted online of footballer Kurt Zouma attacking his cat. There was, quite rightly, an outpouring of public anger and demands for justice. Footage uploaded to Snapchat on 6 February showed Zouma kicking his Bengal cat across a kitchen floor in front of his seven-year-old son. Zouma also threw a pair of shoes at his pet cat and slapped its head. In another video, he was heard saying:
“I swear I’ll kill it.”
In sentencing him following his guilty plea to two offences under the Animal Welfare Act 2006, district judge Susan Holdham described the incident as “disgraceful and reprehensible”. She added:
“You must be aware that others look up to you and many young people aspire to emulate you.”
What makes that case even more sad is the way in which the video was filmed and shared, making light of such cruelty. I am pleased that the case has now resulted in tougher penalties for filming animal abuse and posting it on social media, thanks to new guidelines from the Sentencing Council. The prosecutor in the Zouma case, Hazel Stevens, told the court:
“Since this footage was put in the public domain there has been a spate of people hitting cats and posting it on various social media sites.”
There have been many other such instances. Just a few months ago, the most abhorrent trend was occurring on TikTok: people were abusing cats, dogs and other animals to music and encouraging others to do the same. Police officers discovered a shocking 182 videos with graphic animal cruelty on mobile phones seized during an investigation. This sickening phenomenon is on the rise on social media platforms, provoking a glamorisation of the behaviour. The videos uncovered during the investigation showed dogs prompted to attack other animals such as cats, or used to hunt badgers, deer, rabbits and birds. Lancashire police began the investigation after someone witnessed two teenagers encouraging a dog to attack a cat on an estate in Burnley in March of last year. The cat, a pet named Gatsby, was rushed to the vet by its owners once they discovered what was going on, but unfortunately it was too late and Gatsby’s injuries were fatal. The photos and videos found on the boys’ phones led the police to discover more teenagers in the area who were involved in such cruel activities. The views and interactions that the graphic footage was attracting made it even more visible, as the platform was increasing traffic and boosting content when it received attention.
It should not have taken such a high-profile case of a professional footballer with a viral video to get this action taken. There are countless similar instances occurring day in, day out, and yet the platforms and authorities are not taking the necessary action to protect animals and people from harm, or to protect the young people who seek to emulate this behaviour.
I pay tribute to the hard work of campaigning groups such as the RSPCA, Action for Primates, Asia for Animals Coalition and many more, because they are the ones who have fought to keep animal rights at the forefront. The amendment seeks to ensure that such groups are given a voice at the table when Ofcom consults on its all-important codes of practice. That would be a small step towards reducing animal abuse content online, and I hope the Minister can see the merits in joining the cause.
I turn to amendment 60, which would bring offences to which animals are subject within the definition of illegal content, a point raised by the hon. Member for Ochil and South Perthshire. The Minister will recall the Animal Welfare (Sentencing) Act 2021, which received Royal Assent last year. Labour was pleased to see the Government finally taking action against those who commit animal cruelty offences offline. The maximum prison sentence for animal cruelty was increased from six months to five years, and the Government billed that move as them taking a firmer approach to cases such as dog fighting, abuse of puppies and kittens, illegally cropping a dog’s ears and gross neglect of farm animals. Why, then, have the Government failed to include offences against animals within the scope of illegal content online? We want parity between the online and offline space, and that seems like a glaring omission from the Bill.
Placing obligations on service providers to remove animal cruelty content should fall within both the spirit and the scope of the Bill. We all know that the scope of the Bill is to place duties on service providers to remove illegal and harmful content, placing particular emphasis on the exposure of children. Animal cruelty content is a depiction of illegality and also causes significant harm to children and adults.
If my inbox is anything to go by, all of us here today know what so many of our constituents up and down the country feel about animal abuse. It is one of the most popular topics that constituents contact me about. Today, the Minister has a choice to make about his Government’s commitment to preventing animal cruelty and keeping us all safe online. I hope he will see the merit in acknowledging the seriousness of animal abuse online.
Amendment 66 would ensure that groups were able to make complaints about animal abuse videos. Labour welcomes clause 140, as the ability to make super-complaints is a vital part of our democracy. However, as my hon. Friend the Member for Worsley and Eccles South and other Members have mentioned, the current definition of an “eligible entity” is far too loose. I have set out the reasons as to why the Government must go further to limit and prevent animal abuse content online. Amendment 66 would ensure that dangerous animal abuse content is a reasonable cause for a super-complaint to be pursued.
I absolutely agree with the points that have been made about the violence against women code of conduct. It is vital, and it would be a really important addition to the Bill. I associate myself with the shadow Minister’s comments, and am happy to stand alongside her.
I want to make a few comments about new clause 20 and some of the issues it raises. The new clause is incredibly important, and we need to take seriously the concerns that have been raised with us by the groups that advocate on behalf of children. They would not raise those concerns if they did not think the Bill was deficient in this area. They do not have spare people and cannot spend lots of time doing unnecessary things, so if they are raising concerns, those are very important things that will make a big difference.
I want to go a little further than what the new clause says and ask the Minister about future-proofing the Bill and ensuring that technologies can be used as they evolve. I am pretty sure that everybody agrees that there should be no space where it is safe to share child sexual exploitation and abuse, whether physical space or online space, private messaging or a more open forum. None of those places should be safe or legal. None should enable that to happen.
My particular thought about future-proofing is about the development of technologies that are able to recognise self-generated pictures, videos, livestreams and so on that have not already been categorised, do not have a hash number and are not easy for the current technologies to find. There are lots of people out there working hard to stamp out these images and videos online, and I have faith that they are developing new technologies that are able to recognise images, videos, messages and oral communications that cannot currently be recognised.
I agree wholeheartedly with the new clause: it is important that a report be produced within six months of the Bill being passed. It would be great if the Minister would commit to thinking about whether Ofcom will be able to require companies to implement new technologies that are developed, as well as the technologies that are currently available. I am not just talking about child sexual abuse images, material or videos; I am also talking about private messaging where grooming is happening. That is a separate thing that needs to be scanned for, but it is incredibly important.
Some of the stories relayed by the shadow Minister relate to conversations and grooming that happened in advance of the self-generated material being created. If there had been a proactive action to scan for grooming behaviour by those companies whose platforms the direct messaging was taking place on, then those young people would potentially have been in a safer place, because it could have been stopped in advance of that self-generated material being created. Surely, that should be the aim. It is good that we can tackle this after the event—it is good that we have something—but tackling it before it happens would be incredibly important.
Online sexual exploitation is a horrific crime, and we all want to see it ended for good. I have concerns about whether new clause 20 is saying we should open up all messaging—where is the consideration of privacy when the scanning is taking place? Forgive me, I do not know much about the technology that is available to scan for that content. I do have concerns that responsible users will have an infringement of privacy, even when doing nothing of concern.
I do not know whether everybody draws the same distinction as me. For me the distinction is that, because it will be happening with proactive technology—technological means will be scanning those messages rather than humans—nobody will see the messages. Software will scan messages, and should there be anything that is illegal—should there be child sexual abuse material—that is what will be flagged and further action taken.
I thank the shadow Minister for her assistance with that intervention, which was incredibly helpful. I do not have concerns that anybody will be able to access that data. The only data that will be accessible is when the proactive technology identifies something that is illegal, so nobody can see any of the messages except for the artificial intelligence. When the AI recognises that something is abuse material, at that point the Bill specifies that it will go to the National Crime Agency if it is in relation to child abuse images.
My concern is that, at the point at which the data is sent to the National Crime Agency, it will be visible to human decision makers. Will that stop parents sharing pictures of their babies in the bath? There are instances where people could get caught up in a very innocent situation that is deemed to be something more sinister by AI. However, I will take the advice of the hon. Member for Pontypridd and look into the technology.
In terms of the secondary processes that kick in after the AI has scanned the data, I assume it will be up to Ofcom and the provider to discuss what happens then. Once the AI identifies something, does it automatically get sent to the National Crime Agency, or does it go through a process of checking to ensure the AI has correctly identified something? I agree with what the Minister has reiterated on a number of occasions; if it is child sexual abuse material then I have no problem with somebody’s privacy being invaded in order for that to be taken to the relevant authorities and acted on.
I want to make one last point. The wording of new clause 20 is about a report on those proactive technologies. It is about requiring Ofcom to come up with and justify the use of those proactive technologies. To give the hon. Member for Wolverhampton North East some reassurance, it is not saying, “This will definitely happen.” I assume that Ofcom will be able to make the case—I am certain it will be able to—but it will have to justify it in order to be able to require those companies to undertake that use.
My key point is about the future-proofing of this, ensuring that it is not just a one-off, and that, if Ofcom makes a designation about the use of proactive technologies, it is able to make a re-designation or future designation, should new proactive technologies come through, so that we can require those new proactive technologies to be used to identify things that we cannot identify with the current proactive technologies.