Jane Stevenson debates involving the Department for Digital, Culture, Media & Sport

Online Safety Bill (Seventh sitting)

Barbara Keeley (Worsley and Eccles South) (Lab)

Good morning, Ms Rees.

It is important that users of online services are empowered to report harmful content, so that it can be removed. It is also important for users to have access to complaints procedures when wrong moderation decisions have been made. Reporting and complaint mechanisms are integral to ensuring that users are safe and that free speech is upheld, and we support these provisions in the Bill.

Clauses 17 and 18, and clauses 27 and 28, are two parts of the same process: content reporting by individual users, and the handling of content reported as a complaint. However, it is vital that these clauses create a system that works. That is the key point that Labour Members are trying to make, because the wild west system that we have at the moment does not work.

It is welcome that the Government have proposed a system that goes beyond the users of the platform and introduces a duty on companies. However, companies have previously failed to invest enough money in their complaints systems for the scale at which they are operating in the UK. The duties in the Bill are an important reminder to companies that they are part of a wider society that goes beyond their narrow shareholder interest.

One example of why this change is so necessary, and why Labour Members are broadly supportive of the additional duties, is the awful practice of image abuse. With no access to sites on which their intimate photographs are being circulated, victims of image abuse have very few, if any, routes to having the images removed. Again, the practice of image abuse has increased during the pandemic, including through revenge porn, which the Minister referred to. The Revenge Porn Helpline reported that its case load more than doubled between 2019 and 2020.

These clauses should mean that people can easily report content that they consider to be either illegal, or harmful to children, if it is hosted on a site likely to be accessed by children, or, if it is hosted on a category 1 platform, harmful to adults. However, the Minister needs to clarify how these service complaints systems will be judged and what the performance metrics will be. For instance, how will Ofcom enforce against a complaint?

In many sectors of the economy, even with long-standing systems of regulation, companies can have tens of millions of customers reporting content, but that does not mean that any meaningful action can take place. The hon. Member for Aberdeen North has just told us how often she reports on various platforms, but what action has taken place? Many advocacy groups of people affected by crimes such as revenge porn will want to hear, in clear terms, what will happen to material that has been complained about. I hope the Minister can offer that clarity today.

Transparency in reporting will be vital to analysing trends and emerging types of harm. It is welcome that in schedule 8, which we will come to later, transparency reporting duties apply to the complaints process. It is important that as much information as possible is made public about what is going on in companies’ complaints and reporting systems. As well as the raw number of complaints, reporting should include what is being reported or complained about, as the Joint Committee on the draft Bill recommended last year. Again, what happens to the reported material will be an important metric on which to judge companies.

Finally, I will mention the lack of arrangements for children. We have tabled new clause 3, which has been grouped for discussion with other new clauses at the end of proceedings, but it is relevant to mention it now briefly. The Children’s Commissioner highlighted in her oral evidence to the Committee how children had lost faith in complaints systems. That needs to change. The National Society for the Prevention of Cruelty to Children has also warned that complaints mechanisms are not always appropriate for children and that a very low proportion of children have ever reported content. A child-specific user advocacy body could represent the interests of child users and support Ofcom’s regulatory decisions. That would represent an important strengthening of protections for users, and I hope the Government will support it when the time comes.

Jane Stevenson (Wolverhampton North East) (Con)

I rise briefly to talk about content reporting. I share the frustrations of the hon. Member for Aberdeen North. The way I read the Bill was that it would allow users and affected persons, rather than “or” affected persons, to report content. I hope the Minister can clarify that that means affected persons who might not be users of a platform. That is really important.

Will the Minister also clarify the use of human judgment in these decisions? Many algorithms are not taking down some content at the moment, so I would be grateful if he clarified that there is a need for platforms to provide a genuine human judgment on whether content is harmful.

Kirsty Blackman (Aberdeen North) (SNP)

I want to raise an additional point about content reporting and complaints procedures. I met with representatives of Mencap yesterday, who raised the issue of the accessibility of the procedures that are in place. I appreciate that the Bill talks about procedures being accessible, but will the Minister give us some comfort about Ofcom looking at the reporting procedures that are in place, to ensure that adults with learning disabilities in particular can access those content reporting and complaints procedures, understand them and easily find them on sites?

That is a specific concern that Mencap raised on behalf of its members. A number of its members will be users of sites such as Facebook, but may find it more difficult than others to access and understand the procedures that are in place. I appreciate that, through the Bill, the Minister is making an attempt to ensure that those procedures are accessible, but I want to make sure they are accessible not just for the general public but for children, who may need jargon-free access to content reporting and complaints procedures, and for people with learning disabilities, who may similarly need jargon-free, easy-to-understand and easy-to-find access to those procedures.

--- Later in debate ---
Chris Philp

I will address the review clause now, since it is relevant. If, in due course, as I hope and expect, the Bill has the desired effect, perhaps that would be the moment to consider the case for an ombudsman. The critical step is to take a systemic approach, which the Bill is doing. That engages the question of new clause 1, which would create a mechanism, probably for the reason the hon. Lady just set out, to review how things are going and to see if, in due course, there is a case for an ombudsman, once we see how the Bill unfolds in practice.

Jane Stevenson

Will the Minister give way?

Chris Philp

Let me finish the point. It is not a bad idea to review it and see how it is working in practice. Clause 149 already requires a review to take place between two and four years after Royal Assent. For the reasons that have been set out, it is pretty clear from this debate that we would expect the review to include precisely that question. If we had an ombudsman on day one, before the systems and processes had had a chance to have their effect, I fear that the ombudsman would be overwhelmed with millions of individual issues. The solution lies in fixing the problem systemically.

--- Later in debate ---
Chris Philp

Because we need to give the new systems and processes time to take effect. If the hon. Lady felt so strongly that an ombudsman was required, she was entirely at liberty to table an amendment to introduce one, but she has not done so.

Jane Stevenson

I wonder whether Members would be reassured if companies were required to have a mechanism by which users could register their dissatisfaction, to enable an ombudsman, or perhaps Ofcom, to gauge the volume of dissatisfaction and bring some kind of group claim against the company. Is that a possibility?

Chris Philp

Yes. My hon. Friend hits the nail on the head. If there is a systemic problem and a platform fails to act appropriately not just in one case, but in a number of them, we have, as she has just described, the super-complaints process in clauses 140 to 142. Even under the Bill as drafted, without any changes, if a platform turns out to be systemically ignoring reasonable complaints made by the public and particular groups of users, the super-complainants will be able to do exactly as she describes. There is a mechanism to catch this—it operates not at individual level, but at the level of groups of users, via the super-complaint mechanism—so I honestly feel that the issue has been addressed.

When the numbers are so large, I think that the super-complaint mechanism is the right way to push Ofcom if it does not notice. Obviously, the first line of defence is that companies comply with the Bill. The second line of defence is that if they fail to do so, Ofcom will jump on them. The third line of defence is that if Ofcom somehow does not notice, a super-complaint group—such as the NSPCC, acting for children—will make a super-complaint to Ofcom. We have three lines of defence, and I submit to the Committee that they are entirely appropriate.

Elected Women Representatives: Online Abuse

Tuesday 20th April 2021


Westminster Hall


Jane Stevenson (Wolverhampton North East) (Con) [V]

It is a great pleasure to serve under your chairmanship, Mr Paisley. I thank my right hon. Friend the Member for Basingstoke (Mrs Miller) for securing the debate.

Since my election in December 2019, I have seen horrific abuse of colleagues of both genders, but I have been especially disturbed by the nature of the abuse targeted at my female colleagues. We need to separate the standard political online abuse from the sort of messages that are aimed specifically at women, typically involving threats of sexual violence or insults about their physical appearance, or questioning whether they should be in politics at all because they may want to have children or because of their other family circumstances.

Sadly, it is not just in the UK that female representatives face such abuse. A quick bit of internet research shows the extent of abuse faced by female colleagues around the world. I spent a thoroughly depressing evening reading all about it in numerous articles from Canada, Kenya, Finland, the USA, India, Chile and Japan—every single country that I could think of. A study by the National University of Ireland in Galway last year found that 96% of female politicians in the Republic of Ireland had faced online abuse. Shockingly, 40% of those interviewed reported being threatened with sexual violence as part of that abuse.

Study after study shows that such abuse puts decent, community-minded people off politics. I speak to those people regularly. If I meet a fantastic community activist, I will ask them why they do not think about standing as a councillor or Member of Parliament. The abuse is raised as one of the main reasons that they do not step forward. I have faced abuse. I am sure that we all have our little mantras that we repeat. In order for someone to upset me I have to respect their opinion, which wipes out a lot of the abuse, but we all have bad days when we are exhausted and it is difficult to brush it off. Sadly, many women leave politics, and cite the abuse that they have received as the reason.

Another result of the abuse is that it makes politicians more distant from the people we represent. Many people are leaving social media platforms. Social media should and could be a brilliant way to keep people informed and engaged with their community, but I fear that without further action we will have less engaged politicians. What more can social media providers do? I do not support a blanket ban on anonymity. Online anonymity is sometimes very valuable, if someone is seeking help for a highly sensitive matter or is a victim of domestic abuse.

However, perhaps we need better tailoring of regulations, filters and the ability to block, as my hon. Friend the Member for Stroud (Siobhan Baillie) referred to. That could go with the blue tick. Perhaps people interacting with verified accounts could have a filter button, to turn them on and off. Perhaps lowering the temperature for public figures will improve the internet for all users, because nobody should face the sort of abuse that we all face very regularly.

This debate should send a message from all parties in Parliament to our colleagues around the world and to those involved, and wanting to get involved, in politics. It was heartening that the hon. Member for Swansea East (Carolyn Harris) was recently defended from all corners of the House when faced with a horrific episode of abuse. When all of us call abuse out, especially when it is directed at Opposition colleagues, that sends a powerful message that we will not tolerate it. We can all do our bit to defend our colleagues in every corner of politics.