Thursday 13th January 2022

Commons Chamber
Dr Luke Evans (Bosworth) (Con)

I come to this report through the prism of my work on body image. The Minister will be pleased to hear that I will not give again the speech that I delivered yesterday, when he was kind enough to join proceedings on my private Member’s Bill, which would require digitally altered body images to carry a logo. Although I would welcome the Government taking on that Bill, I have to play on the Government’s playing field, which has led me to assess this Bill through that prism.

I should congratulate the Government on what they are trying to achieve: a world-leading, world-beating risk assessment across the internet. To achieve that would be no mean feat. I have not heard enough mention of the role that Ofcom will play. Having met Ofcom, I know that it would need the tools and the ability to investigate and to levy very heavy fines and punishments on companies that breach the rules. Ofcom will be the key to holding all this together.

Body image falls on the side of content that is legal but harmful. Clause 46(3) of the draft Bill states:

“Content is within this subsection if the provider of the service has reasonable grounds to believe that the nature of the content is such that there is a material risk of the content having, or indirectly having, a significant adverse physical or psychological impact on an adult of ordinary sensibilities”.

The clause repeats that test in several variations. I am pleased to see that it matches up with the report, but I appreciate that there is a difference of opinion on whether clause 11 should remain. Both pick up on the point that

“Knowingly false communications likely to cause significant physical or psychological harm to a reasonable person”

should be called out. The report goes on to state:

“As with the other safety duties, we recommend that Ofcom be required to issue a mandatory code of practice to service providers on how they should comply with this duty. In doing so they must identify features and processes that facilitate sharing and spread of material in these named areas and set out clear expectations of mitigation and management strategies”.

After reading those points, both in the Bill and the report, I think a gap has been missed. There is no problem with seeing one doctored image; it is the volume of doctored images, the repeated images of shoulders distorted, waists made thinner and breasts made bigger, that has an impact. The same is true of people who are looking for information on dietary requirements. My hon. Friend the Member for Gosport (Dame Caroline Dinenage), who is no longer in her place, hit the nail on the head perfectly. It is about algorithms. That is where I want the Bill to be stronger. In every meeting that I have had with TikTok, Instagram, Facebook or Snapchat, you name it, when I have asked about algorithms, they say, “We can’t tell you more about it because it’s commercially sensitive,” but those algorithms are fundamentally what is driving us down the rabbit holes that the report rightly picks up on. How will we in this House determine what things should look like if we do not understand what is driving people down those holes in the first place? The horse has already left the stable by the time we come to pick up the pieces.

I am pleased that in previous debates the Minister has said that Ofcom will be able to request this information, but I would ask that we go one step further and say that that information could be made public. Why? Because it would undermine the whole model driving these companies’ commercial activity by laying it bare for us all to see. That is key to the transparency that we need. Otherwise, how do we police the volume of images being delivered to our young people, whether they are body images or about self-harm, race hate or race-baiting, or whatever people want to call it, whatever their niche happens to be? As we heard in this debate, social media plays not only on people’s interests but on their insecurities. That is what we have to tighten up on. The Bill and this report, working in conjunction, can really do that. However, I urge that the volume and, most importantly, the algorithms be considered.