Wednesday 26th October 2022

Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.

Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.

This information is provided by Parallel Parliament and does not form part of the official record.

Damian Hinds (East Hampshire) (Con)

I beg to move,

That this House has considered online harms.

It is a great pleasure to see you in the Chair, Mr Dowd. This is the first time I have had the opportunity to serve in Westminster Hall under your chairmanship—[Interruption.] In a debate about technology, this was always going to happen. It is great to see the Minister, my hon. Friend the Member for Folkestone and Hythe (Damian Collins), in his place. He is enormously respected by Members on both sides of the House. He came to this role with more knowledge of his subject than probably any other Minister in the history of Ministers, so he brings a great deal to it.

This is an important and timely debate, given that the Online Safety Bill is returning to the Commons next week. Obviously, a great deal of the debate will be in the House of Lords, so I thought that it was important to have more discussion in the House of Commons. The Online Safety Bill is a landmark and internationally leading Bill. As a number of people, including my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), can attest, it has been a long time in gestation—five years, including two consultations, a Green Paper, a White Paper, a draft Bill, pre-legislative scrutiny, 11 sessions of the Joint Committee on the Draft Online Safety Bill chaired by my hon. Friend the Member for Folkestone and Hythe, and nine days at Committee stage in the Commons. It is complex legislation, but that is because the subject that it addresses is complex.

Some want the Bill to go further, and I have no doubt that on Report and in the Lords there will be many attempts to do that. Others think it already goes too far. The most important message about the Bill is that we need to get on with it.

Technology is a big part of children’s lives—actually, it is a big part of all our lives. The vast majority of it is good. It provides new ways of keeping in touch, and ways of enhancing education for children with special educational needs. Think of all the rows in the car that have been done away with by the sat-nav—at least those rows. My personal favourite is the thing on my phone that says, “The rain will stop in 18 minutes,” so I know when to get my sandwich. Technology changes the way we live our lives. Think about our working lives in this place. Thanks to Tony Blair and new Labour, the pager got all MPs on message and disciplined, and now WhatsApp is having exactly the opposite effect.

In particular, in the Bill and this discussion we are concerned about social media. Again, most of what social media has given us is good, but it has also carried with it much harm. I say “carried with it” because much of that harm has not been created by social media, but has been distributed, facilitated and magnified by it. In the last couple of weeks, we have been reminded of the terrible tragedy of Molly Russell, thanks to the tireless campaigning and immense fortitude of her father, Ian, and her family. The coroner concluded that social media companies and the content pushed to Molly through algorithmic recommendations contributed to her death “in more than a minimal way”.

Dr Luke Evans (Bosworth) (Con)

My right hon. Friend is making an excellent speech, and I entirely agree that the Bill needs to come forward now. The algorithm is the key part to anything that goes on, in terms of dealing with online problems. The biggest problem I have found is trying to get transparency around the algorithm. Does he agree that the Bill should concentrate on exposing the algorithms, even if they are commercially sensitive, and allowing Ofcom to pull on those algorithms so that we do not get into the horrible situation that he has described?

Damian Hinds

I absolutely agree about the centrality of the algorithms and about understanding how they work. We may come on to that later in the debate. That brings me on to my next point. Of course, we should not think of Molly’s tragedy as a single event; there have been other tragedies. There is also a long tail of harm done to young people through an increased prevalence of self-harm, eating disorders, and the contribution to general mental ill-health. All of that has a societal cost, as well as a cost to the individual. That is also a literal cost, in terms of cash, as well as the terrible social cost.

Importantly, this is not only about children. Ages 18 to 21 can be a vulnerable time for some of the issues I have just mentioned. Of course, with domestic abuse, antisemitism, racist abuse, and so on, most of that is perpetrated by—and inflicted on—people well above the age of majority. I found that the importance and breadth of this subject was reflected in my Outlook inbox over the past few days. Whenever a Member’s name is on the Order Paper for a Westminster Hall debate, they get all sorts of briefings from various third parties, but today’s has broken all records. I have heard from everybody, from Lego to the Countryside Alliance.

On that subject, I thank some of the brilliant organisations that work so hard in this area, such as 5Rights, the Children’s Charities Coalition, the National Society for the Prevention of Cruelty to Children, of course, the Carnegie Trust, the City of London Corporation, UK Finance, the Samaritans, Kick It Out, and more.

I should also note the three e-petitions linked to this subject, reflecting the public’s engagement: the e-petition to ban anonymous accounts on social media, which has almost 17,000 signatories; the petition to hold online trolls accountable, with more than 130,000 signatories; and the e-petition for verified ID to be required to open a social media account, with almost 700,000 signatories.

Such is the interest in this subject and the Online Safety Bill, which is about to come back to the Commons, that someone could be forgiven for thinking that it is about to solve all of our problems, but I am afraid that it will not. It is a framework that will evolve, and this will not be the last time that we have to legislate on the subject. Indeed, many of the things that must be done probably cannot be legislated for anyway. Additionally, technology evolves. A decade ago, legislators were not talking about the effect of livestreaming on child abuse. We certainly were not talking about the use of emojis in racist abuse. Today, we are just getting to grips with what the metaverse will be and what it implies. Who knows, in five or 10 years’ time, what the equivalent subjects will be?

Drawing on my most recent ministerial role as Minister of State for Security, I will mention three areas covered in the Online Safety Bill to stress the importance of pressing on with it and getting it passed into law. The first is child abuse, which I have just mentioned. Of course, some child abuse is perpetrated on the internet, but it is more about distribution. Every time that an abusive image of a child is forwarded, that victim is re-victimised. It also creates the demand for further primary abuse. I commend the agencies, the National Crime Agency and CEOP—Child Exploitation and Online Protection Command—and the brilliant organisations, some of which I have mentioned, that work in this area, including the international framework around NCMEC, the National Center for Missing & Exploited Children, in the United States.

However, I am afraid that it is a growth area. That is why we must move quickly. The National Crime Agency estimates that between 550,000 and 850,000 people pose, in varying degrees, a sexual risk to children. Shall I repeat those numbers? Just let them sink in. That is an enormous number of people. With the internet, the accessibility is much greater than ever before. The Internet Watch Foundation notes a growth in sexual abuse content available online, particularly in the category known as “self-generated” imagery.

The second area is fraud, which is now the No. 1 category of crime in this country by volume—and in many other countries. Almost all of it has an online aspect or is entirely online. I commend the Minister, and the various former Ministers in the Chamber, on their work in ensuring that fraud is properly addressed in the Bill. There have been three moves forward in that area, and my hon. Friends the Members for Hexham (Guy Opperman) and for Barrow and Furness (Simon Fell) may speak a bit more about that later. We need to ensure that fraud is in scope, that it becomes a priority offence and, crucially, that advertising for fraud is added to the offences covered.

I hope that, over time, the Government can continue to look at how to sharpen our focus in this area and, in particular, how to line up everybody’s incentives. Right now, the banks have a great incentive to stop fraud because they are liable for the losses. Anybody who has tried to make an online payment recently will know what that does. When people are given a direct financial incentive—a cost—to this thing being perpetrated, they will go to extraordinary lengths to try to stop it happening. If we could get that same focus on people accepting the content or ads that turn out to be fraud, imagine what we could do—my hon. Friend may be about to tell us.

--- Later in debate ---
Dr Luke Evans (Bosworth) (Con)

If it were not intimidating enough to be among the great and the good of the Online Safety Bill, trying to say everything I want to say in three minutes is even more of a challenge.

I will be brief. I came to this issue through body image; that is why I learned what I have on this subject. I simply ask for two things. In his speech, my right hon. Friend the Member for East Hampshire (Damian Hinds) said that this is about frameworks. I have two suggestions that I think would make a huge difference in respect of future-proofing the legislation and providing a framework. The first is to build on the fantastic work of my hon. Friend the Member for Stroud (Siobhan Baillie). We are talking about having authenticated anonymous and non-anonymous accounts. Giving the end user the choice of whether they want to go into the wild west is fundamental.

Now that, through the Content Authenticity Initiative—to which 800 companies around the world have signed up—the technology exists to have an open standard of transparency in respect of how images are taken, from the camera to how they are put in place, we have a framework that runs around the world, which means people can make the same choice about images as about accounts. If we future-proof that in legislation, we simply allow the user to choose to switch on that tool and see images that we know are verified against an open standard. It is not about someone making a decision; it is simply about understanding where the image comes from, how it got there, how it was made and who passed it on. That is an incredibly powerful and incredibly simple way to create a protective framework.

That leads me to my second, possibly more important, point, which was raised by my hon. Friend the Member for Gosport (Dame Caroline Dinenage). Algorithms are king when it comes to social media. Controlling them is very difficult, but someone should be responsible. In schools we have safeguarding leads for dealing with vulnerable people, and in councils we have named finance officers, so why on earth do we not have a named person legally responsible for the algorithm in a company? We have it with GDPR. That would allow anyone in this debate, anyone in the police force, anyone in Ofcom or any member of the public to go to that person and say, “Why is your algorithm behaving in the way it is?” Every time I have tried to do that, I have been told that it is commercially sensitive and that there is a team somewhere else in the world that deals with it.

I know that Ofcom has the power to deal with this issue, but that power takes the form of a one-off notice, exercised only when requested. I simply think that stating that there is a named person legally responsible for the algorithm would change behaviours, because their head would be on the chopping block if they got it wrong. This is about responsibility. That is what the Bill provides, and that is why I am advocating for those two points.