Wednesday 26th October 2022

Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.

Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.

This information is provided by Parallel Parliament and does not comprise part of the official record.

Damian Hinds (East Hampshire) (Con)

I beg to move,

That this House has considered online harms.

It is a great pleasure to see you in the Chair, Mr Dowd. This is the first time I have had the opportunity to serve in Westminster Hall under your chairmanship—[Interruption.] In a debate about technology, this was always going to happen. It is great to see the Minister, my hon. Friend the Member for Folkestone and Hythe (Damian Collins), in his place. He is enormously respected by Members on both sides of the House. He came to this role with more knowledge of his subject than probably any other Minister in the history of Ministers, so he brings a great deal to it.

This is an important and timely debate, given that the Online Safety Bill is returning to the Commons next week. Obviously, a great deal of the debate will be in the House of Lords, so I thought that it was important to have more discussion in the House of Commons. The Online Safety Bill is a landmark and internationally leading Bill. As a number of people, including my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), can attest, it has been a long time in gestation—five years, including two consultations, a Green Paper, a White Paper, a draft Bill, prelegislative scrutiny, 11 sessions of the Joint Committee on the Draft Online Safety Bill chaired by my hon. Friend the Member for Folkestone and Hythe, and nine days at Committee stage in the Commons. It is complex legislation, but that is because the subject that it addresses is complex.

Some want the Bill to go further, and I have no doubt that on Report and in the Lords there will be many attempts to do that. Others think it already goes too far. The most important message about the Bill is that we need to get on with it.

Technology is a big part of children’s lives—actually, it is a big part of all our lives. The vast majority of it is good. It provides new ways of keeping in touch, and ways of enhancing education for children with special educational needs. Think of all the rows in the car that have been done away with by the sat-nav—at least those rows. My personal favourite is the thing on my phone that says, “The rain will stop in 18 minutes,” so I know when to get my sandwich. Technology changes the way we live our lives. Think about our working lives in this place. Thanks to Tony Blair and new Labour, the pager got all MPs on message and disciplined, and now WhatsApp is having exactly the opposite effect.

In particular, in the Bill and this discussion we are concerned about social media. Again, most of what social media has given us is good, but it has also carried with it much harm. I say “carried with it” because much of that harm has not been created by social media, but has been distributed, facilitated and magnified by it. In the last couple of weeks, we have been reminded of the terrible tragedy of Molly Russell, thanks to the tireless campaigning and immense fortitude of her father, Ian, and her family. The coroner concluded that social media companies and the content pushed to Molly through algorithmic recommendations contributed to her death

“in more than a minimal way”.

Dr Luke Evans (Bosworth) (Con)

My right hon. Friend is making an excellent speech, and I entirely agree that the Bill needs to come forward now. The algorithm is the key part of anything that goes on in terms of dealing with online problems. The biggest problem I have found is trying to get transparency around the algorithm. Does he agree that the Bill should concentrate on exposing the algorithms, even if they are commercially sensitive, and allowing Ofcom to pull on those algorithms so that we do not get into the horrible situation that he has described?

Damian Hinds

I absolutely agree about the centrality of the algorithms and about understanding how they work. We may come on to that later in the debate. That brings me on to my next point. Of course, we should not think of Molly’s tragedy as a single event; there have been other tragedies. There is also a long tail of harm done to young people through an increased prevalence of self-harm, eating disorders, and the contribution to general mental ill-health. All of that has a societal cost, as well as a cost to the individual. That is also a literal cost, in terms of cash, as well as the terrible social cost.

Importantly, this is not only about children. The ages of 18 to 21 can be a vulnerable time for some of the issues I have just mentioned. Of course, with domestic abuse, antisemitism, racist abuse, and so on, most of that is perpetrated by—and inflicted on—people well above the age of majority. I found that the importance and breadth of this subject were reflected in my Outlook inbox over the past few days. Whenever a Member’s name is on the Order Paper for a Westminster Hall debate, they get all sorts of briefings from various third parties, but today’s has broken all records. I have heard from everybody, from Lego to the Countryside Alliance.

On that subject, I thank some of the brilliant organisations that work so hard in this area, such as 5Rights, the Children’s Charities Coalition, the National Society for the Prevention of Cruelty to Children, of course, the Carnegie Trust, the City of London Corporation, UK Finance, the Samaritans, Kick It Out, and more.

I should also note the three e-petitions linked to this subject, reflecting the public’s engagement: the e-petition to ban anonymous accounts on social media, which has almost 17,000 signatories; the petition to hold online trolls accountable, with more than 130,000 signatories; and the e-petition for verified ID to be required to open a social media account, with almost 700,000 signatories.

Such is the interest in this subject and the Online Safety Bill, which is about to come back to the Commons, that someone could be forgiven for thinking that it is about to solve all of our problems, but I am afraid that it will not. It is a framework that will evolve, and this will not be the last time that we have to legislate on the subject. Indeed, many of the things that must be done probably cannot be legislated for anyway. Additionally, technology evolves. A decade ago, legislators were not talking about the effect of livestreaming on child abuse. We certainly were not talking about the use of emojis in racist abuse. Today, we are just getting to grips with what the metaverse will be and what it implies. Who knows, in five or 10 years’ time, what the equivalent subjects will be?

From my most recent ministerial role as Minister of State for Security, there are three areas covered in the Online Safety Bill that I will mention to stress the importance of pressing on with it and getting it passed into law. The first is child abuse, which I have just mentioned. Of course, some child abuse is perpetrated on the internet, but it is more about distribution. Every time that an abusive image of a child is forwarded, that victim is re-victimised. It also creates the demand for further primary abuse. I commend the agencies, the National Crime Agency and CEOP—Child Exploitation and Online Protection Command—and the brilliant organisations, some of which I have mentioned, that work in this area, including the international framework around NCMEC, the National Centre for Missing and Exploited Children, in the United States.

However, I am afraid that it is a growth area. That is why we must move quickly. The National Crime Agency estimates that between 550,000 and 850,000 people pose, in varying degrees, a sexual risk to children. Shall I repeat those numbers? Just let them sink in. That is an enormous number of people. With the internet, the accessibility is much greater than ever before. The Internet Watch Foundation notes a growth in sexual abuse content available online, particularly in the category known as “self-generated” imagery.

The second area is fraud, which is now the No. 1 category of crime in this country by volume—and in many other countries. Almost all of it has an online aspect or is entirely online. I commend the Minister, and the various former Ministers in the Chamber, on their work in ensuring that fraud is properly addressed in the Bill. There have been three moves forward in that area, and my hon. Friends the Members for Hexham (Guy Opperman) and for Barrow and Furness (Simon Fell) may speak a bit more about that later. We need to ensure that fraud is in scope, that it becomes a priority offence and, crucially, that advertising for fraud is added to the offences covered.

I hope that, over time, the Government can continue to look at how to sharpen our focus in this area and, in particular, how to line up everybody’s incentives. Right now, the banks have a great incentive to stop fraud because they are liable for the losses. Anybody who has tried to make an online payment recently will know what that does. When people bear a direct financial cost for this crime being perpetrated, they will go to extraordinary lengths to try to stop it happening. If we could get that same focus from those who accept the content or adverts that turn out to be fraudulent, imagine what we could do—my hon. Friend may be about to tell us.

Guy Opperman (Hexham) (Con)

I commend my right hon. Friend for the work that he has done. He knows, because we spoke about this when we were both Ministers, that the key implementation issue once this Bill is law will be fraudulent advertising. I speak as a former Pensions Minister, and every single day up and down this country our pensioners are defrauded of at least £1 million, if not £2 million or £3 million. It is important that there are targeted penalties against online companies, notably Google, but also that there are police forces to take cases forward. The City of London Police is very good, but its resources are slim at present. Does he agree that those things need to be addressed as the Bill goes forward?

Damian Hinds

I agree. Some of those matters should be addressed in the Bill and some outside it, but my hon. Friend, whom I commend for all his work, particularly on pensions fraud and investment fraud, is absolutely right that as the balance in the types of crimes has shifted, the ways we resource ourselves and tool up to deal with them have to reflect that.

Could you give me an indication, Mr Dowd, of how many Members are speaking in this debate after me?

Damian Hinds

I shall accelerate in that case. The third area I want to mention, from my previous role as Security Minister, is disinformation. I welcome what is called the bridge that has been built between the Online Safety Bill and the National Security Bill to deal specifically with state-sponsored disinformation, which has become a tool of war. That probably does not surprise anybody, but I am afraid that, for states with a hostile intention, it can become, and is, a tool in peacetime. Quite often, it is not necessarily even about spreading untruths—believe it or not—but just about trying to wind people up and make them dislike one another more in an election, referendum or whatever it may be. This is important work.

Health disinformation, which we were exercised about during the coronavirus pandemic, is slated to be on the list of so-called legal but harmful harms, so the Bill would also deal with that. That brings me to my central point about the hardest part of this Bill: the so-called legal but harmful harms. I suggest that we actually call them “harmful but legal”, because that better captures their essence, as our constituents would understand it. It is a natural reaction when hearing about the Online Safety Bill, which will deal with stuff that is legal, to say, “Well, why is there a proposed law going through the British Parliament that tries to deal with things that are, and will stay, legal? We have laws to give extra protection to children, but adults should be able to make their own choices. If you start to interfere with that, you risk fundamental liberties, including freedom of speech.” I agree with that natural reaction, but I suggest that we have to consider a couple of additional factors.

First, there is no hard line between adults and children in this context. There is not a 100%—or, frankly, even 50%—reliable way of being able to tell who is using the internet and whether they are above or below age 18. I know that my hon. Friend the Member for Gosport (Dame Caroline Dinenage), among others, has been round the loop many times looking at age verification and so-called age assurance. It is very difficult. That is why a piece of Ofcom research that came out a couple of weeks ago found that 32%—a third—of eight to 17-year-old social media users appear to be over 18. Why is that? Because it is commonplace for someone to sign up to TikTok or Snapchat, with their minimum age of 13, when they are 10. They must give an age of at least 13 to be let in, so let us say they give their age as 14; that means that when they are actually 14, the platform thinks they are 18—and so it carries on, all the way through life.

Liz Twist (Blaydon) (Lab)

The right hon. Member and many other Members present will know that leading suicide prevention charities, including Samaritans and the Mental Health Foundation, are calling on the Government to ensure that the Online Safety Bill protects people of all ages from all extremely dangerous suicide and self-harm content. The right hon. Member makes very good points about age and on the legal but harmful issue. I hope very much that the Government will look at this again to protect more people from that dangerous content.

Damian Hinds

I thank the hon. Lady; I think her point stands on its own.

The second additional factor I want to put forward, which may sound odd, is that in this context there is not a hard line between what is legal and what is not. I mentioned emoji abuse. I am not a lawyer, still less a drafter of parliamentary legislation—there are those here who are—but I suggest it will be very hard to legislate for what constitutes racist emoji abuse. Take something such as extremism. Extremist material has always been available; it is just that it used to be available on photocopied or carbon-copied sheets of paper. It was probably necessary to go to some draughty hall somewhere or some backstreet bookshop in a remote part of London to access it, and very few people did. The difference now is that the same material is available to everyone if they go looking for it; sometimes it might come to them even if they do not go looking for it. I think the context here is different.

This debate—not the debate we are having today, but the broader debate—is sometimes conducted in terms that are just too theoretical. People sometimes have the impression that we will allow these companies to arbitrarily take down stuff that is legal but that they just do not like—stuff that does not fit with their view of the world or their politics. On the contrary, the way the Bill has been drafted means that it will require consistency of approach and protect free speech.

I am close to the end of my speech, but let us pause for a moment to consider the sorts of things we are talking about. My right hon. Friend the Member for Mid Bedfordshire (Ms Dorries) made a written ministerial statement setting out an indicative list of the priority harms for adults. They are abuse and harassment—not mere disagreement, but abuse and harassment; the circulation of real or manufactured intimate images without the subject’s consent; material that promotes self-harm; material that promotes eating disorders; legal suicide content; and harmful health content that is demonstrably false, such as urging people to drink bleach to cure cancer.

I suggest that when people talk about free speech, they do not usually mean those kinds of things; they normally mean expressing a view or being robust in argument. We have the most oppositional, confrontational parliamentary democracy in the world, and we are proud of our ability to do better, to make better law and to hold people to account through that process, but that is not the same thing as what we are talking about here. Moreover, there is a misconception that the Bill would ban those things; in fact, the Bill states only that a service must have a policy about how it deals with them. A helpful Government amendment makes it clear that that policy could be, “Well, we’re not dealing with it at all. We are allowing content on these things.”

There are also empowerment tools—my hon. Friend the Member for Stroud (Siobhan Baillie) may say more about that later in relation to anonymity—but we want users to be in control. If there is this contractual relationship, where it is clearly set out what is allowed in this space and someone signs up to it, I suggest that enhances their freedoms as well as their rights.

I recognise that there are concerns, and it is right to consider them. It may be that the Bill can be tightened to reassure everybody, while keeping these important protections. That might be around the non-priority areas, which perhaps people consider to be too broad. There might also be value in putting the list of priority harms in the Bill, so that people are not concerned that this could balloon.

As I said at the start, the Minister, my hon. Friend the Member for Folkestone and Hythe, knows more about this than probably any other living human being. He literally works tirelessly on it and is immensely motivated, for all the right reasons. I have probably not said anything in the past 10 minutes that he did not already know. I know it is a difficult job to square the circle and consider these tensions.

My main message to the Minister and the Government is, with all the work that he and others have done, please let us get on with it. Let us get the Bill into law as soon as possible. If changes need to be made to reassure people, then absolutely let us make them, but most of all, let us keep up the momentum.

Peter Dowd (in the Chair)

I would not have dreamed of interfering in your largesse, but I am pleased that you interfered in your own. Thank you very much.

--- Later in debate ---
Simon Fell (Barrow and Furness) (Con)

It is an honour to serve under your chairmanship, Mr Dowd. I thank my right hon. Friend the Member for East Hampshire (Damian Hinds) for securing this debate. It is a hackneyed phrase, but the Online Safety Bill is important and genuinely groundbreaking. There will always be a balance to strike between allowing free speech and stopping harms. I think we are on the right side of that balance, but we may need to come back to it later, because it is crucial.

I want to cover two topics in a short amount of time. The first is online harms through social media platforms, touching on the legal but harmful issue and on small, high-harm platforms, and the second is fraud. Starting with fraud, I declare an interest, having spent a decade in that world before I came here.

Damian Hinds

The world of law enforcement.

Simon Fell

I thank my right hon. Friend for clarifying that for me—although I would be better off now had I been on the other side of the fence.

Fraud is at epidemic levels. Which? research recently found that six in 10 people who have been victims of fraud suffered significant mental health harms as a result. I use this example repeatedly in this place. In my past life I met, through a safeguarding group, an old lady who accessed the world through her landline telephone. She was scammed out of £20,000 or so through that phone, and she was then disconnected from the rest of the world because she simply could not trust that phone when it rang any more.

We live in an increasingly interconnected world in which we are pushing our services online. As we do that, we cannot afford to disconnect people from the online world and take away from them the services we are opening up to them. That is why protections against fraud and fraudulent adverts on some of the larger social platforms and search engines are vital. I know it is out of the scope of this debate but, on the point made by my hon. Friend the Member for Hexham (Guy Opperman), that is also why it is crucial to fund the law enforcement agencies that go after the people responsible.

My right hon. Friend the Member for East Hampshire is right: banks have a financial motivation to act on fraud. They are losing money. They have the incentive. Where that motivation is not there, and where there is a disincentive for organisations to act, as is especially the case with internet advertising, we have to move forward with the legislation and remove those disincentives.

On harms, my right hon. Friend the Member for East Hampshire is right to mention the harmful but legal. We have to act on this stuff and we have to do it quickly. We cannot shy away from the problems that currently exist online. I serve on the Home Affairs Committee, and we have seen and examined the online hate directed at footballers; the platforms are not acting on it, despite it being pointed out to them.

When it comes to disinformation and small, high-harm platforms—