Guy Opperman debates involving the Department for Digital, Culture, Media & Sport

Online Harms

Guy Opperman Excerpts
Wednesday 26th October 2022


Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.

Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.


Damian Hinds

I absolutely agree about the centrality of the algorithms and about understanding how they work. We may come on to that later in the debate. That brings me on to my next point. Of course, we should not think of Molly’s tragedy as a single event; there have been other tragedies. There is also a long tail of harm done to young people through an increased prevalence of self-harm and eating disorders, and a contribution to general mental ill-health. All of that has a societal cost, as well as a cost to the individual. That cost is also a literal one, in cash, as well as the terrible social cost.

Importantly, this is not only about children. The ages of 18 to 21 can be a vulnerable time for some of the issues I have just mentioned. Of course, with domestic abuse, antisemitism, racist abuse, and so on, most of that is perpetrated by—and inflicted on—people well above the age of majority. I found the importance and breadth of this subject reflected in my Outlook inbox over the past few days. Whenever a Member’s name is on the Order Paper for a Westminster Hall debate, they get all sorts of briefings from various third parties, but today’s debate has broken all records. I have heard from everybody, from Lego to the Countryside Alliance.

On that subject, I thank some of the brilliant organisations that work so hard in this area, such as 5Rights, the Children’s Charities Coalition, the National Society for the Prevention of Cruelty to Children, of course, the Carnegie Trust, the City of London Corporation, UK Finance, the Samaritans, Kick It Out, and more.

I should also note the three e-petitions linked to this subject, reflecting the public’s engagement: the e-petition to ban anonymous accounts on social media, which has almost 17,000 signatories; the petition to hold online trolls accountable, with more than 130,000 signatories; and the e-petition for verified ID to be required to open a social media account, with almost 700,000 signatories.

Such is the interest in this subject and the Online Safety Bill, which is about to come back to the Commons, that someone could be forgiven for thinking that it is about to solve all of our problems, but I am afraid that it will not. It is a framework that will evolve, and this will not be the last time that we have to legislate on the subject. Indeed, many of the things that must be done probably cannot be legislated for anyway. Additionally, technology evolves. A decade ago, legislators were not talking about the effect of livestreaming on child abuse. We certainly were not talking about the use of emojis in racist abuse. Today, we are just getting to grips with what the metaverse will be and what it implies. Who knows, in five or 10 years’ time, what the equivalent subjects will be?

Drawing on my most recent ministerial role as Minister of State for Security, I will mention three areas covered in the Online Safety Bill, to stress the importance of pressing on with it and getting it passed into law. The first is child abuse, which I have just mentioned. Of course, some child abuse is perpetrated on the internet, but the internet’s role is more about distribution. Every time an abusive image of a child is forwarded, that victim is re-victimised. It also creates the demand for further primary abuse. I commend the agencies, the National Crime Agency and CEOP—Child Exploitation and Online Protection Command—and the brilliant organisations, some of which I have mentioned, that work in this area, including the international framework around NCMEC, the National Center for Missing and Exploited Children, in the United States.

However, I am afraid that it is a growth area. That is why we must move quickly. The National Crime Agency estimates that between 550,000 and 850,000 people pose, in varying degrees, a sexual risk to children. Shall I repeat those numbers? Just let them sink in. That is an enormous number of people. With the internet, the accessibility is much greater than ever before. The Internet Watch Foundation notes a growth in sexual abuse content available online, particularly in the category known as “self-generated” imagery.

The second area is fraud, which is now the No. 1 category of crime in this country by volume—and in many other countries. Almost all of it has an online aspect or is entirely online. I commend the Minister, and the various former Ministers in the Chamber, on their work in ensuring that fraud is properly addressed in the Bill. There have been three moves forward in that area, and my hon. Friends the Members for Hexham (Guy Opperman) and for Barrow and Furness (Simon Fell) may speak a bit more about that later. We need to ensure that fraud is in scope, that it becomes a priority offence and, crucially, that advertising for fraud is added to the offences covered.

I hope that, over time, the Government can continue to look at how to sharpen our focus in this area and, in particular, how to line up everybody’s incentives. Right now, the banks have a great incentive to stop fraud because they are liable for the losses. Anybody who has tried to make an online payment recently will know what that does. When people bear a direct financial cost if this thing is perpetrated, they will go to extraordinary lengths to try to stop it happening. If we could get that same focus from those who accept the content or the ads that turn out to be fraudulent, imagine what we could do—my hon. Friend may be about to tell us.

Guy Opperman (Hexham) (Con)

I commend my right hon. Friend for the work that he has done. He knows, because we spoke about this when we were both Ministers, that the key implementation issue once this Bill is law will be fraudulent advertising. I speak as a former Pensions Minister, and every single day up and down this country our pensioners are defrauded of at least £1 million, if not £2 million or £3 million. It is important that there are targeted penalties against online companies, notably Google, but also that there are police forces able to take cases forward. The City of London Police is very good, but its resources are slim at present. Does he agree that those things need to be addressed as the Bill goes forward?

Damian Hinds

I agree. Some of those matters should be addressed in the Bill and some outside it, but my hon. Friend, whom I commend for all his work, particularly on pensions fraud and investment fraud, is absolutely right that as the balance in the types of crime has shifted, the ways we resource ourselves and tool up to deal with them have to reflect that.

Could you give me an indication, Mr Dowd, of how many Members are speaking in this debate after me?

--- Later in debate ---
Guy Opperman (Hexham) (Con)

Another day, another Westminster Hall speech.

When I was the Pensions Minister I saw, sadly, hundreds of our constituents being defrauded of millions of pounds every single day by fake advertisers, primarily on Google, Instagram, Facebook and various other social media providers. The offences that have been added in clauses 34 to 36 of the Online Safety Bill are welcome, but I want an assurance from the Minister that there is provision against unregulated advertisers.

I give the example of Aviva, which gave evidence to the Work and Pensions Committee. It indicated that there were 55 separate fake Aviva sites advertising financial services on Google. Constituents, particularly the elderly and the vulnerable, were being misled by those people and were signing away significant sums of money. I hope the provisions in clause 36 cover that situation, but I would be nervous if the Minister were to rely on consumer protection under the unfair trading regulations and on the actions of the Competition and Markets Authority. I mean no disrespect, but those provisions are pretty ineffective and do not really address these points.

To deal with such issues, the answer is clearly to place a burden of proof on the recipient of the advert, making it vicariously liable for the content it carries on its site. That would have the massive benefit, as identified by my right hon. Friend the Member for East Hampshire (Damian Hinds), of putting the onus on the site to justify its content, and there should be consequential fines that are significant and repeated. It is very important that the Minister works with the new Home Office teams, and that the police forces that are going to take these issues forward are beefed up considerably, because there is simply not enough resource at present to address these issues.

I thank organisations such as The Times, where Matt Dathan has done good work on this issue. A lack of implementation will not be for a lack of money. We should bear in mind that Google, or Alphabet, made $14 billion in profit last quarter. Regulating and following through—doing the work the Bill requires of it, checking advertisers and taking responsibility for the content—is, to put it bluntly, very do-able under all circumstances. I strongly urge the Minister to double-check that unregulated advertisers are covered in clause 36 and that there will be genuine fines and vicarious liability going forward.

--- Later in debate ---
Damian Collins

I am grateful to my hon. Friend. The answer is yes, absolutely. It was always the case with the Bill that illegal content, including fraud, was in scope. The issue with the original draft Bill was that this did not include advertising. Advertising can take the form of display advertising seen on social media platforms; for search services, it can also be boosted search returns. Under the Bill, known frauds and scams that have been identified should not appear in advertising on regulated platforms. That change was recommended by the Joint Committee, and the Government accepted it. It is really important that that is the case, because the company should have a liability; we cannot work just on the basis that the offence has been committed by the person who created the advert and who is running the scam. If an intermediary platform is profiting from someone else’s illegal activity, that should not be allowed. It would be within Ofcom’s regulatory powers to identify whether that is happening and to see that platforms are taking action against it. If not, those companies will be failing in their safety duties, and they will be liable for the very large fines that can be levied for breaching their obligations as set out in the Online Safety Bill—fines of up to 10% of global revenues in any one year. That power will absolutely be there.

Some companies could choose to have systems in place to make it less likely that scam ads would appear on their platforms. Google has a policy under which it works with the Financial Conduct Authority and does not accept financial product advertising from organisations that are not FCA accredited. That has been quite an effective way of filtering out a lot of potential scam ads before they appear. Whether companies have policies such as that or other ways of doing these things, they will have to demonstrate to Ofcom that those approaches are effective. [Interruption.] Does my hon. Friend the Member for Hexham (Guy Opperman) want to come in on that? I can see him poised to spring forward.

Guy Opperman

No, keep going.

Damian Collins

I would like to touch on some of the other issues that have been raised in the debate. The hon. Member for Leeds East (Richard Burgon) and others made the point about smaller, high-risk platforms. All platforms, regardless of size, have to meet the illegal priority harms standard. For the worst offences, they will already have to produce risk assessments and respond to Ofcom’s requests for information. Given that, I would expect that, if Ofcom suspected that serious illegal activity, or other activity causing serious concern, was taking place on a smaller platform, it would have powers to investigate and would probably find that the platform was in breach of those responsibilities. A company that is not a category 1 company is still held to account under the illegal priority harms clauses of the Bill. Those clauses cover a wide range of offences, and it is important—this was an important amendment to the Bill recommended by the Joint Committee—that those offences were written into the Bill so that people can see what they are.

The hon. Member for Pontypridd raised the issue of violence against women and girls, but what I would say is that violence against everyone is included in the Bill. The offences of promoting or inciting violence, harassment, stalking and sending unsolicited sexual images are all included in the Bill. The way the schedule 7 offences work is that the schedule lists existing areas of law. Violence against women and girls is covered by lots of different laws; that is why there is not a single offence for it and why it is not listed. That does not mean that we do not take it seriously. As I said to the hon. Lady when we debated this issue on the first day of Report, we all understand that women and girls are far more likely to be victims of abuse online, and they are therefore the group that should benefit the most from the provisions in the Bill.

The hon. Member for Coventry North West (Taiwo Owatemi) spoke about cyber-bullying. Again, offences relating to harassment are included in the Bill. This is also an important area where the regulator’s job is to ensure that companies enforce their own terms of service. For example, TikTok, which is very popular with younger users, has quite strict policies in place on preventing bullying, abuse and intimidation on its services. But does it enforce them effectively? So far, we have largely relied on the platforms self-declaring whether that is the case; we have never had the ability to really know. Now Ofcom will have that power, and it will be able to challenge companies such as TikTok. I have also raised with TikTok my concern about the prevalence of blackout challenge content, which remains on that platform and which has led to people losing their lives. Could TikTok be more effective at removing more of that content? We will now have the regulatory power to investigate—to get behind the curtain and see what is really going on.