Thursday 13th January 2022


Commons Chamber
Damian Collins

The hon. Gentleman raises an important issue. The Committee agreed in the report that there must be an expedited process of transparency, so that when people are using anonymity to abuse other people—saying things for which in public they might be sued or have action taken against them—it must be much easier to swiftly identify who those people are. People must know that if they post hate online directed at other people and commit an offence in doing so, their anonymity will not be a shield that will protect them: they will be identified readily and action taken against them. Of course there are cases where anonymity may be required, when people are speaking out against an oppressive regime or victims of abuse are telling their story, but it should not be used as a shield to abuse others. We set that out in the report and the hon. Gentleman is right that the Bill needs to move on it.

We are not just asking the companies to moderate content; we are asking them to moderate their systems as well. Their systems play an active role in directing people towards hate and abuse. A study commissioned by Facebook showed that over 60% of people who joined groups that showed extremist content did so at the active recommendation of the platform itself. In her evidence to the Committee, Facebook whistleblower Frances Haugen made clear the active role of systems in promoting and driving content through to people, making them the target of abuse, and making vulnerable people more likely to be confronted with and directed towards content that will exacerbate their vulnerabilities.

Facebook and companies like it may not have invented hate but they are driving hate and making it worse. They must be responsible for these systems. It is right that the Bill will allow the regulator to hold those companies to account not just for what they do or do not take down, but for the way they use the systems that they have created and designed to make money for themselves by keeping people on them longer, such that they are responsible for them. The key thing at the heart of the Bill and at the heart of the report published by the Joint Committee is that the companies must be held liable for the systems they have created. The Committee recommended a structural change to the Bill to make it absolutely clear that what is illegal offline should be regulated online. Existing offences in law should be written into the Bill and it should be demonstrated how the regulator will set the thresholds for enforcement of those measures online.

This approach has been made possible because of the work of the Law Commission in producing its recommendations, particularly in introducing new offences around actively promoting self-harm and promoting content and information that is known to be false. A new measure will give us the mechanism to deal with malicious deepfake films being targeted at people. There are also necessary measures to make sure that there are guiding principles that the regulator has to work to, and the companies have to work to, to ensure regard to public health in dealing with dangerous disinformation relating to the pandemic or other public health issues.

We also have to ensure an obligation for the regulator to uphold principles of freedom of expression. It is important that effective action should be taken against hate speech, extremism, illegal content and all harmful content that is within the scope of the Bill, but if companies are removing content that has every right to be there—where the positive expression of people’s opinions has every right to be online—then the regulator should have the power to intervene in that direction as well.

At the heart of the regime has to be a system where Ofcom, as the independent regulator, can set mandatory codes and standards that we expect the companies to meet, and then use its powers to investigate and audit them to make sure that they are complying. We cannot have a system that is based on self-declared transparency reports by the companies where even they themselves struggle to explain what the results mean and there is no mechanism for understanding whether they are giving us the full picture or only a highly partial one. The regulator must have that power. Crucially, the codes of practice should set the mandatory minimum standards. We should not have Silicon Valley deciding what the online safety of citizens in this country should be. That should be determined through legislation passed through this Parliament empowering the regulator to set the minimum standards and take enforcement action when they have not been met.

We also believe that the Bill would be improved by removing a controversial area, the principles in clause 11. The priority areas of harm are determined by the Secretary of State and advisory to the companies. If we base the regulatory regime and the codes of practice on established offences that this Parliament has already created, which are known and understood and therefore enforced, we can say they are mandatory and clear and that there has been a parliamentary approval process in creating the offences in the first place.

Where new areas of harm are added to the schedules and the codes of practice, there should be an affirmative procedure in both Houses of Parliament to approve those changes to the code, so that Members have the chance to vote on changes to the codes of practice and the introduction of new offences as a consequence of those offences being created.

The Committee took a lot of evidence on the question of online fraud and scams. We received evidence from the Work and Pensions Committee and the Treasury Committee advising us that this should be done: if a known scam or attempt to rip off and defraud people is present on a website or social media platform, be it through advertising or any kind of posting, it should be within the scope of the Bill and it should be for the regulator to require its removal. There should not be a general blanket exemption for advertising, which would create a perverse incentive to promote such content more actively.

Kevin Hollinrake (Thirsk and Malton) (Con)

I thank my hon. Friend for his work on this important issue. Does he agree, as referred to in the report, that platforms must be required to proactively seek out that content and ensure it is changed, and if not, remove it, rather than all removals being prompted by users?

Damian Collins

It is vital that companies are made to act proactively. That is one of the problems with the current regime, where action against illegal content is only required once it is reported to the companies and they are not proactively identifying it. My hon. Friend is right about that, particularly with frauds and scams where the perpetrators are known. The role of the regulator is to ensure that companies do not run those ads. The advertising authorities can still take action against individual advertisers, as can the police, but there should be a proactive responsibility on the platforms themselves.

If you will allow me to say one or two more things, Madam Deputy Speaker, we believe it is important that there should be user redress through the system. That is why the Committee recommended creating an ombudsman if complaints have been exhausted without successful resolution, but also permitting civil redress through the courts.

If an individual or their family has been greatly harmed as a consequence of what they have seen on social media, they may take some solace in the fact that the regulator has intervened against the company for its failures and levied fines or taken action against individual directors. However, as an individual can take a case to the courts for a company’s failure to meet its obligations under data protection law, that should also apply to online safety legislation. An individual should have the right, on their own or with others, to sue a company for failing to meet its obligations under an online safety Act.

I commend the report to the House and thank everyone involved in its production for their hard work. This is a Bill we desperately need, and I look forward to seeing it pass through the House in this Session.

--- Later in debate ---
Kevin Hollinrake (Thirsk and Malton) (Con)

It is a pleasure to be called in this important debate, Madam Deputy Speaker. I wish to talk about online fraud; in my capacity as chair of the all-party group on fair business banking and as a member of the Treasury Committee, I think that is a matter of extreme importance. I congratulate the Joint Committee on its work, and my hon. Friend the Member for Folkestone and Hythe (Damian Collins), its Chairman, has done excellent work on this, particularly in pages 58 to 60 and 75 to 79 of the report.

It is good to see that the Government are looking at online fraud within the context of this Bill, but they must look at paid-for content as well. It is crucial that we do that, and the Minister has been very good in engaging on this issue. He knows how important it is, given his background. When the FCA, the Treasury, UK Finance, the Advertising Standards Authority and the Treasury Committee are all in favour of including fraud and paid-for content within the scope of this Bill, it is incumbent on the Government to do so. Otherwise, as the Treasury Committee says, there will be large financial losses to the public. Up to 40% of all crime is now fraud and, as the report says, 85% of fraud involves the internet in some way or other, so it is crucial that we cover this in the Bill.

I am massively in favour of competition and absolutely congratulate the platforms on their market dominance, but they have taken that market share in paid-for content away from our local newspapers and other such media. It is therefore crucial that we put those platforms on a fair and level playing field with those other media. I do not think we do that, and we need to be far tougher with these platforms. Clearly, they make a huge amount of money, but in many ways they get away with murder in terms of the regulation of their content, in a way that newspapers never would have done.

In my days in business, when we were advertising in newspapers we had to prove that we were who we said we were in terms of being a business, and the newspapers would look at the content of our adverts and they made sure we verified our claims. Neither of those things happens with respect to these platforms; they simply take the money and the approach is, “Let the people who are viewing it beware.” It is simply not a fair and level playing field. Newspapers were the gatekeepers but these platforms are absolutely not.

As my hon. Friend the Member for Boston and Skegness (Matt Warman) said, what works offline should be covered online, but that is not the situation at the moment. So I agree with the report that we need the platforms to be proactive in making sure that fraudulent content is removed. It needs to be covered under clause 41(4) and “priority illegal content”, so that platforms have to be proactive in taking this stuff down. We also need to look at clause 39 and remove the exemption for paid-for ads, to make sure that platforms are also proactive in removing paid-for online fraudulent content.

There is another thing we need to make sure is covered in the Bill in its final form. Fraud is not just an offence against individuals, as companies often get defrauded through these platforms, and we want to make sure they are covered as well. One way of doing that and focusing the platforms’ attention on this—I am not quite clear on this and I probably need to spend a bit of time with the Minister and the Chair of the Joint Committee on it—is by looking at what redress is available to people who do lose out. I know that the Committee is recommending an external redress process to cover this, but would that cover redress for financial loss? I think it should. So if the individual or the company cannot get redress through the company that they were defrauded by—it is pretty unlikely that they would—or the bank that facilitated the transaction, the platform should cover the redress to compensate those people for the loss. That would really focus the attention of platforms on making sure that this content was removed.

I am not sure that people know that the ASA, which looks at this kind of stuff and makes sure that advertising is appropriate, has no means of sanctioning a company for making claims that are not valid and do not meet consumers’ expectations. There are pretty much no sanctions for defrauding a company or individual in this way, or even for misleading them into buying the wrong product or making the wrong investment, as we saw with London Capital and Finance.

This Bill is a huge opportunity, and we have to make sure it is all-encompassing and ticks many of the boxes that others have spoken about in this debate.

--- Later in debate ---
The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

I congratulate my hon. Friend the Member for Folkestone and Hythe (Damian Collins) on securing today’s debate and chairing the Joint Committee with such aplomb and expertise. I thank Members from all parties on the Committee—from not just this House, but the other place—for their incredibly hard work. I put on the record my thanks to them; as my hon. Friend said, Baroness Kidron and Lord Gilbert are with us today. I thank them all for their extremely thorough and detailed work. We have been studying their report—all 191 pages—very carefully, and it will definitely have an impact on the legislation as it is updated.

I also thank the Select Committee on Digital, Culture, Media and Sport and its Chair, my hon. Friend the Member for Solihull (Julian Knight), for its work. I look forward very much to its report, which my hon. Friend said would be published imminently. I encourage the Committee to ensure that it is indeed published as soon as possible, so that we can take account of its recommendations as well. I can confirm that we will be making changes to the Bill in the light of the recommendations of the Joint Committee report and those of the anticipated report from the Select Committee. We understand that there are a number of respects in which the Bill can be improved substantially. The Government certainly have no monopoly on wisdom, and we intend to profit from the huge experience of the members of the Committees, and Members of the House, in making improvements—significant improvements—to the Bill. We intend to produce a revised and updated Bill before the end of the current Session.

We intend this Bill to be a world-leading piece of legislation. We believe that the United Kingdom has an opportunity to set a global example which other countries will follow. As the hon. Member for Pontypridd (Alex Davies-Jones) said, the Bill has been some time in gestation, but because this is such a complicated topic, it is important that we get the legislation right.

This is, I think, a good moment to thank previous Secretaries of State and Ministers for the work that they did in laying the foundations on which we are now building—in fact, in building the walls as well; we are just putting the roof on. In particular, I know of the work done in this area by my right hon. and learned Friend the Member for Kenilworth and Southam (Jeremy Wright) and my right hon. Friend the Member for Basingstoke (Mrs Miller), and also the work done by my hon. Friends the Members for Gosport (Dame Caroline Dinenage) and for Boston and Skegness (Matt Warman). I am sure that the whole House will want to thank them for the fantastic work that they did in taking us to the point where we now stand.

I entirely agree with the sentiments expressed by the Chairman of the Joint Committee, my hon. Friend the Member for Folkestone and Hythe, who said in his opening speech that social media firms had brought this legislation on themselves by the irresponsibility that they have often shown by placing profit ahead of humanity. That was powerfully illustrated by the evidence presented to the Joint Committee, and separately to the United States Senate and The Wall Street Journal, by the Facebook whistleblower Frances Haugen, who explained how Facebook’s use of algorithms—mentioned by Members, including my hon. Friends the Members for Gosport and for Bosworth (Dr Evans)—prioritised profit by promoting content that was harmful or incendiary simply because it made money, with scant, if any, regard to the harm being caused. Our view is that such an attitude is not only inappropriate but wrong.

Two or three Members have referred to the tragic suicide of 14-year-old Molly Russell, which followed a huge amount of very troubling suicide-related content being served up to her by Instagram. That sort of thing simply should not be happening. There are all too many other examples of social media firms not promptly handing over identification information to the police—I encountered a constituency case of that kind a couple of years ago—and not taking down content that is illegal, or content that clearly contravenes their terms and conditions.

This state of affairs cannot persist, and it is right for the House to act. I am heartened to note that, broadly speaking, we will be acting on a cross-party basis, because I think that that will make the message we send the world and the action we are taking all the more powerful. However, as Members have said today, even before the Act is passed, social media firms can act. They can edit their algorithms tomorrow, and I urge them to do exactly that. They should not be waiting for us to legislate; they should do the right thing today. We will be watching very closely: the House will be watching, and the public will be watching.

Kevin Hollinrake

Will my hon. Friend give way?

Kevin Hollinrake

I will be very brief. My hon. Friend has talked about cross-party working, and there is clearly cross-party consensus that paid-for advertising should be included in the scope of the Bill. Is that something that he intends to do?

Chris Philp

My hon. Friend anticipates my next point. I was about to come on to some of the specifics—very quickly, because time is short.

I am not going to be pre-announcing any firm commitments today because work is still ongoing, including the collective agreement process in Government, but on fraud and paid-for advertising, we have heard the message of the Joint Committee, the Financial Conduct Authority, the financial services sector, campaigners and Members of this House such as my hon. Friend the Member for Thirsk and Malton (Kevin Hollinrake). The right hon. Member for East Ham (Stephen Timms) raised this, as did the right hon. Member for Barking (Dame Margaret Hodge) and my hon. Friend the Member for Cities of London and Westminster (Nickie Aiken). I was at Revolut’s head office in Canary Wharf earlier today and it raised the issue as well. It is a message that the Government have absolutely heard, and it is something that we very much hope we will be able to address when we bring the Bill forward.

I cannot make any specific commitments because the work is still ongoing, but that message is loudly heard, as is the message communicated by the right hon. Member for Barking, my right hon. Friend the Member for Basingstoke and the hon. Member for Bath (Wera Hobhouse) on the work by the Law Commission on the communications offences, which will really tighten up some of the issues to do with what are essentially malicious or harmful communications, issues such as cyber-flashing and issues to do with epilepsy that we have heard about this afternoon. We are studying those Law Commission proposals very positively and carefully, as the Joint Committee recommended that we do.

We have also heard clearly the messages concerning commercial pornography. We understand the issues presented by the fact that the Bill, as drafted, does not cover that. Again, that is something we are currently working on very hard indeed.

Anonymity is another important issue raised today by my right hon. Friend the Member for Basingstoke and the hon. Member for Upper Bann (Carla Lockhart), among others. They and the Joint Committee have suggested that users should be given the option to protect themselves from anonymous content. They also addressed the critical question of traceability when law enforcement needs to investigate something. Again, those messages have been heard very clearly and we are working very hard on those.

That brings me to the tragic case raised by the hon. Member for Reading East (Matt Rodda) of his constituent Olly, who was so appallingly murdered; the murder appears to have been organised online. Under the Bill as drafted, organising an act like that—an illegal act—will be dealt with. I have just mentioned the point about traceability, which we are studying very carefully. The hon. Member said he had some concerns that the social media companies concerned did not provide the police with the identification information required when requested. I had a similar case a couple of years ago with Snapchat. If he could look into the details of that and come back to me with the specifics, I would be very interested to hear those because that would give us additional evidence if further steps need to be taken via the amended Bill. If he could come back to me on that, I would be very grateful.

A number of Members have rightly raised the point about transparency and understanding exactly what these social media firms are doing. The right hon. Member for Barking made that point powerfully, as did the hon. Member for Newcastle upon Tyne Central (Chi Onwurah). Of course, the Bill does give Ofcom extremely wide-ranging powers to require information to be delivered up. It also imposes transparency obligations upon these companies. There are criminal sanctions on individuals if those provisions are broken, and we have heard clearly the suggestion that those be brought forward and commenced much earlier. The Bill will also contain strong protections for free speech. I have not got time to talk about that more, but protecting free speech clearly is very important.

The country demands action, this House demands action and we will take it.