All 3 Debates between Margaret Hodge and Damian Collins

Mon 5th Dec 2022
Online Safety Bill
Commons Chamber
Report stage

Tue 12th Jul 2022
Online Safety Bill
Commons Chamber
Report stage (day 1)

Mon 7th Mar 2022
Economic Crime (Transparency and Enforcement) Bill
Commons Chamber
Committee stage: Committee of the whole House

Online Safety Bill

Debate between Margaret Hodge and Damian Collins
Dame Margaret Hodge

Indeed. The way the hon. Gentleman describes his new clause, which I will look at, is absolutely right, but may I make a more general point, because it speaks to the point about legal but harmful? What I really fear with the legal but harmful rule is that we create more and more laws to make content illegal and that, ironically, locks up more and more people, rather than creating structures and systems that will prevent the harm from occurring in the first place. So I am not always in favour of new laws simply criminalising individuals. I would love us to have kept to the legal but harmful route.

We can look to Elon Musk’s recent controversial takeover of Twitter. Decisions taken by Twitter’s new owner—by Elon Musk himself—saw use of the N-word increase by nearly 500% within 12 hours of the acquisition. And allowing Donald Trump back on Twitter gives chilling permission to Trump and others to use the site yet again to incite violence.

The tech giants know that their business models are dangerous. Platforms can train their systems to recognise so-called borderline content and reduce engagement. However, it is for business reasons, and business reasons alone, that they actively choose not to do that. In fact, they do the opposite and promote content known to trigger extreme emotions. These platforms are like a “danger for profit” machine, and the decision to allow that exploitation is coming from the top. Do not take my word for it; just listen to the words of Ian Russell. He has said:

“The only person that I’ve ever come across in this whole world…that thought that content”—

the content that Molly viewed—

“was safe was…Meta.”

There is a huge disconnect between what Silicon Valley executives think is safe and what we expect, both for ourselves and for our children. Introducing liability for directors might finally change the behaviour of these companies. Experience elsewhere has shown us that that would prove to be the most effective way of keeping online users safe. New clause 17 would hold directors of a regulated service personally liable on the grounds that they have failed, or are failing, to comply with any duties set in relation to their service, for instance a failure that leads to the death of a child. The new clause further states that the decision on who was liable would be made by Ofcom, not the provider, meaning that responsibility could not be shirked.

I say to all Members that if we really want to reduce the amount of harmful abuse online, then making senior directors personally liable is a very good way of achieving it. Some 82% of UK adults agree with us, Labour Front Benchers agree and Back Benchers across the House agree. So I urge the Government to rethink their position on director liability and support new clause 17 as a cross-party amendment. I really think it will make a difference.

Damian Collins

As Members know, there is a tradition in the United States that when the President signs a new Bill into law, people gather around him in the Oval Office, and multiple pens are used and presented to people who had a part in that Bill being drafted. If we required the King to do something similar with this Bill and gave a pen to every Minister, every Member who had served on a scrutiny Committee and every hon. Member who introduced an amendment that was accepted, we would need a lot of pens and it would take a long time. In some ways, however, that shows the House at its best; the Bill’s introduction has been a highly collaborative process.

The right hon. Member for Barking (Dame Margaret Hodge) was kind in her words about me and my right hon. Friend the Member for Croydon South (Chris Philp). I know that my successor will continue in the same tradition and, more importantly, that he is supported by a team of officials who have dedicated, in some cases, years of their career to the Bill, who care deeply about it and who want to see it introduced with success. I had better be nice to them because some of them are sitting in the Box.

--- Later in debate ---
Damian Collins

My right hon. Friend raises a very good question. As well as having a named individual with criminal liability for the supplying of information, should there be somebody who is accountable within a company, whether that comes with criminal sanctions or not—somebody whose job it is to know? As all hon. Members know if they have served on the Digital, Culture, Media and Sport Committee, which I chaired, on the Public Accounts Committee or on other Select Committees that have questioned people from the big tech companies, the frustrating thing is that no matter who they put up, it never seems to be the person who actually knows.

There needs to be someone who is legally liable, whether or not they have criminal liability, and is the accountable officer. In the same way as in a financial institution, it is really important to have someone whose job it is to know what is going on and who has certain liabilities. The Bill gives Ofcom the power to seek information and to appoint experts within a company to dig information out and work with the company to get it, but the companies need to feel the same sense of liability that a bank would if its systems had been used to launder money and it had not raised a flag.

Damian Collins

I will dare to give way to yet another former Committee Chair—the former chair of the Public Accounts Committee.

Dame Margaret Hodge

I draw all hon. Members’ attention to issues relating to Barclays Bank in the wake of the economic crisis. An authority—I think it was the Serious Fraud Office—attempted to hold both the bank and its directors to account, but it failed because there was not a corporate criminal liability clause that worked. It was too difficult. Putting such a provision in the Bill would be a means of holding individual directors as well as companies to account, whatever standard of proof was used.

Damian Collins

I thank the right hon. Lady for that information.

Let me move on to the debate about encryption, which my right hon. Friend the Member for Haltemprice and Howden has mentioned. I think it is important that Ofcom and law enforcement agencies be able to access information from companies that could be useful in prosecuting cases related to terrorism and child sexual exploitation. No one is suggesting that encrypted messaging services such as WhatsApp should be de-encrypted, and there is no requirement in the Bill for encryption to end, but we might ask how Meta makes money out of WhatsApp when it appears to be free. One way in which it makes money is by gathering huge amounts of data and information about the people who use it, about the names of WhatsApp groups and about the websites people visit before and after sending messages. It gathers a lot of background metadata about people’s activity around using the app and service.

If someone has visited a website on which severe illegal activity is taking place and has then used a messaging service, and the person to whom they sent the message has done the same, it should be grounds for investigation. It should be easy for law enforcement to get hold of the relevant information without the companies resisting. It should be possible for Ofcom to ask questions about how readily the companies make that information available. That is what the Government seek to do through their amendments on encryption. They are not about creating a back door for encryption, which could create other dangers, and not just on freedom of expression grounds: once a back door to a system is created, even if it is only for the company itself or for law enforcement, other people tend to find their way in.

Online Safety Bill

Debate between Margaret Hodge and Damian Collins
Dame Margaret Hodge (Barking) (Lab)

I welcome the Minister to his position, and it is wonderful to have somebody else who—like the previous Minister, the hon. Member for Croydon South (Chris Philp)—knows what he is talking about. On this issue, which is pretty key, I think it would work if minimum standards were set on the risk assessments that platforms have to make to judge what is legal but harmful content, but at the moment such minimum standards are not in the Bill. Could the Minister comment on that? Otherwise, there is a danger that platforms will set a risk assessment that allows really vile harmful but legal content to carry on appearing on their platform.

Damian Collins

The right hon. Lady makes a very important point. There have to be minimum safety standards, and I think that was also reflected in the report of the Joint Committee, which I chaired. Those minimum legal standards are set where the criminal law is set for these priority legal offences. A company may have higher terms of service—it may operate at a higher level—in which case it will be judged on the operation of its terms of service. However, for priority illegal content, it cannot have a code of practice that is below the legal threshold, and it would be in breach of the provisions if it did. For priority illegal offences, the minimum threshold is set by the law.

Dame Margaret Hodge

I understand that in relation to illegal harmful content, but I am talking about legal but harmful content. I understand that the Joint Committee that the hon. Member chaired recommended that for legal but harmful content, there should be minimum standards against which the platforms would be judged. I may have missed it, but I cannot see that in the Bill.

Damian Collins

The Joint Committee’s recommendation was for a restructuring of the Bill, so that rather than having general duty of care responsibilities that were not defined, we defined those responsibilities based on existing areas of law. The core principle behind the Bill is to take things that are illegal offline, and to regulate such things online based on the legal threshold. That is what the Bill does.

In schedule 7, which did not exist in the draft phase, we have written into the Bill a long list of offences in law. I expect that, as this regime is created, the House will insert more regulations and laws into schedule 7 as priority offences in law. Even if an offence in law is not listed in the priority illegal harms schedule, it can still be a non-priority harm, meaning that even if a company does not have to look for evidence of that offence proactively, it still has to act if it is made aware of the offence. I think the law gives us a very wide range of offences, clearly defined against offences in law, where there are clearly understood legal thresholds.

The question is: what is to be done about other content that may be harmful but sits below the threshold? The Government have made it clear that we intend to bring forward amendments that set out clear priorities for companies on the reporting of such harmful content, where we expect the companies to set out what their policies are. That will include setting out clearly their policies on things such as online abuse and harassment, the circulation of real or manufactured intimate images, content promoting self-harm, content promoting eating disorders or legal suicide content—this is content relating to adults—so the companies will have to be transparent on that point.

Economic Crime (Transparency and Enforcement) Bill

Debate between Margaret Hodge and Damian Collins
Dame Margaret Hodge

There are already arrangements for the sharing of information and data, but they are not enough. The situation is absurd. When I talk to the enforcement agencies and the anti-money laundering people working in the banks, they tell me that they cannot share information. If one bank has information that makes it suspicious about a particular client, it ought to be able to share it around the banking system so that they can all take action. There are pragmatic steps that we could take to share information and knowledge across jurisdictions, from America through to Europe to us, which would massively improve things and actually bring in money.

Let me take one example that came out of the FinCEN files. Standard Chartered Bank is a British bank. In 2019, it was fined by both America and ourselves for poor money laundering controls and other offences, including breaching sanctions against Iran. The bank was fined £842 million in America, but only £102 million by the Financial Conduct Authority here in the UK. The Americans have got it right. There are lessons that we can learn from them. There are also ways in which we could properly resource all the enforcement agencies. We could perhaps reduce them as well—we do not need all these people. Every time I refer a matter, whether it concerns corrupt or illegal activity, to one enforcement agency, I am either told that it is the responsibility of another agency or it goes into a big black hole and I never see anything arising out of it again. That situation is completely and totally unacceptable.

Damian Collins (Folkestone and Hythe) (Con)

On this point about information sharing, which is so important, we have an established financial system to do that. Does the right hon. Lady share my concern that many people who are affected by these sanctions may use cryptocurrency both to hide money and to move money, and that many of the cryptocurrency exchanges are saying that they will not take enforcement action against, say, Russian nationals, only against named individuals on the sanctions list?

Dame Margaret Hodge

Cryptocurrency has become the new way in which money is laundered. Corrupt and stolen money ends up in the pockets of one individual, and then gets back into the system for them to spend it elsewhere. I completely agree with the hon. Gentleman: it is important that we get our heads around cryptocurrency and that we legislate appropriately to tackle it.

The other way of looking at this issue, and the reason why we have tabled the new clause, is that our law enforcement bodies, while they are not as good as the Americans’, bring resources back to the UK through fines. Between 2016 and 2021, the law enforcement bodies brought £3.9 billion back into the UK coffers. If that money had been reinvested, which is one of the ideas for funding the enforcement agencies, it could have brought an extra three quarters of a billion pounds to be spent on enforcement by all those agencies. That is a lot of money, and it would have been effective; it would have had a snowball effect of increasing our budget.

New clause 2 is there to ensure that we get the enforcement right—that we have not only the powers but the resources we need to make sense of and put into effect the important legislation we are passing today. I hope it will have support right across the Committee; it certainly has support among Back Benchers, and I would love it if the Government accepted it and it became part of the Bill.