Margaret Hodge debates involving the Department for Digital, Culture, Media & Sport during the 2019 Parliament

Tue 17th Jan 2023
Online Safety Bill
Commons Chamber

Mon 5th Dec 2022
Online Safety Bill
Commons Chamber

Mon 5th Dec 2022
Online Safety Bill (Programme) (No. 4)
Commons Chamber
Tue 12th Jul 2022
Online Safety Bill
Commons Chamber

Report stage (day 1)
Tue 19th Apr 2022
Online Safety Bill
Commons Chamber

2nd reading

Online Safety Bill

Margaret Hodge Excerpts
Priti Patel

The hon. Member is absolutely right, and I do not think anyone in the House would disagree with that. We have to carry on learning in life, and that links to technology and other issues. That applies to all of us across the board, and we need people in positions of authority to ensure that the right kind of information is shared, to protect our young people.

I look forward to hearing from the Under-Secretary of State for Digital, Culture, Media and Sport, my hon. Friend the Member for Sutton and Cheam (Paul Scully), who has been so good in engaging on this issue, and I thank him for the proactive way in which he has spent time with all of us. Will we see the Government’s amendment prior to the Bill going to the other place for its Second Reading there? It is vital for all colleagues who support new clause 2 to have clear assurances that the provisions we support, which could have passed through this House, will not be diluted in the other place by Ministers. Furthermore—we should discuss this today—what steps are the Government and Ofcom taking to secure the agreement of tech companies to work to ensure that senior managers are committed and proactive in meeting their duties under clause 11?

I recognise that a lot of things will flow through secondary legislation, but on top of that, engagement with tech companies is vital, so that they can prepare, be ready and know what duties will be upon them. We also need to know what further guidance and regulation will come forward to secure the delivery of clause 11 duties and hold tech companies to account.

In the interests of time, I will shorten my remarks. I trust and hope that Ministers will give those details. It is important to give those assurances before the Bill moves to the House of Lords. We need to know that those protections will not be diluted. This is such a sensitive issue. We have come a long way, and that is thanks to colleagues on both sides of the House. It is important that we get the right outcomes, because all of us want to make sure that children are protected from the dreadful harms that we have seen online.

Dame Margaret Hodge (Barking) (Lab)

This is a really important piece of legislation. As my hon. Friend the Member for Pontypridd (Alex Davies-Jones) said, it has taken far too long to get to this point. The Bill has been considered in a painstaking way by Members across the House. While today’s announcement that we will introduce senior manager and director liability is most welcome, the recent decisions to strip out vast chunks of the Bill—clauses that would have contributed to making online a safe place for us all—represent a tragic opportunity missed by the Government, and it will fall to a Labour Government to put things right. I know from the assurances given by those on our Front Bench that they will do just that.

I do not want to spend too much time on it, but in discussing the removal of provisions on “legal but harmful” content, I have to talk a little bit about the Jewish community. The hope that the Online Safety Bill would give us some respite from the torrent of antisemitic abuse that some of us have been subjected to has been thwarted. The Centre for Countering Digital Hate has conducted research in this area, and it found that nine out of 10 antisemitic posts on Facebook and Twitter stay there, despite requests to have them removed. Its analysis of 714 posts containing anti-Jewish hate found that they were viewed by more than 7.3 million people across the platforms, and that 80% of posts containing holocaust denial and 70% of posts identified as neo-Nazi were not acted on, although they were in breach of the rules set by the platforms. People like me are left with a sense of bitterness that our suffering has to be tolerated because of some ideological, misplaced, flawed and ill-thought-out interpretation of freedom of speech.

I turn to new clause 2, tabled by the hon. Member for Stone (Sir William Cash) and the hon. Member for Penistone and Stocksbridge (Miriam Cates). I congratulate them on the work they have done in bringing this forward. I think they will probably agree with me that this issue should never have divided us as it did before Christmas, when I tabled a similar amendment. It is not a party political issue; it is a common-sense measure that best serves the national interest and will make online a safer place for children. I am pleased that the hon. Members for Stone and for Penistone and Stocksbridge have persuaded their colleagues of the justification and that the Government have listened to them—I am only sorry that I was not as successful.

This is an important measure. The business model that platforms operate encourages, not just passively but actively, the flourishing of abusive content online. They do not just fail to remove that content, but actively promote its inclusion through the algorithms that they employ. Sadly, people get a kick out of reading hateful, harmful and abusive content online, as the platform companies and their senior managers know. It is in their interest to encourage maximum traffic on their platforms, and if that means letting people post and see vile abuse, they will. The greater the traffic on such sites, the more attractive they become to advertisers and the more advertisers are willing to pay for the ads that they post on the sites. The platforms make money out of online abuse.

Originally, the Government wanted to deal with the problem by fining the companies, but companies would simply treat such fines as a cost to their business. It would not change their model or the platforms’ behaviour, although it might add to the charges for those who want to advertise on the platforms. Furthermore, we know that senior directors, owners and managers personally take decisions about the content that they allow to appear on their platforms and that their approach affects what people post.

Elon Musk’s controversial and aggressive takeover of Twitter, where he labelled the sensible moderation of content as a violation of freedom of speech, led to a 500% increase in the use of the N-word within 12 hours of his acquisition. Telegram, whose CEO is Pavel Durov, has become the app of choice of terror networks such as ISIS, according to research conducted by the Middle East Media Research Institute. When challenged about that, however, Durov refused to act on the intelligence to moderate content and said:

“You cannot make messaging technology secure for everybody except for terrorists.”

If senior managers have responsibility for the content on their platforms, they must be held to account, because we know that doing so will mean that online businesses become a safer place for our children.

We have to decide whose side we are on. Are we really putting our children’s wellbeing first, or are we putting the platforms’ interest first? Of course, everybody will claim that we are putting children’s interests first, but if we are, we have to put our money where our mouth is, which involves making the managers truly accountable for what appears on their platforms. We know that legislating for director liability works, because it has worked for health and safety on construction sites, in the Bribery Act 2010 and on tax evasion. I hope to move similar amendments when we consider the Economic Crime and Corporate Transparency Bill on Report next week.

This is not simply a punitive measure—in fact, the last thing we want to do is lock up a lot of platform owners—but a tool to transform behaviour. We will not be locking up the tech giants, but we will be ensuring that they moderate their content. Achieving this change shows the House truly working at its best, cross-party, and focusing on the merits of the argument rather than playing party politics with such a serious issue. I commend new clause 2 to the House.

Several hon. Members rose—

Online Safety Bill

Margaret Hodge Excerpts
Paul Scully

I think this is why Ofcom has discretion, so that it can determine that. The most egregious examples are the ones people can learn from, and it is about doing this in proportion. My hon. Friend is absolutely right that if we are swamped with small notifications, this will be hidden in plain sight. That would not be useful, particularly for parents, to best understand what is going on. It is all about making more informed decisions.

The House will be aware that we recently announced our intention to make a number of other changes to the Bill. We are making those changes because we believe it is vital that people can continue to express themselves freely and engage in pluralistic debate online. That is why the Bill will be amended to strengthen its provisions relating to children and to ensure that the Bill’s protections for adults strike the right balance with its protections for free speech.

Dame Margaret Hodge (Barking) (Lab)

The Minister is alluding, I assume, to the legal but harmful provision, but what does he think about this as an example? People are clever; they do not use illegal language. They will not say, “I want to kill all Jews”, but they may well—and do—say, “I want to harm all globalists.” What is the Minister’s view of that?

Paul Scully

The right hon. Lady and I have had a detailed chat about some of the abuse that she and many others have been suffering, and there were some particularly egregious examples. This Bill is not, and never will be, a silver bullet. This has to be worked through, with the Government acting with media platforms and social media platforms, and parents also have a role. This will evolve, but we first need to get back to the fundamental point that social media platforms are not geared up to enforce their own terms and conditions. That is ridiculous, a quarter of a century after the world wide web kicked in, and when social media platforms have been around for the best part of 20 years. We are shutting the stable door afterwards, and trying to come up with legislation two decades later.

--- Later in debate ---
Alex Davies-Jones

I thank my hon. Friend for that important and powerful intervention. Let us be clear: everything that Kanye West said online is completely abhorrent and has no place in our society. It is not for any of us to glorify Hitler and his comments or praise him for the work he did; that is absolutely abhorrent and it should never be online. Sadly, however, that is exactly the type of legal but harmful content that will now be allowed to proliferate online because of the Government’s swathes of changes to the Bill, meaning that that would be allowed to be seen by everybody. Kanye West has 30 million followers online. His followers will be able to look at, share, research and glorify that content without any consequence to that content being freely available online.

Dame Margaret Hodge

Further to that point, it is not just that some of the content will be deeply offensive to the Jewish community; it could also harm wider society. Some further examples of postings that would be considered legal but harmful are likening vaccination efforts to Nazi death camps and alleging that NHS nurses should stand trial for genocide. Does my hon. Friend not agree that the changes the Government are now proposing will lead to enormous and very damaging impacts right through society?

Alex Davies-Jones

My right hon. Friend is absolutely right. I am keen to bring this back into scope before Mr Speaker chastises us any further, but she is right to say that this will have a direct real-world impact. This is what happens when we focus on content rather than directly on the platforms and the algorithms on the platforms proliferating this content. That is where the focus needs to be. It is the algorithms that share and amplify this content to these many followers time and again that need to be tackled, rather than the content itself. That is what we have been pleading with the Government to concentrate on, but here we are in this mess.

We are pleased that the Government have taken on board Labour’s policy to criminalise certain behaviours—including the encouragement of self-harm, sharing people’s intimate images without their consent, and controlling or coercive behaviours—but we believe that the communications offences more widely should remain in order to tackle dangerous online harms at their root. We have worked consistently to get this Bill over the line and we have reached out to do so. It has been subject to far too many delays and it is on the Government’s hands that we are again facing substantial delays, when internet regulation has never been more sorely needed. I know that the Minister knows that, and I sincerely hope he will take our concerns seriously. I reach out to him again across the Dispatch Box, and look forward to working with him and challenging him further where required as the Bill progresses. I look forward to getting the Bill on to the statute book.

--- Later in debate ---
Priti Patel

My hon. Friend is absolutely right. I thank her for not just her intervention but her steadfast work when she was a Home Office Minister with responsibility for safeguarding. I also thank the Internet Watch Foundation; many of the statistics and figures that we have been using about child sexual abuse and exploitation content, and the take-downs, are thanks to its work. There is some important work to do there. The Minister will be familiar with its work—[Interruption.] Exactly that.

We need the expertise of the Internet Watch Foundation, so it is about integrating that skillset. There is a great deal of expertise out there, including at the Internet Watch Foundation, at GIFCT on the CT side and, obviously, in our services and agencies. As my right hon. Friend the Member for Basingstoke said, it is crucial that we pool organisations’ expertise to implement the Bill, as we will not be able to create it all over again overnight in government.

I thank my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) for tabling new clause 16, which would create new offences to address the challenges caused by those who promote, encourage and assist self-harm. That has been the subject of much of the debate already, which is absolutely right when we think about the victims and their families. In particular, I thank the Samaritans and others for their work to highlight this important issue. I do not need to dwell on the Samaritans’ report, because I think all hon. Members have read it.

All hon. Members who spoke in the early stages of the Bill, which I did not because I was in government, highlighted this essential area. It is important to ensure that we do everything we can to address it in the right way. Like all right hon. and hon. Members, I pay tribute to the family of Molly Russell. There are no words for the suffering that they have endured, but their campaign of bravery, courage and fortitude aims to close every loophole to stop other young people being put at risk.

Right hon. and hon. Members meet young people in schools every week, and we are also parents and, in some cases, grandparents. To know that this grey area leaves so many youngsters at risk is devastating, so we have almost a collective corporate duty to stand up and do the right thing. The long and short of it is that we need to be satisfied, when passing the Bill, that we are taking action to protect vulnerable people and youngsters who are susceptible to dangerous communications.

As I have emphasised, we should also seek to punish those who cause and perpetrate this harm and do everything we can to protect those who are vulnerable, those with learning disabilities, those with mental health conditions, and those who are exposed to self-harm content. We need to protect them and we have a duty to do that, so I look forward to the Minister’s reply.

I welcome new clauses 45 to 50, tabled by my right hon. Friend the Member for Basingstoke. I pay tribute to her for her work; she has been a strong campaigner for protecting the privacy of individuals, especially women and children, and for closing loopholes that have enabled people to be humiliated or harmed in the ways she has spoken about so consistently in the House. I am pleased that the Deputy Prime Minister, my right hon. Friend the Member for Esher and Walton (Dominic Raab), announced last month that the Government would table amendments in the other place to criminalise the sharing of intimate images, photographs and videos without consent; that is long overdue. When I was Home Secretary I heard the most appalling cases, with which my right hon. Friend the Member for Basingstoke will be familiar. I have met so many victims and survivors, and we owe it to them to do the right thing.

It would be reassuring to hear not just from the Minister in this debate, but from other Ministers in the Departments involved in the Bill, to ensure they are consistent in giving voice to the issues and in working through their Ministries on the implementation—not just of this Bill, but of the golden thread that runs throughout the legislation. Over the last three years, we have rightly produced a lot of legislation to go after perpetrators, and support women and girls, including the Domestic Abuse Act 2021. We should use those platforms to stand up for the individuals affected by these issues.

I want to highlight the importance of the provisions to protect women and girls, particularly the victims and survivors of domestic abuse and violence. Some abusive partners and ex-partners use intimate images in their possession; as the Minister said, that is coercive control which means that the victim ends up living their life in fear. That is completely wrong. We have heard and experienced too many harrowing and shocking stories of women who have suffered as a result of the use of such images and videos. It must now be a priority for the criminal justice system, and the online platforms in particular, to remove such content. This is no longer a negotiation. Too many of us—including myself, when I was Home Secretary—have phoned platforms at weekends and insisted that they take down content. Quite frankly, I have then been told, “Twitter doesn’t work on a Saturday, Home Secretary” or “This is going to take time.” That is not acceptable. It is an absolute insult to the victims, and is morally reprehensible and wrong. The platforms must be held to account.

Hon. Members will be well aware of the Home Office’s work on the tackling violence against women and girls strategy. I pay tribute to all colleagues, but particularly my hon. Friend the Member for Redditch (Rachel Maclean), who was the Minister at the time. The strategy came about after much pain, sorrow and loss of life, and it garnered an unprecedented 180,000 responses. The range of concerns raised was predominantly related to the issues we are discussing today. We can no longer stay mute and turn a blind eye. We must ensure that the safety of women in the public space offline—on the streets—and online is respected. We know how women feel about the threats. The strategy highlighted so much; I do not want to go over it again, as it is well documented and I have spoken about it in the House many times.

It remains a cause of concern that the Bill does not include a specific VAWG code of practice. We want and need the Bill. We are not going to fix everything through it, but, having spent valued time with victims and survivors, I genuinely believe that we could move towards a code of practice. Colleagues, this is an area on which we should unite, and we should bring such a provision forward; it is vital.

Let me say a few words in support of new clause 23, which was tabled by my right hon. Friend the Member for Basingstoke. I have always been a vocal and strong supporter of services for victims of crime, and of victims full stop. I think it was 10 years ago that I stood in this House and proposed a victims code of practice—a victims Bill is coming, and we look forward to that as well. This Government have a strong record of putting more resources into support for victims, including the £440 million over three years, but it is imperative that offenders—those responsible for the harm caused to victims—are made to pay, and it is absolutely right that they should pay more in compensation.

Companies profiteering from online platforms where these harms are being perpetrated should be held to account. When companies fail in their duties and have been found wanting, they must make a contribution for the harm caused. There are ways in which we can do that. There has been a debate already, and I heard the hon. Member for Pontypridd (Alex Davies-Jones) speak for the Opposition about one way, but I think we should be much more specific now, particularly in individual cases. I want to see those companies pay the price for their crimes, and I expect the financial penalties issued to reflect the severity of the harm caused—we should support that—and that such money should go to supporting the victims.

I pay tribute to the charities, advocacy groups and other groups that, day in and day out, have supported the victims of crime and of online harms. I have had an insight into that work from my former role in Government, but we should never underestimate how traumatic and harrowing it is. I say that about the support groups, but we have to magnify that multiple times for the victims. This is one area where we must ensure that more is done to provide extra resources for them. I look forward to hearing more from the Minister, but also from Ministers from other Departments in this space.

I will conclude on new clause 28, which has already been raised, on the advocacy body for children. There is a long way to go with this—there really is. Children are harmed in just too many ways, and the harm is unspeakable. We have touched on this in earlier debates and discussions on the Bill, in relation to child users on online platforms, and there will be further harm. I gently urge the Government —if not today or through this Bill, then later—to think about how we can pull together the skills and expertise in organisations outside this House and outside Government that give voice to children who have nowhere else to go.

This is not just about the online space; in the cases in the constituency of the hon. Member for Rotherham (Sarah Champion) and other constituencies, we have seen children being harmed under cover. Statutory services failed them and the state failed them. It was state institutional failure that let children down in the cases in Rotherham and other child grooming cases. We could see that all over again in the online space, and I really urge the Government to make sure that that does not happen—and actually never happens again, because those cases are far too harrowing.

There really is a lot here, and we must come together to ensure that the Bill comes to pass, but there are so many other areas where we can collectively put aside party politics and give voice to those who really need representation.

Dame Margaret Hodge

I pay tribute to all the relatives and families of the victims of online abuse who have chosen to be with us today. I am sure that, for a lot of you, our debate is very dry and detached, yet we would not be here but for you. Our hearts are with you all.

I welcome the Minister to his new role. I hope that he will guide his Bill with the same spirit set by his predecessors, the right hon. Member for Croydon South (Chris Philp) and the hon. Member for Folkestone and Hythe (Damian Collins), who is present today and has done much work on this issue. Both Ministers listened and accepted ideas suggested by Back Benchers across the House. As a result, we had a better Bill.

--- Later in debate ---
Mr David Davis

The right hon. Lady and I have co-operated to deal with international corporate villains, so I am interested in her proposal. However, a great number of these actions are taken by algorithms—I speak as someone who was taken down by a Google algorithm—so what happens then? I see no reason why we should not penalise directors, but how do we establish culpability?

Dame Margaret Hodge

That is for an investigation by the appropriate enforcement agency—Ofcom et al.—and if there is evidence that culpability rests with the managing director, the owner or whoever, they should be prosecuted. It is as simple as that. A case would have to be established through evidence, and that should be carried out by the enforcement agency. I do not think that this is any different from any other form of financial or other crime. In fact, it is from my experience in that that I came to this conclusion.

John Penrose (Weston-super-Mare) (Con)

The right hon. Lady is making a powerful case, particularly on the effective enforcement of rules to ensure that they bite properly and that people genuinely pay attention to them. She gave the example of a senior executive talking about whether people should be stopped for getting it wrong—I think the case she mentioned was holocaust denial—by making factually inaccurate statements or allowing factually inaccurate statements to persist on their platform. May I suggest that her measures would be even stronger if she were to support new clause 34, which I have tabled? My new clause would require factual inaccuracy to become wrong, to be prevented and to be pursued by the kinds of regulators she is talking about. It would be a much stronger basis on which her measure could then abut.

Dame Margaret Hodge

Indeed. The way the hon. Gentleman describes his new clause, which I will look at, is absolutely right, but can I just make a more general point because it speaks to the point about legal but harmful? What I really fear with the legal but harmful rule is that we create more and more laws to make content illegal and that, ironically, locks up more and more people, rather than creates structures and systems that will prevent the harm occurring in the first place. So I am not always in favour of new laws simply criminalising individuals. I would love us to have kept to the legal but harmful route.

We can look to Elon Musk’s recent controversial takeover of Twitter. Decisions taken by Twitter’s newest owner—by Elon Musk himself—saw use of the N-word increase by nearly 500% within 12 hours of acquisition. And allowing Donald Trump back on Twitter gives a chilling permission to Trump and others to use the site yet again to incite violence.

The tech giants know that their business models are dangerous. Platforms can train their systems to recognise so-called borderline content and reduce engagement. However, it is for business reasons, and business reasons alone, that they actively choose not to do that. In fact, they do the opposite and promote content known to trigger extreme emotions. These platforms are like a “danger for profit” machine, and the decision to allow that exploitation is coming from the top. Do not take my word for it; just listen to the words of Ian Russell. He has said:

“The only person that I’ve ever come across in this whole world…that thought that content”—

the content that Molly viewed—

“was safe was…Meta.”

There is a huge disconnect between what silicon valley executives think is safe and what we expect, both for ourselves and for our children. By introducing liability for directors, the behaviour of these companies might finally change. Experience elsewhere has shown us that that would prove to be the most effective way of keeping online users safe. New clause 17 would hold directors of a regulated service personally liable on the grounds that they have failed, or are failing, to comply with any duties set in relation to their service, for instance failure that leads to the death of a child. The new clause further states that the decision on who was liable would be made by Ofcom, not the provider, meaning that responsibility could not be shirked.

I say to all Members that if we really want to reduce the amount of harmful abuse online, then making senior directors personally liable is a very good way of achieving it. Some 82% of UK adults agree with us, Labour Front Benchers agree and Back Benchers across the House agree. So I urge the Government to rethink their position on director liability and support new clause 17 as a cross-party amendment. I really think it will make a difference.

Damian Collins

As Members know, there is a tradition in the United States that when the President signs a new Bill into law, people gather around him in the Oval Office, and multiple pens are used and presented to people who had a part in that Bill being drafted. If we required the King to do something similar with this Bill and gave a pen to every Minister, every Member who had served on a scrutiny Committee and every hon. Member who introduced an amendment that was accepted, we would need a lot of pens and it would take a long time. In some ways, however, that shows the House at its best; the Bill’s introduction has been a highly collaborative process.

The right hon. Member for Barking (Dame Margaret Hodge) was kind in her words about me and my right hon. Friend the Member for Croydon South (Chris Philp). I know that my successor will continue in the same tradition and, more importantly, that he is supported by a team of officials who have dedicated, in some cases, years of their career to the Bill, who care deeply about it and who want to see it introduced with success. I had better be nice to them because some of them are sitting in the Box.

--- Later in debate ---
Damian Collins

My right hon. Friend raises a very good question. As well as having a named individual with criminal liability for the supplying of information, should there be somebody who is accountable within a company, whether that comes with criminal sanctions or not—somebody whose job it is to know? As all hon. Members know if they have served on the Digital, Culture, Media and Sport Committee, which I chaired, on the Public Accounts Committee or on other Select Committees that have questioned people from the big tech companies, the frustrating thing is that no matter who they put up, it never seems to be the person who actually knows.

There needs to be someone who is legally liable, whether or not they have criminal liability, and is the accountable officer. In the same way as in a financial institution, it is really important to have someone whose job it is to know what is going on and who has certain liabilities. The Bill gives Ofcom the power to seek information and to appoint experts within a company to dig information out and work with the company to get it, but the companies need to feel the same sense of liability that a bank would if its systems had been used to launder money and it had not raised a flag.

Damian Collins

I will dare to give way to yet another former Committee Chair—the former chair of the Public Accounts Committee.

Dame Margaret Hodge

I draw all hon. Members’ attention to issues relating to Barclays Bank in the wake of the economic crisis. An authority—I think it was the Serious Fraud Office—attempted to hold both the bank and its directors to account, but it failed because there was not a corporate criminal liability clause that worked. It was too difficult. Putting such a provision in the Bill would be a means of holding individual directors as well as companies to account, whatever standard of proof was used.

Damian Collins

I thank the right hon. Lady for that information.

Let me move on to the debate about encryption, which my right hon. Friend the Member for Haltemprice and Howden has mentioned. I think it is important that Ofcom and law enforcement agencies be able to access information from companies that could be useful in prosecuting cases related to terrorism and child sexual exploitation. No one is suggesting that encrypted messaging services such as WhatsApp should be de-encrypted, and there is no requirement in the Bill for encryption to end, but we might ask how Meta makes money out of WhatsApp when it appears to be free. One way in which it makes money is by gathering huge amounts of data and information about the people who use it, about the names of WhatsApp groups and about the websites people visit before and after sending messages. It gathers a lot of background metadata about people’s activity around using the app and service.

If someone has visited a website on which severe illegal activity is taking place and has then used a messaging service, and the person to whom they sent the message has done the same, it should be grounds for investigation. It should be easy for law enforcement to get hold of the relevant information without the companies resisting. It should be possible for Ofcom to ask questions about how readily the companies make that information available. That is what the Government seek to do through their amendments on encryption. They are not about creating a back door for encryption, which could create other dangers, and not just on freedom of expression grounds: once a back door to a system is created, even if it is only for the company itself or for law enforcement, other people tend to find their way in.

--- Later in debate ---
Dame Margaret Hodge

Is the Minister saying he is open to changing his view on why he is minded to reject new clause 17 tonight?

Paul Scully

I do not think I am changing my view. I am saying that this is not the last stage of the Bill, so there will be plenty of opportunity further to test this, should Members want to do so.

On new clause 28, the Government recognise and agree with the intent behind this amendment to ensure that the interests of child users of regulated services are represented. Protecting children online is the top priority in this Bill, and its key measures will ensure that children are protected from harmful content. The Bill appoints a regulator with comprehensive powers to force tech companies to keep children safe online, and the Bill’s provisions will ensure that Ofcom will listen and respond to the needs of children when identifying priority areas for regulatory action, setting out guidance for companies, taking enforcement action and responding to super-complaints.

Right from the outset, Ofcom must ensure that its risk assessment and priorities reflect the needs of children. For example, Ofcom is required to undertake research that will help understand emerging risks to child safety. We have heard a lot today about the emerging risks with changing technology, and it is important that we keep on top of those and have that children’s voice at the heart of this. The Bill also expands the scope of the Communications Consumer Panel to online safety matters. That independent panel of experts ensures that user needs are at the heart of Ofcom’s regulatory approach. Ofcom will also have the flexibility to choose other mechanisms to better understand user experiences and emerging threats. For example, it may set up user panels or focus groups.

Importantly, Ofcom will have to engage with expert bodies representing children when developing codes of practice and other regulatory guidance. For example, Ofcom will be required to consult persons who represent the interests of children when developing its codes of practice. That means that Ofcom’s codes will be fully informed by how children behave online, how they experience harm and what impact the proposed measures will have on their online experience. The super-complaints process will further enable independent bodies advocating for children to have their voices heard, and will help Ofcom to recognise and eliminate systemic failures.

As we have heard, the Government also plan to name the Children’s Commissioner for England as a statutory consultee for Ofcom when it develops its code of practice. That amendment will be tabled in the House of Lords. Through this consultation, the commissioner will be able to flag systemic issues or issues of particular importance to the regulator, helping Ofcom to target investigations and, if necessary, sanctions at matters that most affect children’s online experience.

As such, there are ample opportunities in the framework for children’s voices to be heard, and the Government are not convinced of the need to legislate for another child user advocacy body. There are plenty of bodies out there that Ofcom will already be reaching out to and there is an abundance of experience in committed representative groups that are already engaged and will be engaged with the online safety framework. They include the existing statutory body responsible for promoting the interests of children, the Children’s Commissioner. Adding an additional statutory body would duplicate existing provision, creating a confusing landscape, and that would not be in the best interests of children.

Online Safety Bill (Programme) (No. 4)

Margaret Hodge Excerpts
The Secretary of State for Digital, Culture, Media and Sport (Michelle Donelan)

I beg to move,

That the following provisions shall apply to the Online Safety Bill for the purpose of varying and supplementing the Order of 19 April 2022 in the last session of Parliament (Online Safety Bill: Programme) as varied by the Orders of 12 July 2022 (Online Safety Bill: Programme (No.2)) and today (Online Safety Bill: Programme (No.3)).

Re-committal

(1) The Bill shall be re-committed to a Public Bill Committee in respect of the following Clauses and Schedules—

(a) in Part 3, Clauses 11 to 14, 17 to 20, 29, 45, 54 and 55 of the Bill as amended in Public Bill Committee;

(b) in Part 4, Clause 64 of, and Schedule 8 to, the Bill as amended in Public Bill Committee;

(c) in Part 7, Clauses 78, 81, 86, 89 and 112 of, and Schedule 11 to, the Bill as amended in Public Bill Committee;

(d) in Part 9, Clause 150 of the Bill as amended in Public Bill Committee;

(e) in Part 11, Clause 161 of the Bill as amended in Public Bill Committee;

(f) in Part 12, Clauses 192, 195 and 196 of the Bill as amended in Public Bill Committee;

(g) New Clause [Repeal of Part 4B of the Communications Act: transitional provision etc], if it has been added to the Bill, and New Schedule [Video-sharing platform services: transitional provision etc], if it has been added to the Bill.

Proceedings in Public Bill Committee on re-committal

(2) Proceedings in the Public Bill Committee on re-committal shall (so far as not previously concluded) be brought to a conclusion on Thursday 15 December 2022.

(3) The Public Bill Committee shall have leave to sit twice on the first day on which it meets.

Consideration following re-committal and Third Reading

(4) Proceedings on Consideration following re-committal shall (so far as not previously concluded) be brought to a conclusion one hour before the moment of interruption on the day on which those proceedings are commenced.

(5) Proceedings on Third Reading shall (so far as not previously concluded) be brought to a conclusion at the moment of interruption on that day.

(6) Standing Order No. 83B (Programming committees) shall not apply to proceedings on Consideration following re-committal.

I know that colleagues across the House have dedicated a huge amount of time to getting the Bill to this point, especially my predecessor, my right hon. Friend the Member for Mid Bedfordshire (Ms Dorries), who unfortunately could not be with us today. I thank everybody for their contributions through the pre-legislative scrutiny and passage and for their engagement with me since I took office. Since then, the Bill has been my No. 1 priority.

Dame Margaret Hodge (Barking) (Lab)

Does the right hon. Member not agree that it is regrettable that her junior Minister—the Under-Secretary of State for Digital, Culture, Media and Sport, the hon. Member for Sutton and Cheam (Paul Scully)—failed to acknowledge in his winding-up speech that there had been any contributions to the debate on Report from Labour Members?

Michelle Donelan

As the right hon. Member will note, the Minister had to stop at a certain point and he had spoken for 45 minutes in his opening remarks. I think that he gave a true reflection of many of the comments that were made tonight. The right hon. Member will also know that all the comments from Opposition Members are on the parliamentary record and were televised.

The sooner that we pass the Bill, the sooner we can start protecting children online. This is a groundbreaking piece of legislation that, as hon. Members have said, will need to evolve as technology changes.

Online Safety Bill

Margaret Hodge Excerpts
Damian Collins

This is a really important point that has sometimes been missed in the discussion on the Bill. There are very clear duties relating to illegal harm that companies must proactively identify and mitigate. The transparency requirements for other harmful content are very clear that companies must set out what their policies are. Enforcement action can be taken by the regulator for breach of their policies, but the primary objective is that companies make clear what their policies are. It is not a requirement for companies to remove legal speech if their policies do not allow that.

Dame Margaret Hodge (Barking) (Lab)

I welcome the Minister to his position, and it is wonderful to have somebody else who—like the previous Minister, the hon. Member for Croydon South (Chris Philp)—knows what he is talking about. On this issue, which is pretty key, I think it would work if minimum standards were set on the risk assessments that platforms have to make to judge what is legal but harmful content, but at the moment such minimum standards are not in the Bill. Could the Minister comment on that? Otherwise, there is a danger that platforms will set a risk assessment that allows really vile harmful but legal content to carry on appearing on their platform.

Damian Collins

The right hon. Lady makes a very important point. There have to be minimum safety standards, and I think that was also reflected in the report of the Joint Committee, which I chaired. Those minimum legal standards are set where the criminal law is set for these priority legal offences. A company may have higher terms of service—it may operate at a higher level—in which case it will be judged on the operation of its terms of service. However, for priority illegal content, it cannot have a code of practice that is below the legal threshold, and it would be in breach of the provisions if it did. For priority illegal offences, the minimum threshold is set by the law.

Dame Margaret Hodge

I understand that in relation to illegal harmful content, but I am talking about legal but harmful content. I understand that the Joint Committee that the hon. Member chaired recommended that for legal but harmful content, there should be minimum standards against which the platforms would be judged. I may have missed it, but I cannot see that in the Bill.

Damian Collins

The Joint Committee’s recommendation was for a restructuring of the Bill, so that rather than having general duty of care responsibilities that were not defined, we defined those responsibilities based on existing areas of law. The core principle behind the Bill is to take things that are illegal offline, and to regulate such things online based on the legal threshold. That is what the Bill does.

In schedule 7, which did not exist in the draft phase, we have written into the Bill a long list of offences in law. I expect that, as this regime is created, the House will insert more regulations and laws into schedule 7 as priority offences in law. Even if an offence in law is not listed in the priority illegal harms schedule, it can still be a non-priority harm, meaning that even if a company does not have to look for evidence of that offence proactively, it still has to act if it is made aware of the offence. I think the law gives us a very wide range of offences, clearly defined against offences in law, where there are clearly understood legal thresholds.

The question is: what is to be done about other content that may be harmful but sits below the threshold? The Government have made it clear that we intend to bring forward amendments that set out clear priorities for companies on the reporting of such harmful content, where we expect the companies to set out what their policies are. That will include setting out clearly their policies on things such as online abuse and harassment, the circulation of real or manufactured intimate images, content promoting self-harm, content promoting eating disorders or legal suicide content—this is content relating to adults—so the companies will have to be transparent on that point.

--- Later in debate ---
Chris Philp

In terms of content that is legal but potentially harmful, as the Bill is drafted, the platforms will have to set out their policies, but their policies can say whatever they like, as we discussed earlier. A policy could include actively promoting content that is harmful through algorithms, for commercial purposes. At the moment, the Bill as constructed gives them that freedom. I wonder whether that is an area that we can think about making slightly more prescriptive. Giving them the option to leave the content up there relates to the free speech point, and I accept that, but choosing to algorithmically promote it is slightly different. At the moment, they have the freedom to choose to algorithmically promote content that is toxic but falls just on the right side of legality. If they want to do that, that freedom is there, and I just wonder whether it should be. It is a difficult and complicated topic and we are not going to make progress on it today, but it might be worth giving it a little more thought.

I think I have probably spoken for long enough on this Bill, not just today but over the last few months. I broadly welcome these amendments but I am sure that, as the Bill completes its stages, in the other place as well, there will be opportunities to slightly fine-tune it that all of us can make a contribution to.

Dame Margaret Hodge

First, congratulations to the Under-Secretary of State for Digital, Culture, Media and Sport, the hon. Member for Folkestone and Hythe (Damian Collins). I think his is one of the very few appointments in these latest shenanigans that is based on expertise and ability. I really welcome him, and the work he has done on the Bill this week has been terrific. I also thank the hon. Member for Croydon South (Chris Philp). When he held the position, he was open to discussion and he accepted a lot of ideas from many of us across the House. As a result, I think we have a better Bill before us today than we would have had. My gratitude goes to him as well.

I support much of the Bill, and its aim of making the UK the safest place to be online is one that we all share. I support the systems-based approach and the role of Ofcom. I support holding the platforms to account and the importance of protecting children. I also welcome the cross-party work that we have done as Back Benchers, and the roles played by both Ministers and by the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright). I thank him for his openness and his willingness to talk to us. Important amendments have been agreed on fraudulent advertising, bringing forward direct liability so there is not a two-year wait, and epilepsy trolling—my hon. Friend the Member for Batley and Spen (Kim Leadbeater) promoted that amendment.

I also welcome the commitment to bring forward amendments in the Lords relating to the amendments tabled by the hon. Member for Brigg and Goole (Andrew Percy) and the right hon. and learned Member for Kenilworth and Southam—I think those amendments are on the amendment paper but it is difficult to tell. It is important that the onus on platforms to be subject to regulation should be based not on size and functionality but on risk of harm. I look forward to seeing those amendments when they come back from the other place. We all know that the smallest platforms can present the greatest risk. The killing of 51 people in the mosques in Christchurch, New Zealand is probably the most egregious example, as the individual concerned had been on 8chan before committing that crime.

I am speaking to amendments 156 and 157 in my name and in the names of other hon. and right hon. Members. These amendments would address the issue of anonymous abuse. I think we all accept that anonymity is hugely important, particularly to vulnerable groups such as victims of domestic violence, victims of child abuse and whistleblowers. We want to retain anonymity for a whole range of groups and, in framing these amendments, I was very conscious of our total commitment to doing so.

Equally, freedom of speech is very important, as the right hon. Member for Haltemprice and Howden (Mr Davis) said, but freedom of speech has never meant freedom to harm, which is not a right this House should promote. It is difficult to define, and it is difficult to get the parameters correct, but we should not think that freedom of speech is an absolute right without constraints.

Joanna Cherry

I agree with the right hon. Lady that freedom of speech is not absolute. As set out in article 10 of the European convention on human rights, there have to be checks and balances. Nevertheless, does she agree freedom of speech is an important right that this House should promote, with the checks and balances set out in article 10 of the ECHR?

Dame Margaret Hodge

Absolutely. I very much welcome the hon. and learned Lady’s amendment, which clarifies the parameters under which freedom of speech can be protected and promoted.

Equally, freedom of speech does not mean freedom from consequences. The police and other enforcement agencies can pursue unlawful abuse, assuming they have the resources, which we have not discussed this afternoon. I know the platforms have committed to providing the finance for such resources, but I still question whether the resources are there.

The problem with the Bill and the Government amendments, particularly Government amendment 70, is that they weaken the platforms’ duty on legal but harmful abuse. Such abuse is mainly anonymous and the abusers are clever. They do not break the law; they avoid the law with the language they use. It might be best if I give an example. People do not say, in an antisemitic way, “I am going to kill all Jews.” We will not necessarily find that online, but we might find, “I am going to harm all globalists.” That is legal but harmful and has the same intent. We should think about that, without being beguiled by the absolute right to freedom of speech that I am afraid the right hon. Member for Haltemprice and Howden is promoting, otherwise we will find that the Bill does not meet the purposes we all want.

Much of the abuse is anonymous. We do not know how much, but much of it is. When there was racist abuse at the Euros, Twitter claimed that 99% of postings of racist abuse were identifiable. Like the Minister, I wrote to Twitter to challenge that claim and found that Twitter was not willing to share its data with me, claiming GDPR constraints.

It is interesting that, in recent days, the papers have said that one reason Elon Musk has given for pulling out of his takeover is that he doubts Twitter’s claim that fake and spam accounts represent less than 5% of users. There is a lack of understanding and knowledge of the extent of anonymous abuse.

In the case I have shared with the Minister on other occasions, I received 90,000 posts in the two months from the publication of the Equality and Human Rights Commission report to the shenanigans about the position of the previous leader of the Labour party—from October to Christmas. The posts were monitored for me by the Community Security Trust. When I asked how many of the posts were anonymous, I was told that it had been unable to do that analysis. I wish there were the resources to do so, but I think most of the posts were anonymous and abusive.

There is certainly public support for trying to tackle abusive posts. A June 2021 YouGov poll found that 78% of the public are in favour of revealing the identity of those who post online, and we should bear that in mind. If people feel strongly about this, and the poll suggests that they do, we should respond and not put it to one side.

The Government have tried to tackle this with a compromise following the very good work by the hon. Member for Stroud (Siobhan Baillie). The Bill places a duty on the platforms to give users the option to verify their identity. If a user chooses to remain unverified, they may not be able to interact with verified accounts. Although I support the motives behind that amendment, I have concerns.

First, the platform itself would have to verify who holds the account, which gives the platforms unprecedented access to personal details. Following Cambridge Analytica, we know how such data can be abused. Data on 87 million identities was stolen, and we know it was used to influence the Trump election in 2016, and it may have been a factor in the Brexit referendum.

Secondly, the police have been very clear on how I should deal with anonymous online abuse. They say that the last thing I should do is remove it, as they need it to be able to judge whether there is a real threat within the abuse that they should take seriously. So individuals having that right does not diminish the real harm they could face if the online abuse is removed.

Thirdly, one of the problems with a lot of online abuse is not just that it is horrible or can be dangerous in particular circumstances, but that it prevents democracy. It inhibits freedom of speech by inhibiting engagement in free, democratic discourse. Online abuse is used to undermine an individual’s credibility. A lot of the abuse I receive seeks to undermine my credibility. It says that I am a bad woman, that I abuse children, that I break tax law and that I do this, that and the other. Building that picture of me as someone who cannot be believed undermines my ability to enter into legitimate democratic debate on issues I care about. Simply removing anonymous online abuse from my account does not stop the circulation of abusive, misleading content that undermines my democratic right to free speech. Therefore, in its own way, it undermines free speech.

Amendments 156 and 157, in my name and in the name of other colleagues, are based on a strong commitment to protecting anonymity, especially for vulnerable groups. We seek to tackle anonymous abuse not by denying anonymity but by ensuring traceability. It is quite simple. The Government recognise the feasibility and importance of that with age verification; they have now accepted the argument on age verification, and I urge them to take it further. Although I have heard that various groups are hostile to what we are suggesting, in a meeting I held last week with HOPE not hate there was agreement that what we are proposing made sense, and therefore we and the Government should pursue it.

Online Safety Bill

Margaret Hodge Excerpts
2nd reading
Tuesday 19th April 2022

Commons Chamber
Margaret Hodge Portrait Dame Margaret Hodge (Barking) (Lab)
- Hansard - -

Thank you, Madam Deputy Speaker. I hope that I will take only three minutes.

The human cost of abuse on the internet is unquantifiable—from self-harm to suicide, grooming to child abuse, and racism to misogyny. A space we thought gave the unheard a legitimate voice has become a space where too many feel forced to stay offline. As a Jewish female politician online, I have seen my identities perversely tied together to discredit my character and therefore silence my voice. I am regularly accused of being a “Zionist hag”, a “paedophile” and a “Nazi”. But this is not just about politicians. We all remember the tsunami of racism following the Euros, and we know women are targeted more online than men.

Social media firms will not tackle this because their business model encourages harmful content. Nasty content attracts more traffic; more traffic brings more advertising revenue; and more revenue means bigger profits. Legislation is necessary to make the social media firms act. However, this Bill will simply gather dust if Ofcom and the police remain underfunded. The “polluter pays” principle—that is, securing funding through a levy on the platforms—would be much fairer than taxpayers picking up the bill for corporate failures.

I cherish anonymity for whistleblowers and domestic violence victims—it is vital—but when it is used as a cloak to harm others, it should be challenged. The Government’s halfway measure allows users to choose to block anonymous posts by verifying their own identity. That ignores police advice not to block abusive accounts, as those accounts help to identify genuine threats to individuals, and it ignores the danger of giving platforms the power to verify identities; we should remember the Cambridge Analytica scandal. Surely a third party with experience in unique identification should carry out checks on users instead. Then we would all remain anonymous to the platforms, but could be traced by law enforcement if we posted illegal or harmful abuse. Offenders could then be named and shamed.

On director liability: fines against platforms become just another business cost and will not change behaviour, whereas personal liability is a powerful deterrent. However, enforcing that liability only when a platform fails to supply information to Ofcom is feeble. Directors must be made liable for breaching the safety duties themselves.

Finally, as others have said, most regulations apply only to category 1 platforms. Search engines fall through the cracks, and BitChute, Gab and 4chan all escape; yet, as we saw in the attacks on Pittsburgh’s synagogue and Christchurch’s mosques, platforms like these helped to foster those atrocities. Regulation must be based on risk, not size. Safety should be embedded in any innovative product, so concern about over-regulating innovation is misplaced. This is the beginning of a generational change. I am grateful to Ministers, because I do think they have listened. If they continue to listen, we can make Britain the safest place to be online.

Draft Online Safety Bill Report

Margaret Hodge Excerpts
Thursday 13th January 2022

Commons Chamber
Margaret Hodge Portrait Dame Margaret Hodge (Barking) (Lab)
- Parliament Live - Hansard - -

I congratulate the hon. Member for Folkestone and Hythe (Damian Collins) and the members of his Committee on bringing forward an incredibly thorough and very good report. I know Ministers have been consulting well with all Back Benchers, and I hope they do not pay lip service to the report’s conclusions, but really take on its important recommendations. What is interesting about this whole debate is that there is a broad consensus on the Back Benches. None of us are bound by ideology on these issues; our approach is based on our experience, the data and the wide body of research.

I will also say at the beginning that the business model of the platforms means that they will never tackle this themselves. They make their money by encouraging traffic on their platforms, and they encourage traffic by allowing abusive content to remain there. Their algorithms, if anything, amplify and encourage more abusive content. The idea that self-regulation can have any place in the legislation the Government propose is false.

I will draw attention to three sets of issues in the short time available to me. The first, the recommendations on paid-for scams and frauds, has already been discussed. It is ridiculous that user-generated content can be subject to regulation but that paid-for scams and frauds cannot be. Everybody who gave evidence to the Committee, including the Financial Conduct Authority, pleaded for their inclusion. The figure I have is from Action Fraud: 85% of the £1.7 billion lost to fraudulent scams in the past year resulted from cyber-enabled frauds, and during the pandemic that figure of course exploded. Again, there is no incentive for the platforms to do anything about this: they are paid for the advertisements, so they wish to encourage them. Indeed, they enjoy a double benefit in this particular space, because the FCA also pays them to prioritise legitimate websites over the scam ads. Self-regulation will not work. I know that Ministers support the proposal, and I hope that they are not swayed by advice that it is not legally possible, which I simply do not accept, and that they do not miss this opportunity in return for promises of legislation down the line.

Stephen Timms Portrait Stephen Timms (East Ham) (Lab)
- Parliament Live - Hansard - - - Excerpts

I very much agree with the point my right hon. Friend is making and with the recommendation in the report. I wonder whether she noticed that the Prime Minister told the Liaison Committee in July that

“one of the key objectives of the Online Safety Bill is to tackle online fraud.”

Does she agree that it cannot possibly do that if it misses out scam adverts?

Margaret Hodge Portrait Dame Margaret Hodge
- Parliament Live - Hansard - -

I completely agree with my right hon. Friend on that, and I hope that the Minister will confirm that he will include this in the legislation.

The second issue I wish to raise relates to anonymity. No one wants to undermine anonymity—we all recognise that it is crucial for whistleblowers, for victims of domestic violence or child abuse, and for others—but we do want to tackle anonymous abuse. Sadly, most of the vile abuse that appears online is anonymous, as we have seen in the spreading of disinformation, particularly in relation to the pandemic. I have seen it in my own experience, and it really undermines my right to participate in democratic debate. If people paint someone online as a terrible person, a hypocrite or a hateful, wicked woman, which is what they do with me, that person is no longer trusted on anything, and their voice is shut out of democratic debate.

What we are all after is not tackling anonymity but ensuring third-party verification of people’s identities, so that they can be traced if and when they put abusive content online. The Law Commission’s proposal to introduce a new offence to tackle serious online harms more effectively, which one of the four former Culture Secretaries who have worked on this issue has diligently pursued, is very important. It shifts the focus from the content itself to the effects of the online harm.

My third point relates to director liability. All my experience of working to tackle illicit finance and economic crime demonstrates that if we do not make directors personally liable when wrongdoing occurs through the actions of individuals associated with a company, we will not change the behaviour of those companies. Even fines of £50 million are not significant against Facebook’s gross revenue of more than £29 billion. I do not understand why we have to wait two years to implement director liability when it could be done immediately, and I would be grateful if the Minister said that he will do so.

The last thing I should say, in my final seconds, is on anonymity. I would like the Minister simply to confirm this afternoon whether he will tackle anonymous abuse by putting in place the Law Commission’s proposals, and what the timeframe for that is. I very much welcome the report and commend all those who worked so hard to put it together, and I hope we can make swift progress on a problem that is growing in British society and that is undermining, not supporting, democracy.

Online Anonymity and Anonymous Abuse

Margaret Hodge Excerpts
Wednesday 24th March 2021

Commons Chamber
Margaret Hodge Portrait Dame Margaret Hodge (Barking) (Lab) [V]
- Hansard - -

I congratulate the hon. Member for Stroud (Siobhan Baillie) on securing this debate.

Legislating on online harms gives us a vital opportunity to call a halt to the extremism, misinformation and avalanche of harmful abuse that have become commonplace on social media. Whether on big platforms such as Twitter or fringe platforms such as Telegram, harmful content is now all-pervasive. Recently, another tsunami of racist abuse was directed at the footballers Marcus Rashford, Lauren James and Anthony Martial. Sometimes the perpetrators can be identified, but too often those responsible do not reveal who they are. In the past, we argued that online anonymity supported open democratic debate; I am now convinced that anonymity encourages online harm that is not just hateful in itself but is used to spread lies about individuals, undermining their credibility and so shutting down their voices. Far from nurturing democratic debate, anonymity undermines democracy.

My work challenging Jew-hate reached a climax last autumn with the publication of the Equality and Human Rights Commission report into antisemitism in the Labour party. The Community Security Trust found that my public comments at that time led to 90,000 mentions on social media. The vast majority were abusive, racist and misogynistic.

Let me share just a few; some are very offensive.

“I hope she dies soon. Dumb bitch”;

“nothing but a couple of shit-stirring…cum buckets, bought and paid for by Israel.”

I was told I was a “Mossad agent”, a “Zionist stooge” and a wrinkly “pedo-lover”. “Traitor.” “Snake.” “Rat.” “Shill.” “Nazi.” This abuse is aggressive and harmful, yet sometimes I have no idea who said it.

Ending anonymity for those who promulgate hate or harm is key to effectively combating it. We must compel social media companies to be able to identify all users. We know that is easily done. Take the online payment company PayPal. Everyone using PayPal must provide their identity when setting up an account. Users’ identity is not public, but it can be traced if required. If social media companies acted similarly, those who use online anonymity for good, such as whistleblowers, or victims of child abuse or domestic abuse, could continue to do so, but those who use anonymity to spread harmful content would be identifiable, and could be dealt with by the appropriate authorities. Knowing that would, at a stroke—

Nigel Evans Portrait Mr Deputy Speaker (Mr Nigel Evans)
- Hansard - - - Excerpts

I am sorry; we will have to leave it there. Time is up.

Covid-19: Cultural and Entertainment Sectors

Margaret Hodge Excerpts
Tuesday 2nd March 2021

Commons Chamber
Margaret Hodge Portrait Dame Margaret Hodge (Barking) (Lab) [V]
- Hansard - -

I am proud to declare an interest—not a financial one, but a passionate one. I chair Theatre Royal Stratford East, the erstwhile home of Joan Littlewood, renowned for “Oh, What a Lovely War!”, “Fings Ain’t Wot They Used T’Be” and “A Taste of Honey”. I am immensely proud of our success in regaining our historic reputation for excellence and radicalism under the leadership of Nadia Fall, a hugely talented artistic director of Asian heritage, and her team of mainly women theatre makers.

We were on a roll, culminating in an Olivier award for our staging of Britten’s “Noye’s Fludde”, which involved East End children performing alongside ENO singers. Our mission to create excellent shows and to reflect the diversity of our community in everything we do makes our contribution unique. Then covid erupted and the curtain fell.

Theatres have proved resilient and innovative. We produced an outdoor show called “846”, our response to George Floyd’s death. National Theatre Live has been enjoyed by vast audiences at home. The Kiln’s food programme provides fresh hot meals for hundreds. Battersea Arts Centre delivered digital activity and encouraged young people to keep writing.

Government support has focused too much on buildings, not on people. Life for freelancers, the lifeblood of our theatre, has been grim. We used to employ nearly 200 freelancers annually; this year it is 75, mostly on very small projects. With no Government support, freelance actors, directors and designers are walking away, retraining to secure a living. We are haemorrhaging creative talent, most of whom started in the subsidised theatre.

Public investment in people has led to creative wealth for the nation. Think of Sunday’s Golden Globe Awards: Daniel Kaluuya, who first performed at the Royal Court, and John Boyega, who began at Theatre Peckham. Think of Phoebe Waller-Bridge, who started at the Soho; James Graham, playwright at the Finborough Theatre; and Michaela Coel, who went from The Yard to critical acclaim on Channel Four. All are big commercial successes today. All are contributing to our vital creative economy, to the vibrancy of our city centres and to lifting our spirits. They are part of a massively successful ecosystem, in which public investment drives both commercial success and the quest for diversity and equality. Yet young black and Asian creatives, women and those with disabilities are leaving theatre in droves. Nobody wants theatre to return to being a club for the elite and the well-connected. Investment in people, in the talent of tomorrow, must be our key ask today; only then will the arts bounce back strongly.

Eleanor Laing Portrait Madam Deputy Speaker (Dame Eleanor Laing)
- Hansard - - - Excerpts

I well remember the right hon. Lady’s theatre and its excellent director, who happened to be my namesake.

Online Harms Consultation

Margaret Hodge Excerpts
Tuesday 15th December 2020

Commons Chamber
Oliver Dowden Portrait Oliver Dowden
- Hansard - - - Excerpts

Sadly that will not be addressed by this legislation, Mr Speaker. [Interruption.] Not that I could—I believe that is a matter for the House.

My hon. Friend makes a very important point about antisemitic abuse. I have met organisations about that in framing the legislation. Most antisemitism is illegal and should be addressed through the provisions made for illegality. Beyond that, we will be setting out, as a priority, harms to be addressed through this legislation.

Margaret Hodge Portrait Dame Margaret Hodge (Barking) (Lab) [V]
- Hansard - -

I, too, welcome this statement. In the past two months, the Community Security Trust has identified 90,000 posts mentioning me. Most were hostile, antisemitic, misogynistic and ageist. Many were anonymous and, through disinformation, aimed to undermine my credibility and so silence me. I would ask the Secretary of State to think again. Does he not agree that anonymity on social media can no longer be universally protected, although it should be protected for groups such as whistleblowers and victims of domestic violence? Will he not agree that where users post illegal content or harmful abuse, social media companies should be required to collect and pass on information on the identity of the user to regulatory bodies and to the police?

Oliver Dowden Portrait Oliver Dowden
- Hansard - - - Excerpts

The right hon. Lady raises a very important point. As a Member of Parliament who proudly represents a very large Jewish community, I know the challenges of antisemitism, and that has been at the front of my mind in framing this legislation. It is a challenging area, this point about anonymity. Of course, if there is criminal conduct that the police and law enforcement agencies are investigating, they have ways of dealing with that anonymity in order to bring criminal cases. The reluctance I have had, and the Government have had, to introduce provision across the board is about how we lift the veil of anonymity while at the same time protecting some very vulnerable people who rely on it. But of course we will continue to keep it under review.