Online Safety Bill Debate
Baroness Hodge of Barking (Labour - Life peer)
Commons Chamber

I welcome the Minister to his position, and it is wonderful to have somebody else who—like the previous Minister, the hon. Member for Croydon South (Chris Philp)—knows what he is talking about. On this issue, which is pretty key, I think it would work if minimum standards were set for the risk assessments that platforms have to make to judge what is legal but harmful content, but at the moment such minimum standards are not in the Bill. Could the Minister comment on that? Otherwise, there is a danger that platforms will set a risk assessment that allows really vile harmful but legal content to carry on appearing on their platforms.
The right hon. Lady makes a very important point. There have to be minimum safety standards, and I think that was also reflected in the report of the Joint Committee, which I chaired. Those minimum legal standards are set where the criminal law is set for these priority legal offences. A company may have higher terms of service—it may operate at a higher level—in which case it will be judged on the operation of its terms of service. However, for priority illegal content, it cannot have a code of practice that is below the legal threshold, and it would be in breach of the provisions if it did. For priority illegal offences, the minimum threshold is set by the law.
I understand that in relation to illegal harmful content, but I am talking about legal but harmful content. I understand that the Joint Committee that the hon. Member chaired recommended that for legal but harmful content, there should be minimum standards against which the platforms would be judged. I may have missed it, but I cannot see that in the Bill.
The Joint Committee’s recommendation was for a restructuring of the Bill, so that rather than having general duty of care responsibilities that were not defined, we defined those responsibilities based on existing areas of law. The core principle behind the Bill is to take things that are illegal offline, and to regulate such things online based on the legal threshold. That is what the Bill does.
In schedule 7, which did not exist at the draft stage, we have written into the Bill a long list of offences in law. I expect that, as this regime is created, the House will insert more offences into schedule 7 as priority offences. Even if an offence in law is not listed in the priority illegal harms schedule, it can still be a non-priority harm, meaning that even if a company does not have to look proactively for evidence of that offence, it still has to act if it is made aware of it. I think that gives us a very wide range of offences, clearly defined in law, with clearly understood legal thresholds.
The question is: what is to be done about other content that may be harmful but sits below the threshold? The Government have made it clear that we intend to bring forward amendments that set out clear priorities for companies on the reporting of such harmful content, where we expect the companies to set out what their policies are. That will include setting out clearly their policies on things such as online abuse and harassment, the circulation of real or manufactured intimate images, content promoting self-harm, content promoting eating disorders or legal suicide content—this is content relating to adults—so the companies will have to be transparent on that point.
In terms of content that is legal but potentially harmful, as the Bill is drafted, the platforms will have to set out their policies, but their policies can say whatever they like, as we discussed earlier. A policy could include actively promoting content that is harmful through algorithms, for commercial purposes. At the moment, the Bill as constructed gives them that freedom. I wonder whether that is an area that we can think about making slightly more prescriptive. Giving them the option to leave the content up there relates to the free speech point, and I accept that, but choosing to algorithmically promote it is slightly different. At the moment, they have the freedom to choose to algorithmically promote content that is toxic but falls just on the right side of legality. If they want to do that, that freedom is there, and I just wonder whether it should be. It is a difficult and complicated topic and we are not going to make progress on it today, but it might be worth giving it a little more thought.
I think I have probably spoken for long enough on this Bill, not just today but over the last few months. I broadly welcome these amendments, but I am sure that, as the Bill completes its stages, in the other place as well, there will be opportunities, to which all of us can contribute, to fine-tune it slightly.
First, congratulations to the Under-Secretary of State for Digital, Culture, Media and Sport, the hon. Member for Folkestone and Hythe (Damian Collins). I think his is one of the very few appointments in these latest shenanigans that is based on expertise and ability. I really welcome him, and the work he has done on the Bill this week has been terrific. I also thank the hon. Member for Croydon South (Chris Philp). When he held the position, he was open to discussion and he accepted a lot of ideas from many of us across the House. As a result, I think we have a better Bill before us today than we would have had. My gratitude goes to him as well.
I support much of the Bill, and its aim of making the UK the safest place to be online is one that we all share. I support the systems-based approach and the role of Ofcom. I support holding the platforms to account and the importance of protecting children. I also welcome the cross-party work that we have done as Back Benchers, and the roles played by both Ministers and by the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright). I thank him for his openness and his willingness to talk to us. Important amendments have been agreed on fraudulent advertising, bringing forward direct liability so there is not a two-year wait, and epilepsy trolling—my hon. Friend the Member for Batley and Spen (Kim Leadbeater) promoted that amendment.
I also welcome the commitment to bring forward amendments in the Lords relating to the amendments tabled by the hon. Member for Brigg and Goole (Andrew Percy) and the right hon. and learned Member for Kenilworth and Southam—I think those amendments are on the amendment paper, but it is difficult to tell. It is important that the onus on platforms to be subject to regulation should be based not on size and functionality but on risk of harm. I look forward to seeing those amendments when they come back from the other place. We all know that the smallest platforms can present the greatest risk. The killing of 51 people in the mosques in Christchurch, New Zealand, is probably the most egregious example, as the individual concerned had been on 8chan before committing that crime.
I am speaking to amendments 156 and 157 in my name and in the names of other hon. and right hon. Members. These amendments would address the issue of anonymous abuse. I think we all accept that anonymity is hugely important, particularly to vulnerable groups such as victims of domestic violence, victims of child abuse and whistleblowers. We want to retain anonymity for a whole range of groups and, in framing these amendments, I was very conscious of our total commitment to doing so.
Equally, freedom of speech is very important, as the right hon. Member for Haltemprice and Howden (Mr Davis) said, but freedom of speech has never meant freedom to harm, which is not a right this House should promote. It is difficult to define, and it is difficult to get the parameters correct, but we should not think that freedom of speech is an absolute right without constraints.
I agree with the right hon. Lady that freedom of speech is not absolute. As set out in article 10 of the European convention on human rights, there have to be checks and balances. Nevertheless, does she agree that freedom of speech is an important right that this House should promote, with the checks and balances set out in article 10 of the ECHR?
Absolutely. I very much welcome the hon. and learned Lady’s amendment, which clarifies the parameters under which freedom of speech can be protected and promoted.
Equally, freedom of speech does not mean freedom from consequences. The police and other enforcement agencies can pursue unlawful abuse, assuming they have the resources, which we have not discussed this afternoon. I know the platforms have committed to providing the finance for such resources, but I still question whether the resources are there.
The problem with the Bill and the Government amendments, particularly Government amendment 70, is that they weaken the platforms’ duty on legal but harmful abuse. Such abuse is mainly anonymous and the abusers are clever. They do not break the law; they avoid the law with the language they use. It might be best if I give an example. People do not say, in an antisemitic way, “I am going to kill all Jews.” We will not necessarily find that online, but we might find, “I am going to harm all globalists.” That is legal but harmful and has the same intent. We should think about that, without being beguiled by the absolute right to freedom of speech that I am afraid the right hon. Member for Haltemprice and Howden is promoting, otherwise we will find that the Bill does not meet the purposes we all want.
Much of the abuse is anonymous. We do not know how much, but much of it is. When there was racist abuse at the Euros, Twitter claimed that 99% of postings of racist abuse were identifiable. Like the Minister, I wrote to Twitter to challenge that claim and found that Twitter was not willing to share its data with me, claiming GDPR constraints.
It is interesting that, in recent days, the papers have said that one reason Elon Musk has given for pulling out of his takeover is that he doubts Twitter’s claim that fake and spam accounts represent less than 5% of users. There is a lack of understanding and knowledge of the extent of anonymous abuse.
In the case I have shared with the Minister on other occasions, I received 90,000 posts in the two months from the publication of the Equality and Human Rights Commission report to the shenanigans about the position of the previous leader of the Labour party—from October to Christmas. The posts were monitored for me by the Community Security Trust. When I asked how many of the posts were anonymous, I was told that the trust had been unable to do that analysis. I wish there were the resources to do so, but I think most of the posts were anonymous and abusive.
There is certainly public support for trying to tackle abusive posts. A June 2021 YouGov poll found that 78% of the public are in favour of revealing the identity of those who post online, and we should bear that in mind. If people feel strongly about this, and the poll suggests that they do, we should respond and not put it to one side.
The Government have tried to tackle this with a compromise following the very good work by the hon. Member for Stroud (Siobhan Baillie). The Bill places a duty on the platforms to give users the option to verify their identity. If a user chooses to remain unverified, they may not be able to interact with verified accounts. Although I support the motives behind that amendment, I have concerns.
First, the platform itself would have to verify who holds the account, which gives the platforms unprecedented access to personal details. Following Cambridge Analytica, we know how such data can be abused: data on 87 million identities was stolen, and we know it was used to influence the Trump election in 2016 and may have been a factor in the Brexit referendum.
Secondly, the police have been very clear on how I should deal with anonymous online abuse. They say that the last thing I should do is remove it, as they need it to be able to judge whether there is a real threat within the abuse that they should take seriously. So individuals having that right does not diminish the real harm they could face: if the online abuse is removed, the police lose the evidence they need to judge the threat.
Thirdly, one of the problems with a lot of online abuse is not just that it is horrible or can be dangerous in particular circumstances, but that it undermines democracy. It inhibits freedom of speech by inhibiting engagement in free, democratic discourse. Online abuse is used to undermine an individual's credibility. A lot of the abuse I receive seeks to undermine my credibility. It says that I am a bad woman, that I abuse children, that I break tax law and that I do this, that and the other. Building that picture of me as someone who cannot be believed undermines my ability to enter into legitimate democratic debate on issues I care about. Simply removing anonymous online abuse from my account does not stop the circulation of abusive, misleading content that undermines my democratic right to free speech. Therefore, in its own way, such abuse itself undermines free speech.
Amendments 156 and 157, in my name and in the names of other colleagues, are based on a strong commitment to protecting anonymity, especially for vulnerable groups. We seek to tackle anonymous abuse not by denying anonymity but by ensuring traceability. It is quite simple. The Government recognised the feasibility and importance of that approach when they accepted the argument on age verification, and I urge them to take it further. Although I have heard that various groups are hostile to what we are suggesting, in a meeting I held last week with HOPE not hate there was agreement that what we are proposing made sense, and therefore we and the Government should pursue it.