Online Abuse Debate
Liam Byrne (Labour - Birmingham Hodge Hill and Solihull North)
Debate with the Department for Digital, Culture, Media & Sport
(5 years, 7 months ago)
Westminster HallWestminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.
Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.
This information is provided by Parallel Parliament and does not comprise part of the official record
It is a pleasure to serve under your chairmanship, Ms Ryan. I, too, congratulate Katie Price and her family on bringing forward the petition. I pay tribute to my hon. Friend the Member for Warrington North (Helen Jones) for an outstanding speech to introduce the debate. It was brilliant because it was based on a thorough analysis of the petition. It is good to see the Petitions Committee working in exactly the way that it should.
I do not want to say too much, because our position on how to tackle this problem has been rehearsed with the Minister a number of times over the last year and a half, but there are three or four things that I want to put on the record. First, it is worth remembering that the scale of abuse is staggering. Three quarters of people with learning disabilities and autism say that they have been victims of hate crime. That is a comprehensive failure as a society and a country to keep our neighbours safe. God knows what sacrifices we have made over the last 50 or 60 years in the defence of democracy and free speech. We live in a country where some of our neighbours are hounded out of those privileges; we have to look at ourselves and conclude that we have so much more to do.
The policing environment for online hate is failing comprehensively. There is a very old concept in policing known as keeping the Queen’s peace. Online, the Queen’s peace is simply not observed. I disagree slightly with the right hon. Member for Arundel and South Downs (Nick Herbert) because it is simply inconceivable ever to expect a police force to police this waterfront. Some time ago, people started producing memes of what goes up online every 60 seconds. As far back as 2017, the statistics were half a million tweets, 500 hours of video and 3.3 million Facebook posts. There is no way any police force on earth will police that waterfront and keep it safe and sound to protect and preserve the Queen’s peace throughout that space. Therefore, we have to put the onus back on some of the most profitable companies on earth.
In the last reported quarter, Facebook made something like £5 billion of net earnings. That means that in the course of this debate, it will have made more than £3 million of profit. It is one of the biggest and most valuable companies on earth, yet it gets away with supporting—not orchestrating or colluding in, but certainly enabling—the abuse of fellow citizens of our society. The time has to come when we say to the wealthiest titans on earth, “Enough is enough.”
The right hon. Gentleman should not traduce what I said. I was quite clear that action needed to be taken across the board, and that social media companies had to accept responsibility. I did not say or seek to imply that the police could police the range of abusive comments across social media. Where they trespass into the criminal, law enforcement agencies do have a responsibility to act, and we need to ensure that they are capable of doing so.
I am grateful for that because I believe we are on the same page. I agree with the right hon. Gentleman that the police forces in this country will need to be radically reconfigured. The time when a police constable might turn up to a burglary and advise how to target harden the home should be about to go, because the cyber-security of the property and the family in question will often be much more important. At the moment, however, in Birmingham we cannot get police to investigate even violent abuses because there are no police—they have been cut in the west midlands to the smallest number since the force was created in 1974. That is a debate for another day.
Four significant changes need to happen to the online regulatory and policing environment. I think the Government have accepted the first: there needs to be a duty of care on social media companies. The concept of duty of care is quite well established in law. Its legal tradition goes back to the early 1970s and it is tried and tested. If I went out and built a stadium here in London and filled it full of people, there would be all kinds of rules and regulations that would ensure that I kept those people safe. If I went out and built a similar online stadium and filled it with all kinds of nonsense, no such regulations would bite on me. That has to change. We have to ask these firms to identify the harms their services and products might cause and to do something about them, and we have to hold them to account for that.
The second idea is much tighter regulation of hate speech, which the Government have not yet accepted the need to look into. We have raised a number of times in debates like this the approach taken by the Ministry of Justice in Germany. Its Network Enforcement Act—or NetzDG law for short—has created a much more effective policing environment for tackling online hate speech, and it has done so in a way that keeps Germany well within its Council of Europe obligations on protecting free speech. It is time we looked at that because, as the report that has come through from the German Ministry of Justice shows, it is beginning to work.
I am told that something like one in seven Facebook moderators now works in Germany. Google, Twitter, Facebook and YouTube have had to take down a significant amount of hateful material. Looking across the Council of Europe space at the countries that are signatories to the European convention on human rights, which includes the protection of free speech, it appears that Germany is leading the way in creating an effective policing environment to tackle hate speech. Surely, it is time for the Government to look at that a little harder.
The third thing we need is a different kind of regulator. Again, I think the Government have accepted that. There are something like nine different regulators with some kind of regulatory, policing or oversight powers in the internet space. That is too many. We are not saying they need to be boiled down to one, but that number needs to be closer to one than to nine. That means we have to overhaul the regulators, so we are looking forward to seeing a new Bill whenever we see the Queen’s Speech and a new legislative programme for the next Session.
The final change we need, which is more long term, is a bill of digital rights for the 21st century. The reality is that the online world is going to be regulated, re-regulated and re-regulated again over the course of this century. It is therefore important that we set down some first principles that provide something of a north star to guide us and give companies a bit more predictability as we navigate the changes ahead. At the core of that bill of digital rights should be the right to universal digital literacy. Ultimately, as a country, we are all going to have to become more digitally literate so we can start putting back in place some of the norms and boundaries of the civilised discourse that once were the hallmark of democracy in this country.
That is an excellent suggestion. I am happy to put that to my hon. Friend the Minister for Sport, and if the hon. Lady and the hon. Member for Warrington North, who chairs the Petitions Committee, would like to attend that meeting, we will set that up. Yes, we will definitely invite all football authorities to that meeting.
The hon. Member for Warrington North also talked about the effect on moderators. Thousands of people are now employed by tech companies to moderate content and make decisions on whether it crosses the threshold and should be taken down. We are looking more and more to systems of artificial intelligence to do as much of that job as possible, precisely for the reasons she set out. It is a horrendous job to do, and I imagine that over time it ends up affecting the moderators’ mental health. On a positive note, 75% of the 4 million videos that YouTube has taken down in, I think, the past six months were identified and removed via artificial intelligence. That does offer us some hope for the future.
The Minister is being generous. The only danger with introducing such statistics, which all the social media companies are desperate to put into our hands, is that it creates the impression that somehow they are doing enough when they are not. We will never get to a solution to this problem by relying on voluntary action. That is why the law needs to change, and enforcement needs to change.
I certainly agree with the right hon. Gentleman. I am sorry if I gave that impression; I wanted to offer up some hope that over time more and more solutions for removal will be technological so that moderators, who have a terrible job to do, do not have to spend their working lives wading through this horrendous content. To clarify, that is absolutely not at all to say that companies are doing enough. They are doing more, but it is by no means enough as yet.
I see that we have had a change of Chair. It is a pleasure to serve under your chairmanship as well, Mr Austin.
Coming back to the point made by my right hon. Friend the Member for Arundel and South Downs, we intend that the new system of regulation will take some of the burden off the police and place it on to the tech companies. Those companies should be accountable for taking care of their users by eliminating such content, hopefully before it comes online but certainly very swiftly after it is reported.
The law in Germany, which the shadow Minister referred to, requires content to be taken down within 24 hours of companies knowing about it; if it is later than that, swingeing fines can be applied. We want to create an environment in which companies deal with matters themselves and use less and less of our valuable policing time for the privilege.
As I mentioned earlier, we have committed to developing a media literacy strategy—one of the proposals made by Glitch—to ensure that we have a co-ordinated and strategic approach to online media literacy education. We have published a statutory code of practice for social media providers about dealing with harmful conduct, and we have consulted on the draft code with a variety of stakeholders, including people with disabilities. The code includes guidance on the importance of social media platforms having clear, accessible reporting processes and accessible information on their terms and conditions, highlighting the importance of consulting users when designing new software, new apps and new safety policies.
There has been some discussion about whether the law itself is adequate, particularly with regard to hate crime. I will say a few words about the Law Commission’s review. In February last year the Prime Minister announced that the Law Commission would undertake a review of current legislation on offensive communications to ensure that laws are up to date with technology. The Law Commission completed the first part of its review and published a report at the end of last year. It engaged with a range of stakeholders, including victims of online abuse, the charities that support them, legal experts and the Government. The report concluded that abusive communications are theoretically criminalised to the same or even greater degree than equivalent offline behaviours—I did not necessarily accept that verdict myself—but practical and cultural barriers mean that not all harmful online conduct is pursued through criminal law enforcement to the same extent that it is in an offline context. I think the consensus in this room is that that is definitely the case.
The Government are now finalising the details of the second phase of the Law Commission’s work. The Law Commission has been asked to complete a wide-ranging review of hate crime legislation in order to explore how to make hate crime legislation more effective, including whether it is effective in addressing crimes targeting someone because of their disability. I urge Members present and organisations that might be taking an interest in this debate to give their input to the review.
Before the Minister finishes, I am grateful for the opportunity to ask her whether she thinks that the Law Commission’s work is going to finish in time to allow her to bring a Bill before the House in the next Session.
I am afraid that I cannot give the right hon. Gentleman that assurance. We are not sure when the next Session will commence, but I fear that the timing of the second phase of that work means that it will not be carried out in time to form the basis of much-needed changes to the law, which I hope the Law Commission will propose. We might have to wait until the following Session. Having said that, the Law Commission might have an opportunity to provide some interim results from its inquiries, and there is nothing to stop an hon. Member introducing a private Member’s Bill, should the opportunity arise, to look closely at the subject and bring something forward for debate.
This review of hate crime is very necessary. One of today’s contributions mentioned the fact that hate crime is aggravated by certain characteristics, including disability, but that might not go far enough. These matters and a review of hate crime are part of the remit of the second phase of the Law Commission’s work. I will also be suggesting to the Law Commission that it looks at the issue of online gender-based hate crime. As the hon. Member for West Ham mentioned, a significant amount of online abuse is misogynistic—it devalues women, it degrades them sexually and it amounts to gender-based hatred. There is a powerful case for women to be afforded the same legal protection against misogynistic online abuse as that given to people with other protected characteristics over which they have no control.
In conclusion, I thank Members for their thoughtful contributions and the Petitions Committee for the huge amount of work it has done on this vital subject. I look forward to continued engagement from across the House as we develop the proposals set out in the online harms White Paper.