Westminster Hall
Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.
Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.
This information is provided by Parallel Parliament and does not comprise part of the official record.
The hon. Lady makes a good point; people would need that. I believe more and more counselling is being offered, but I am not aware of whether that offer is consistent across the industry or provided only by the better-performing companies.
I reassure the hon. Lady that the Government have engaged with disability organisations and will continue to do so. Last year I held a roundtable with organisations focused specifically on online abuse of people with disabilities, and next month I will chair a roundtable focusing on adults with learning disabilities. I really am very sorry if the Government have given the impression that we think these problems are confined to children and young people, because they most certainly are not, as the hon. Lady said eloquently in her speech. I completely agree. In fact, the organisations with whom I had the roundtable mostly represented adults, and the next one will be mostly about young adults with learning disabilities. That is what I will do to follow up the debate and the petition.
I want to say a few words about the online harms White Paper. I reiterate my earlier point that self-regulation has failed—the shadow Minister is right about that. We all agree on that, and that is why the Government will establish a new statutory duty of care to make companies take more responsibility for the safety and security of their users and tackle the harm caused by the content and activity on their services. Compliance with the duty of care will be overseen and enforced by an independent regulator. Companies will be held to account for tackling a comprehensive set of online harms, including behaviours that may or may not be illegal but none the less are highly damaging to individuals and threaten people’s rights online. The Government are consulting on the most appropriate enforcement powers for a regulator.
[Ian Austin in the Chair]
My right hon. Friend the Member for Arundel and South Downs, who is a former Policing Minister, mentioned the structure of policing and whether there are capability as well as resource issues. I should have mentioned that the White Paper is in fact a joint Home Office and DCMS White Paper. We have therefore had input from Home Office Ministers, and I will raise his point with them. [Interruption.] I am somewhat distracted by a lot of noise—I do not know where it is coming from.
I see that we have had a change of Chair. It is a pleasure to serve under your chairmanship as well, Mr Austin.
Coming back to the point made by my right hon. Friend the Member for Arundel and South Downs, we intend that the new system of regulation will take some of the burden off the police and place it on to the tech companies. Those companies should be accountable for taking care of their users by eliminating such content, hopefully before it comes online but certainly very swiftly after it is reported.
The law in Germany, which the shadow Minister referred to, requires content to be taken down within 24 hours of companies knowing about it; if it is later than that, swingeing fines can be applied. We want to create an environment in which companies deal with matters themselves and use less and less of our valuable policing time for the privilege.
As I mentioned earlier, we have committed to developing a media literacy strategy—one of the proposals made by Glitch—to ensure that we have a co-ordinated and strategic approach to online media literacy education. We have published a statutory code of practice for social media providers about dealing with harmful content, and we have consulted on the draft code with a variety of stakeholders, including people with disabilities. The code includes guidance on the importance of social media platforms having clear, accessible reporting processes and accessible information on their terms and conditions, and it highlights the importance of consulting users when designing new software, new apps and new safety policies.
There has been some discussion about whether the law itself is adequate, particularly with regard to hate crime. I will say a few words about the Law Commission’s review. In February last year the Prime Minister announced that the Law Commission would undertake a review of current legislation on offensive communications to ensure that laws are up to date with technology. The Law Commission completed the first part of its review and published a report at the end of last year. It engaged with a range of stakeholders, including victims of online abuse, the charities that support them, legal experts and the Government. The report concluded that abusive communications are theoretically criminalised to the same or even greater degree than equivalent offline behaviours—I did not necessarily accept that verdict myself—but practical and cultural barriers mean that not all harmful online conduct is pursued through criminal law enforcement to the same extent that it is in an offline context. I think the consensus in this room is that that is definitely the case.
The Government are now finalising the details of the second phase of the Law Commission’s work. The Law Commission has been asked to complete a wide-ranging review of hate crime legislation in order to explore how to make hate crime legislation more effective, including whether it is effective in addressing crimes targeting someone because of their disability. I urge Members present and organisations that might be taking an interest in this debate to give their input to the review.