Online Harms

Jim Shannon Excerpts
Thursday 19th November 2020

Commons Chamber

Jeremy Wright (Kenilworth and Southam) (Con)

I agree with my right hon. Friend, but I will be careful, Mr Deputy Speaker, in what I say about age verification, because I am conscious that a judicial review case is in progress on that subject. However, I agree that that is something that we could and should do, and not necessarily in direct conjunction with an online harms Bill.

Digital platforms should also recognise that a safer internet is, in the end, good for business. Their business model requires us to spend more and more time online, and we will do that only if we feel safe there. The platforms should recognise that Governments must act in that space, and that people of every country with internet access quite properly expect them to. We have operated for some time on the principle that what is unacceptable offline is unacceptable online. How can it be right that actions and behaviours that cause real harm and would be controlled and restricted in every other environment, whether broadcast media, print media or out on the street, are not restricted at all online?

I accept that freedom of speech online is important, but I cannot accept that the online world is somehow a sacred space where regulation has no place regardless of what goes on there. Given the centrality of social media to modern political debate, should we rely on the platforms alone to decide which comments are acceptable and which are unacceptable, especially during election campaigns? I think not, and for me the case for online regulation is clear. However, it must be the right kind of regulation: regulation that gives innovation and invention room to grow, that allows developing enterprises to offer us life-enhancing services and create good jobs, but that requires those enterprises to take proper responsibility for their products and services, and for the consequences of their use. I believe that that balance is to be found in the proposed duty of care for online platforms, as set out in the Government’s White Paper of April last year.

I declare an interest as one of the Ministers who brought forward that White Paper at the time, and I pay tribute to all those in government and beyond, including the talented civil servants at the Department for Digital, Culture, Media and Sport, who worked so hard to complete it. That duty of care would require all online companies that deal with user-generated content to keep those who use their platforms as safe as they reasonably can.

Jim Shannon (Strangford) (DUP)

The right hon. and learned Gentleman has covered some important ground. Does he agree that there needs to be a new social media regulator with the power to audit social media algorithms and assess their impact, to ensure that they do not cause harm? Only a regulator with such powers would enable that to happen.

Jeremy Wright

I agree that we need a regulator and will come on to exactly that point. The hon. Gentleman is entirely right, for reasons that I will outline in just a moment.

I recognise that what I am talking about is not the answer to every question in this area, but it would be a big step towards a safer online world if designed with sufficient ambition and implemented with sufficient determination. The duty of care should ask nothing unreasonable of the digital platforms. It would be unreasonable, for example, to suggest that every example of harmful content reaching a vulnerable user would automatically be a breach of the duty of care. Platforms should be obliged to put in place systems to protect their users that are as effective as they can be, not that achieve the impossible.

However, meeting that duty of care must mean doing more than is being done now. It should mean proactively scanning the horizon for those emerging harms that the platforms are best placed to see and designing mitigation for them, not waiting for terrible cases and news headlines to prompt action retrospectively. The duty of care should mean changing algorithms that prioritise the harmful and the hateful because they keep our attention longer and cause us to see more adverts. When a search engine, asked about suicide, shows a how-to guide on taking one’s own life long before it shows the number for the Samaritans, that is a design choice. The duty of care needs to require a different design choice to be made. When it comes to factual inquiries, the duty of care should expect the prioritisation of authoritative sources over scurrilous ones.

It is reasonable to expect these things of the online platforms. Doing what is reasonable to keep us safe must surely be the least we expect of those who create the world in which we now spend so much of our time. We should legislate to say so, and we should legislate to make sure that it happens. That means regulation, and as the hon. Gentleman suggests, it means a regulator: one that has the independence, the resources and the personnel to set our expectations of the online platforms and to investigate whether those expectations are met. For the avoidance of doubt, our expectations should be higher than the platforms’ own terms and conditions. However, if the regulator we create is to be taken seriously by these huge multinational companies, it must also have the power to enforce our expectations. That means that it must have teeth and a range of sanctions, including individual director liability and, in extreme cases, site blocking.

We need an enforceable duty of care for online platforms to begin making the internet a safer place. Here is the good news for the Minister, who I know understands this agenda well. So often, such debates are intended to persuade the Government to change direction, to follow a different policy path. I am not asking the Government to do that, but rather to continue following the policy path they are already on—I just want them to move faster along that path. I am not pretending that it is an easy path. There will be complex and difficult judgments to be made and significant controversy in what will be groundbreaking and challenging legislation, but we have shied away from this challenge for far too long.

The reason for urgency is not only that, while we delay, lives continue to be ruined by online harms, sufficient reason though that is. It is also that we have a real opportunity, and an obligation, of global leadership here. The world has looked with interest at the prospectus we have set out for online harms regulation, and it now needs to see us follow through with action, so that we can leverage our country’s well-deserved reputation for respecting innovation and the rule of law to set a global standard for a balanced and effective regulatory approach. We can do that only when the Government bring forward the online harms Bill for Parliament to consider and, yes, perhaps even to improve. We owe it to every preyed-upon child, every frightened parent and everyone abused, intimidated or deliberately misled online to act, and to act now.

--- Later in debate ---
Jim Shannon (Strangford) (DUP)

I congratulate the right hon. and learned Member for Kenilworth and Southam (Jeremy Wright) on his introduction and on all that he said. In my intervention I referred to the need for a social media regulator, and, as the hon. Member for Carshalton and Wallington (Elliot Colburn) has just said, we need a regulator with teeth. We need a regulator that actually does what it says it is going to do. That is important.

The Conservative manifesto of 2015 was very clear: its commitment pertained not to social media platforms but to pornographic websites, and it promised to protect children from them through statutory age verification. Part 3 of the Digital Economy Act 2017 made provision for that, and it should have been implemented over a year ago. I respectfully express my dismay and concern that it has not been.

The non-implementation of part 3 of the Act is a disaster for children, as it needlessly exposes them to commercial pornographic websites when this House has already made provision for their protection from such sites. Perhaps the Minister could explain why the Government’s detailed defence in the judicial review for not proceeding with implementation appears to rest on paragraph 19, which states:

“US-based browser companies were planning on implementing DNS-over-HTTPS…a new internet standard”.

I have great concerns about that.

I am also troubled by the way in which the Government have moved from the language of requiring age verification for pornographic websites, as referred to in their manifesto, to the very different language of expectation. The Government have said:

“This includes age verification tools and we expect them to continue to play a key role in protecting children online.”

They also said:

“Our proposals will introduce higher levels of protection for children. We will expect companies to use a proportionate range of tools including age assurance and age verification technologies to prevent children from accessing age-inappropriate or harmful content.”

In their initial response to the online harms White Paper consultation, the Government also said:

“we expect companies to use a proportionate range of tools, including age assurance and age verification technologies to prevent children accessing age-inappropriate content such as online pornography and to protect them from harms.”

Quite simply, that is not enough. That should not be an expectation; it should be a requirement. We have to have that in place.

The NSPCC has highlighted some worrying statistics: Instagram removed 75% fewer suicide and self-harm images between July and September 2020; industry compliance with requests to take down child abuse images fell by 89%; and 50% of recorded online grooming cases between April and June this year took place on Facebook-owned platforms. What conversations have the Government had to ensure that Facebook and others design and deliver platforms that put child protection front and centre, as it should be?