Online Harms White Paper

Tim Loughton Excerpts
Monday 8th April 2019

Commons Chamber

Jeremy Wright

I am grateful to the hon. Gentleman for his support. He makes a fair point. He is of course right that there will be opposition to what is proposed, but it is worth noting that online companies, including Facebook, have recognised that forms of regulation are inevitable, and we shall expect them to co-operate in the design of these processes. If they choose not to, they will find that we shall regulate anyway.

Tim Loughton (East Worthing and Shoreham) (Con)

Over the past 20 years, the thrust of children’s legislation has been to place a duty on public agencies to co-operate in the protection and safeguarding of vulnerable children, yet no such duty exists for social media companies. In that time, social media companies, using complicated algorithms, have become exceedingly skilful at persuading me that I need to buy essential products that I never knew I could not live without. Will the duty of care require those companies proactively to use algorithms and artificial intelligence not only to block harmful sites in the first place, but also to flag up vulnerable users who search for terms such as “kill myself” or for clearly harmful websites, so that they are detected and helped?

Jeremy Wright

I am grateful to my hon. Friend. He is right that we should be particularly concerned with the most vulnerable in our society—especially children. The way we envisage the duty of care operating is that online companies should do all they reasonably can to keep their users safe. The greater the user’s vulnerability, the more care those companies should take to do so. It follows that, in relation to children who may be using those services—of course, this will apply particularly to services that are attractive to children—there will be a greater onus on those responsible to act. We want to see a regulator pay close attention to what has been done—proactively, not simply reactively—to ensure that that harm can be avoided, whether by the use of algorithms or by other methods. The onus will be very clearly on those who provide the service to satisfy the regulator that they are doing all they can. If they are not, the consequences I described earlier can follow.