Draft Online Safety Bill Report Debate
Suzanne Webb (Conservative - Stourbridge), debate with the Department for Digital, Culture, Media & Sport
(2 years, 9 months ago)
Commons Chamber

It was a privilege to serve on the Joint Committee, which was chaired admirably by my hon. Friend the Member for Folkestone and Hythe (Damian Collins). It was a cross-party, cross-House experience to which we brought a wealth of our own experience, and the report seems to have been warmly received.
There is no doubt that the Committee’s experience of listening to hours of harrowing testimony and reading the written evidence was truly humbling, but I have always questioned why we actually need the Bill—that was my constant narrative all the way through. As we trawled through the written evidence and listened to hours of harrowing oral evidence, it was unclear to me why the tech companies did not remove harmful content at the first opportunity and monitor their systems. They should be doing that in the first instance.
Those systems cause so much pain and upset. They have led to insurrection, to prosecutions, to people being robbed of their hard-earned money and to people dying. The problem that our Committee faced is that many tech companies are now bigger than a single news agency—arguably than any Government, for that matter—and have a monopoly on people’s thoughts and beliefs, driven by algorithms that are fuelled by immense profit.
In the time that I have, I will focus on the governance element of our recommendations. Robust regulatory oversight will be critical to ensuring this Government’s ambition for us to be one of the safest places online in the world. To put in context why that is so important, let me explain about killer algorithms.
An algorithm is a series of instructions telling a computer how to transform a set of facts about the world into useful information. My hon. Friend the Member for Gosport (Dame Caroline Dinenage) touched on the point that an algorithm can constantly recommend pictures of dogs to dog lovers like me, but the dark side is that it can also constantly recommend to a vulnerable teenager pictures of self-harm, suicide content, violent sexual pornography or unsolicited contact with adults they do not know, right the way through to more insidious harms that might be built up over time.
We heard the sad story of the suicide of Molly Russell from her father during the evidence sessions. She was a 14-year-old who killed herself after viewing images of self-harm and suicide online. The coroner heard that in her last six months she used her Instagram account more than 120 times a day, liked more than 11,000 pieces of content and shared over 1,500 videos. An inquest is examining how algorithms contributed to her death.
During the evidence sessions, we also learned of Zach Eagling. Gorgeous 10-year-old Zach has epilepsy; I have had the privilege of meeting him. He was subject to the most deplorable and deliberate practices targeting epilepsy sufferers with flashing images.
Those were two of the stand-out moments that broke my heart during the evidence sessions. Why were the tech companies not stopping these killer algorithms? Why did they allow this to happen? In principle, tech companies self-regulate already, but they have failed. Lack of accountability, combined with commercialisation, has created a perfect storm in which social media can literally kill, so the natural conclusion is that tech companies must be held liable for systems that they have created to make money for themselves and that have had harmful outcomes for others.
Our report recommends compelling service providers to safeguard vulnerable users properly and regulate illegal content. For me, the key recommendation is
“that a senior manager at board level or reporting to the board should be…made liable for a new offence: the failure to comply with their obligations as regulated service providers when there is clear evidence of repeated and systemic failings that result in a significant risk of serious harm to users.”
Let me use the case of 10-year-old Zach. This is what it would mean to him if our recommendations were accepted: sending flashing images to epilepsy sufferers would become a criminal offence.
The human cost of the internet is unquantifiable, and I applaud the Government for what will be a ground-breaking and truly world-leading Bill. Our recommendations will ensure that the Bill holds platforms to account and achieves the Government’s aim of making the United Kingdom the safest place in the world to be online. We owe that to Molly Russell and to Zach Eagling. The tech companies should be removing harmful content and enforcing safety by design at the first opportunity. Surely they do not have to wait for the Bill; they can do the right thing now.