Online Safety Bill Debate
Lord Knight of Weymouth (Labour - Life peer)
Lords Chamber

My Lords, it is a pleasure to follow the noble Baroness, Lady Stowell, and so many other fine speeches today. I should remind your Lordships of my interests. In particular, I have been working with GoBubble, which provides social media filtering technology. I was also a member of the Joint Committee on this Bill and was previously on the Select Committee on Democracy and Digital Technologies, chaired by the noble Lord, Lord Puttnam.
Right at the heart of this Bill are just two interrelated factors. First, there are bad actors: people who deliberately or carelessly do harm to others both in the real world and virtually, both physically and mentally. Our problem is how content from these bad actors interacts with the systems and processes in the online world that personalise and amplify that content. In 2021, 44% of all global spending on advertising was with Meta and Alphabet-owned businesses. Their platforms, such as Facebook, Instagram and YouTube, are machines with the objective of maximising engagement time on the platform in order to sell more advertising.
The machines have no ethics; they have business objectives. If that means feeding outrageous, disturbing or harmful content, so be it. If that means pushing at Molly Russell the kind of content that the coroner has now implicated in her death, so be it. If that means the corruption of children, self-harm or fraud, so be it. Whatever turns you on, keeps you engaged and keeps you on the platform is what the machines will push your way. This week, the Children’s Commissioner for England reported that one in five boys watch porn at least once a day; that more than half of frequent users seek out violent sex acts; and that Twitter is the site where the highest proportion report seeing explicit sexual content.
The platforms are not all bad but the harms of manipulation and corruption are real and urgent. We must, and will, work together to get this Bill improved and passed by the summer. In doing so, our job with this Bill is to impose ethics on the algorithms used by platforms. This is less about bad content and more about systems. It is about content suppression as much as content takedown. It is as much about freedom of reach as freedom of speech. For too many people—especially women and girls, as the noble Baroness, Lady Morgan, mentioned—their freedom of expression is constrained by platforms because they are shouted down and abused. They need better protection.
Without change, vulnerable adults with learning difficulties will not be protected by this Bill. Without change, the corruption of truth and democracy by the likes of Trump and Putin will continue. Without change, the journalistic and democratic exemptions in the Bill will be exploited by the likes of Tommy Robinson to spread bile. Without change, content from the likes of Andrew Tate will continue to be amplified. His videos have been viewed more than 13 billion times on TikTok alone, including by any of our children whom we have allowed to have an account. Teachers, parents and grandparents cannot keep up with what is going on with children online; they need ongoing education and help. I am afraid that Ofcom is not cutting through with its media literacy duty. We must use this Bill to change that. We need to constrain the Secretary of State’s powers over Ofcom so that it is properly independent, and give young people themselves more influence over the regulator.
There is much to do. This is as important a job of work as any I have been a part of during my 22 years in Parliament. I look forward to working with all Peers to deliver a Bill that prevents harm, criminalises abusers and overlays human ethics on to these machines of mass manipulation.