Online Safety Bill (Programme) (No. 4) Debate
Natalie Elphicke (Labour - Dover)
Debate with the Department for Digital, Culture, Media & Sport
(1 year, 11 months ago)
Commons Chamber

As the right hon. Member will note, the Minister had to stop at a certain point, having spoken for 45 minutes in his opening remarks. I think that he gave a true reflection of many of the comments that were made tonight. The right hon. Member will also know that all the comments from Opposition Members are on the parliamentary record and were televised.
The sooner that we pass the Bill, the sooner we can start protecting children online. This is a groundbreaking piece of legislation that, as hon. Members have said, will need to evolve as technology changes.
Will my right hon. Friend confirm that the Department will consider amendments, in relation to new clause 55, to stop the people smugglers who trade their wares on TikTok?
I commit to my hon. Friend that we will consider those amendments and work very closely with her and other hon. Members.
We have to get this right, which is why we are adding a very short Committee stage to the Bill. We propose that there will be four sittings over two days. That is the right thing to do to allow scrutiny. It will not delay or derail the Bill, but Members deserve to discuss the changes.
With that in mind, I will briefly discuss the new changes that make recommittal necessary. Children are at the very heart of this piece of legislation. Parents, teachers, siblings and carers will look carefully at today’s proceedings, so for all those who are watching, let me be clear: not only have we kept every single protection for children intact, but we have worked with children’s organisations and parents to create new measures to protect children.

Platforms will still have to shield children and young people from both illegal content and a whole range of other harmful content, including pornography and violent content. However, they will also face new duties on age limits. No longer will social media companies be able to claim to ban users under 13 while quietly turning a blind eye to the estimated 1.6 million children who use their sites under age. They will also need to publish summaries of their risk assessments relating to illegal content and child safety, to ensure that there is greater transparency for parents. To ensure that the voice of children is injected directly into the Bill, Ofcom will consult the Children’s Commissioner in the development of codes of practice.
These changes, which come on top of all the original child protection measures in the Bill, are completely separate from the changes that we have made in respect of adults. For many people, myself included, the so-called “legal but harmful” provisions in the Bill prompted concerns. They would have meant that the Government were creating a quasi-legal category—a grey area—and would have raised the very real risk that to avoid sanctions, platforms would carry out sweeping take-downs of content, including legitimate posts, eroding free speech in the process.