Online Safety Bill Debate
Lord Weir of Ballyholme (Democratic Unionist Party - Life peer)
Lords Chamber

My Lords, I will briefly comment positively on the Minister’s explanation of how these offences might work, particularly the association of liability with the failure to comply with a confirmation decision, which seems entirely sensible. At an earlier stage of the debate, there was a sense that we might associate liability with more general failures to discharge a duty of care. That would have been problematic, because the duty of care is very broad and requires a lot of pieces to be put in place. Associating the offences with the confirmation decision makes absolute sense. Having been in that position, I know that if, as an executive in a tech company, I received a confirmation decision that said, “You must do these things”, and I chose wilfully to ignore that decision, it would be entirely reasonable for me to be held potentially criminally liable. That association is a good step forward.
My Lords, I will speak to Amendment 268C, which is in my name and that of the noble Baroness, Lady Benjamin, who has been so proactive in this area. The amendment seeks to clarify the threshold at which Ofcom should take immediate enforcement action when children are exposed to content promoting suicide, self-harm and eating disorders, or to pornographic material. It would require the regulator either to take that action or, at the least, to explain to the Secretary of State within a reasonable timeframe why it has chosen not to.
When we pass the Bill, the public will judge it not simply on its contents but on its implementation, its enforcement and the speed of that enforcement. Regulatory regimes as a whole work only if the companies providing the material believe the regulator to be sufficiently muscular in its approach. Therefore, the test is not simply what is there but how long it will take for a notice, whenever it is issued, to lead to direct change.
I will give two scenarios to illustrate the point. Take the example of a video encouraging the so-called blackout challenge, or choking challenge, which went viral on social media about two years ago. For those who are unaware, it challenged children to choke themselves to the point of losing consciousness and to see how long they could do so. It resulted in the deaths of about 15 children. If a similar situation arises and a video is not removed because it does not breach the terms and conditions of the service, does Ofcom allow the video to circulate for, say, six months as a grace period for the platform to introduce age gating? What if the platform then fails to implement highly effective age verification? How long will it take to get through warnings, a provisional notice of contravention, a representation period, a confirmation decision and the implementation of required measures before the site is finally blocked? As I indicated, this is not hypothetical; it draws on a real-life example. We know that this is not simply a matter of direct harm to children; it can create a risk of death, and has done in the past.
What about, for example, a pornographic site that simply displays a banner on which a person can self-declare that they are over 18 in order to gain access? I will not rehearse the dangers for children of early exposure to violent pornography, since they have been gone through a number of times: the damage to respectful relationships and, as we know from government reports, the particular risk of women being viewed as sex objects, fuelling and perpetuating sexual aggression towards them. Given that we know large numbers of children have access to this material, surely it would be irresponsible to sacrifice another generation of children to a three-year implementation process.