Westminster Hall is an alternative Chamber in which MPs can hold debates; it is named after the adjoining Westminster Hall.
Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.
It is a pleasure to serve under your chairmanship, Mr Dowd. I congratulate my hon. Friend the Member for Darlington (Lola McEvoy) on securing this debate. As hon. Members can see, debates in Westminster Hall take a whole different form from debates in the House; they are a lot more informative and collegiate, and Westminster Hall is a much nicer place to debate. I welcome the parents in the Public Gallery and thank them for their commitment and the work they continue to do to make sure that this issue stays on our agenda and continues to be debated. I know they have met my colleagues, and I look forward to meeting them as well.
I am grateful to all hon. Members for their incredibly powerful and informative contributions to today’s debate. As the mother of two young children, I always have online safety on my mind. Every time I am on my phone in front of my children, or want to keep them distracted by putting on a YouTube video, it scares me, and the issue is always at the back of my mind. It is important that we as parents always have the safety of our children in mind. My hon. Friend the Member for Rother Valley (Jake Richards) talked about being a parent to really young children while being an MP or candidate. As a mother who had two children during the last term, I can assure him that it does get easier. I am happy to exchange some tips.
The growth in the use of phones and social media has been a huge societal change, and one that we as parents and citizens are grappling with. I am grateful to all hon. Members here who are engaging in this debate. The Government are committed to keeping children safe online, and it is crucial that we continue to have conversations about how best to achieve that goal. We live in a digital age, and we know that being online can benefit children of all ages, giving them access to better connections, education, information and entertainment. However, we know that it can also accentuate vulnerabilities and expose children to harmful and age-inappropriate content. We believe that our children should be well-equipped to make the most of the digital opportunities of the future, but we must strike the right balance so that children can access the benefits of being online while we continue to put their safety first.
Last week, the Secretary of State visited NSPCC headquarters to speak to its Voice of Online Youth group. That is just the latest meeting in a programme of engagement undertaken by the Secretary of State and my colleague in the other place, Baroness Maggie Jones. Getting this right has been and will continue to be a long process. Many hon. Members here will remember the battle to get the Online Safety Act passed. Despite the opposition, with some Members in this place seeking to weaken it, there was cross-party consensus and a great deal of support, and so it was passed.
On a number of occasions during the passage of the Online Safety Bill in this House, I raised the story of my constituent Joe Nihill from Leeds, who sadly took his own life after accessing very dangerous suicide-related content. I want to bring to the Minister’s attention that before Ofcom’s new powers are put into practice at some point next year, there is a window where there is a particular onus on internet service providers to take action. The website that my constituent accessed, which encouraged suicide, deterred people from seeking mental health support and livestreamed suicide, has been blocked for people of all ages by Sky and Three. Will the Minister congratulate those two companies for doing that at this stage and encourage all internet service providers to do the same before Ofcom’s new powers are implemented next year?
I thank the hon. Member for making that point, and I absolutely welcome that intervention by internet providers. As I will go on to say, internet providers do not have to wait for the Act’s provisions to come into force; they can start making such changes now. I absolutely agree with him.
Many colleagues have raised the issue of the adequacy of the Online Safety Act. It is a landmark Act, but it is also imperfect. Ofcom’s need to consult means a long lead-in time; although it is important to get these matters right, that can often feel frustrating. None the less, we are clear that the Government’s priority is Ofcom’s effective implementation of the Act, so that those who use social media, especially children, can benefit from the Act’s wider reach and protections as soon as possible. To that end, the Secretary of State for Science, Innovation and Technology became the first Secretary of State to set out a draft statement of strategic priorities to ensure that safety cannot be an afterthought but must be baked in from the start.
The hon. Member for Strangford (Jim Shannon) raised the issue of suicide and self-harm. Ofcom is in the process of bringing the Online Safety Act’s provisions into effect. Earlier this year, it consulted on the draft illegal content codes of practice, which cover some of the most harmful content, including content encouraging suicide, and on the draft child safety codes of practice. We expect the illegal content codes to be in effect by spring 2025, with the child safety codes following in the summer.
Under the Act, user-to-user and search services will need to assess the risk that they might facilitate illegal content and must put in place measures to manage and mitigate any such risk. In addition, in-scope services likely to be accessed by children will need to protect children from content that is legal but none the less harmful to children, including pornography, bullying and violent content. The Act is clear that user-to-user services that allow the most harmful types of content must use highly effective age-assurance technology to prevent children from accessing it.
Ofcom will be able to use robust enforcement powers against companies that fail to fulfil their duties. Ofcom’s draft codes set out what steps services can take to meet those duties. The proposals mean that user-to-user services that do not ban harmful content should introduce highly effective age checks to prevent children from accessing the entire site or app, or age-restrict those parts of the service that host harmful content. The codes also tackle algorithms that amplify harm and feed harmful material to children, which have been discussed today. Under Ofcom’s proposal, services will have to configure their algorithms to filter out the most harmful types of content from children’s feeds, and reduce the visibility and prominence of other harmful content.
The hon. Member for Aberdeen North (Kirsty Blackman), the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) and others discussed strengthening the codes. Ofcom has been very clear that it will look to strengthen the codes in future iterations. The Government will encourage it to do so as harmful online technology and the evidence base about such technology evolves.
I am short of time, so I will have to proceed.
For example, Ofcom recently announced plans to launch a further consultation on the illegal content duties once the first iteration of those duties is set out in spring next year. That iterative approach enables Ofcom to prioritise getting its initial codes in place as soon as possible while it builds on the foundations set out in that first set of codes.
My hon. Friends the Members for Slough (Mr Dhesi) and for Lowestoft (Jess Asato) and the hon. Member for Aberdeen North raised the issue of violence against women and girls. In line with our safer streets mission, platforms will have new duties to create safer spaces for women and girls. It is a priority of the Online Safety Act that platforms proactively tackle the most harmful illegal content, which includes offences such as harassment, sexual exploitation, extreme pornography, intimate image abuse, stalking and controlling or coercive behaviour, much of which disproportionately affects women and girls. All services in scope of the Act need to understand the risks that illegal content online poses to women and girls and take action to mitigate those risks.
My hon. Friend the Member for Carlisle (Ms Minns) set out powerfully the issues around child sexual exploitation and abuse. Child sexual abuse is a vile crime that inflicts long-lasting trauma on victims. UK law is crystal clear: the creation, possession and distribution of child sexual abuse images is illegal. The strongest protections in the Online Safety Act are against child sexual abuse and exploitation. Ofcom will have strong powers to direct online platforms and messaging and search services to combat that kind of abuse. It will be able to require platforms to use accredited, proactive technology to tackle CSEA and will have powers to hold senior managers criminally liable if they fail to protect children.
I am running short of time, so I shall make some final remarks. While we remain resolute in our commitment to implementing the Online Safety Act as quickly and effectively as possible, we recognise the importance of these ongoing conversations, and I am grateful to everyone who has contributed to today’s debate. I am grateful to the brave parents who continue to fight for protections for children online and shine a light on these important issues. The Opposition spokesperson, the hon. Member for Runnymede and Weybridge (Dr Spencer), asked a host of questions. I will respond to him in writing, because I do not have time to do so today, and I will place a copy in the Library.
I call Lola McEvoy to briefly respond to the debate.