Online Safety Act: Implementation Debate
Gregor Poynton (Labour - Livingston)
Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.
Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.
This information is provided by Parallel Parliament and does not comprise part of the official record.
It is a pleasure to serve under your chairship, Mr Stringer. My congratulations to the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) on securing this important debate.
Online safety and the wellbeing of our children and young people in digital and online spaces are issues that guide many of us in the House, across the parties, and across the country. I speak only on my own behalf, but as chair of the all-party parliamentary group on children’s online safety, I believe that the Online Safety Act is landmark legislation with the potential to transform the safety of children and young people in the online world, and I applaud the Government’s commitment to creating the safest possible environment for our children, especially in the face of the growing dangers that lurk in the online space.
The Act is designed to tackle the pervasive issues of child sexual abuse material and online grooming. With provisions such as the requirement for platforms to scan for known child sexual abuse material, it has the potential to reduce significantly the availability of such content. Platforms will now have a legal obligation to take action, including by adopting measures such as hash matching, which will prevent the sharing of known CSAM. This is a major step forward and will undoubtedly save countless children from exploitation.
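The hash matching described above works by comparing a fingerprint of an uploaded file against a database of fingerprints of known material. Real deployments use perceptual hashes that tolerate minor edits to an image; the sketch below uses a plain cryptographic hash purely to illustrate the exact-match principle. All names and values here are hypothetical illustrations, not any platform's actual implementation.

```python
import hashlib

# Hypothetical blocklist of fingerprints of known material
# (illustrative placeholder values only).
known_hashes = {
    hashlib.sha256(b"known-flagged-content").hexdigest(),
}

def is_known_match(file_bytes: bytes) -> bool:
    """Return True if the file's cryptographic hash appears in the blocklist."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in known_hashes

# An exact copy of a listed file matches; any altered file does not.
print(is_known_match(b"known-flagged-content"))   # exact copy -> True
print(is_known_match(b"known-flagged-content!"))  # altered file -> False
```

The limitation visible in the last line is why production systems favour perceptual hashing: a cryptographic hash changes completely if even one byte of the file changes, whereas a perceptual hash is designed to survive resizing, cropping, and recompression.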
However, there are some concerns that I wish to raise to ensure that the full potential of the Act is realised. Hon. Members have raised many of them already, but I hope that this will give weight to them, and I hope that Ofcom will be listening to our concerns about the Act’s implementation. One of the most pressing issues raised by experts, including the Internet Watch Foundation, is the interpretation of “technically feasible” in Ofcom’s illegal harms codes. Although the Act requires platforms to take steps to remove illegal content, the codes suggest that services are obliged to do so only when that is deemed technically feasible. That could lead to a situation in which platforms, rather than taking proactive steps to safeguard users, simply opt out of finding innovative solutions to prevent harm.
I do not believe that that is the ambitious, risk-based regulatory approach that Parliament envisaged when it passed the Online Safety Act. These are the same platforms that have spent billions of pounds on R&D, developing highly sophisticated algorithms to solve complex technical problems, target ads effectively to drive revenue, and serve audiences the content they want to see. They have a global reach: they have the tools, the people and the budgets to solve these problems. We must therefore ensure that platforms are incentivised to go beyond the bare minimum and truly innovate to protect our children. I echo the calls from multiple civil society organisations working in this area for us to require platforms to take a safety-by-design approach.
Another serious concern is the potential for platforms to use the safe harbour provision offered by the Act. That would allow companies to claim that they are compliant with the codes of practice simply by following the prescribed rules, without necessarily addressing the underlying harms on their platforms. As the Internet Watch Foundation has rightly pointed out, that risks leaving platforms operating in a way that is compliant on paper but ineffective in practice.
I also ask Ofcom to look more quickly, as my hon. Friend the Member for Lowestoft (Jess Asato) has suggested, at Apple and Google’s app stores. They have a wealth of data and can be effective gamekeepers, particularly on age verification, if they are pressed into service. Finally, I encourage the Government and Ofcom to address more fully the issue of private communications. Many predators exploit private messaging apps to groom children, yet the Act’s provisions on private communications are limited. It is vital that we ensure that private spaces do not become safe havens for criminals and that platforms are held accountable for the spread of CSAM, regardless of whether that occurs in private or public spaces.
I hope that my hon. Friend the Minister can address those points in her response and that they will be kept front of mind by Ofcom, the Government and the tech giants as we all seek to ensure that digital and online spaces, which are increasingly important in all our lives, are safe and secure for our children and young people.