Oral Answers to Questions

Gregor Poynton Excerpts
Wednesday 25th June 2025

Commons Chamber
Chris Bryant

Funnily enough, the statistics in the hon. Member’s constituency are better than the national average—just very slightly, by a smidgen—but I am very happy to meet him. More importantly, he could come into the Department and meet Building Digital UK so that we can explain exactly what needs to happen in his constituency to secure the aims that he is seeking.

Gregor Poynton (Livingston) (Lab)

2. What steps his Department is taking to keep children safe online.

The Parliamentary Under-Secretary of State for Science, Innovation and Technology (Feryal Clark)

As the chair of the all-party parliamentary group on children’s online safety, my hon. Friend will know that keeping children safe online is a priority for this Government. We are focused on implementing the Online Safety Act 2023 so that children can benefit from its wide reach and protection. The children’s code that is coming in next month will see a step change in the experience of children online in the UK. While we do not pretend that that is job done, and we are working at pace to develop a further online safety package, children will no longer be able to access pornography or other unsuitable content, including content that encourages or promotes self-harm, eating disorders or suicide.

Gregor Poynton

The National Crime Agency and other law enforcement agencies have highlighted the growing prevalence of AI-generated child sexual abuse material as one of the biggest threats to public safety. It is a growing threat to us online. That is why I was astonished last week to see the Tories and Reform vote against the Crime and Policing Bill, which contains world-leading measures to tackle this horrific crime. Does the Minister agree that it is frankly disgusting to see the Tories and Reform using this issue for party politics?

Feryal Clark

I do indeed agree with my hon. Friend on that. Child sexual exploitation and abuse is one of the most horrendous harms, and the Government are committed to ensuring that UK law keeps pace with criminal use of technologies including AI. As he says, we have introduced a world-leading offence in the Crime and Policing Bill to criminalise AI models that have been optimised to create child sexual abuse material. This new offence builds on the protections in the Online Safety Act, and I am very clear that nothing is off the table when it comes to keeping our children safe.

Online Safety Act: Implementation

Gregor Poynton Excerpts
Wednesday 26th February 2025

Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.

Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.

This information is provided by Parallel Parliament and does not comprise part of the official record.

Gregor Poynton (Livingston) (Lab)

It is a pleasure to serve under your chairship, Mr Stringer. My congratulations to the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) on securing this important debate.

Online safety and the wellbeing of our children and young people in digital and online spaces are issues that guide many of us in the House, across the parties, and across the country. I speak only on my own behalf, but as chair of the all-party parliamentary group on children’s online safety, I believe that the Online Safety Act is landmark legislation that has the potential to transform the safety of children and young people in the online world and I applaud the Government’s commitment to creating the safest possible environment for our children, especially in the face of the growing dangers that lurk in the online space.

The Act is designed to tackle the pervasive issues of child sexual abuse material and online grooming. With provisions such as the requirement for platforms to scan for known child sexual abuse material, it has the potential to reduce significantly the availability of such content. Platforms will now have a legal obligation to take action, including by adopting measures such as hash matching, which will prevent the sharing of known CSAM. This is a major step forward and will undoubtedly save countless children from exploitation.

However, there are some concerns that I wish to raise to ensure that the full potential of the Act is realised. Hon. Members have raised many of them already, but I hope that this will give weight to them, and I hope that Ofcom will be listening to our concerns about the Act’s implementation. One of the most pressing issues raised by experts, including the Internet Watch Foundation, is the interpretation of “technically feasible” in Ofcom’s illegal harms codes. Although the Act requires platforms to take steps to remove illegal content, the codes suggest that services are obliged to do so only when that is deemed technically feasible. That could lead to a situation in which platforms, rather than taking proactive steps to safeguard users, simply opt out of finding innovative solutions to prevent harm.

I do not believe that that is the ambitious, risk-based regulatory approach that Parliament envisaged when it passed the Online Safety Act. These are the same platforms that have spent billions of pounds on R&D developing highly sophisticated algorithms to solve complex technical problems, and effectively targeting ads to drive revenue and serve audiences content that they want to see. They have a global reach: they have the tools, the people and the budgets to solve these problems. Therefore, we must ensure that platforms are incentivised to go beyond the bare minimum and truly innovate to protect our children. I echo the calls from multiple civil society organisations working in this area for us to require platforms to take a safety-by-design approach.

Another serious concern is the potential for platforms to use the safe harbour provision offered by the Act. That would allow companies to claim that they are compliant with the codes of practice, simply by following the prescribed rules and without necessarily addressing the underlying harms on their platforms. As the Internet Watch Foundation has rightly pointed out, it risks leaving platforms operating in a way that is compliant on paper but ineffective in practice.

I also ask Ofcom to look more quickly, as my hon. Friend the Member for Lowestoft (Jess Asato) has suggested, at Apple and Google’s app stores. They have a wealth of data and can be effective gamekeepers, particularly on age verification, if they are pressed into service. Finally, I encourage the Government and Ofcom to address more fully the issue of private communications. Many predators exploit private messaging apps to groom children, yet the Act’s provisions on private communications are limited. It is vital that we ensure that private spaces do not become safe havens for criminals and that platforms are held accountable for the spread of CSAM, regardless of whether that occurs in private or public spaces.

I hope that my hon. Friend the Minister can address those points in her response and that they will be kept front of mind by Ofcom, the Government and the tech giants as we all seek to ensure that digital and online spaces, which are increasingly important in all our lives, are safe and secure for our children and young people.

Social Media Use: Minimum Age

Gregor Poynton Excerpts
Monday 24th February 2025

Westminster Hall


Gregor Poynton (Livingston) (Lab)

It is a pleasure to serve under your chairmanship, Mr Vickers. I am the chair of the all-party parliamentary group on children’s online safety, but I am contributing on my own behalf.

The opportunities and perils of social media are increasingly weighing on the minds of parents, educators, society and, above all, young people themselves. My hon. Friend the Member for Beckenham and Penge (Liam Conlon) spoke about balance, which I think is one of the key issues. Our young people require digital skills and literacy in order to access the modern world, be it the world of work, public services or their social lives, and we need to give them the tools to do that. This has been a useful debate to think about how we do that.

We can see the strength of feeling about this measure reflected in the parliamentary petition that we are discussing. It is a petition signed by 92 of my Livingston constituents, and it speaks to the widespread anxieties about the impact of social media on our children and young people. Those concerns are real and heartfelt, and they come not only from parents and communities, but from young people themselves, who have to navigate those digital landscapes every single day. They tell us that social media is not merely a tool for connection, but a space where the line between reality and illusion is often blurred, and where photoshopped images and curated lifestyles can distort self-perception.

I am convinced of the merits of enforcing a minimum-age requirement of 16 for social media on the Australian model. I am in favour on mental health grounds, with social media shown in study after study to be linked to increased anxiety, depression and low self-esteem for young people. I am in favour on online safety grounds, with social media exposing children to cyberbullying, predators, misinformation and harmful content. I am in favour in order to try to reverse children’s decreasing attention spans; we need to give our kids the support they need to focus, to learn and to reach their full potential.

I believe, however, that it is vital not to leave young people out of those conversations, but to centre their concerns and experiences. I spoke the other week to academic colleagues at the University of Manchester, who stressed the complexity and variety of young people’s views on these subjects. They have conducted research, including focus groups, to understand how children use social media and what it means for them and their lives. They pointed out that a key concern of the young people in those focus groups, short of a ban, was the capacity to better distinguish between content that is real and content that is faked, manipulated or highly curated. Their point was not so much about disinformation or misinformation, but more about the perfect lifestyles that are shown on Instagram and other platforms. As has been mentioned, that is more insidious and not easy to ban, but it has a real effect on young people and on their perceptions of themselves and their lives.

That point brings me to the role of social media companies. Many hon. Members have mentioned those companies in this debate, and it is right to say that they have not taken enough responsibility for the content on their sites. The incentive at the moment is to let loose, for eyeballs and time spent, rather than to ensure that the content is properly moderated and is going to the right people. We already have various minimum-age restrictions in place, but the challenge has been to enforce them. Social media companies must adhere to and enforce them. With or without a ban, we need more effective oversight and accountability for how those platforms operate. There is also a vital role for industry leaders such as Google and Apple through their app stores. These gatekeepers possess significant influence and could do much more to ensure that age verification and content moderation is robust and reliable.

Hon. Members have also mentioned smartphone bans. I was pleased that at the Scottish Labour conference at the weekend Anas Sarwar, our leader in Scotland, said that our manifesto for the 2026 Scottish Parliament elections will include a ban on mobile phone use in Scottish schools. That gives parents and educators, and children themselves, clarity on what we think is right and not right. I do not believe that having phones in schools is right for children or for the educators trying to do their jobs.

I believe that the minimum age for social media is an idea whose time has come. For me, it is a matter of protection and of ensuring that we prepare young people mentally and emotionally as best we can to handle the pressures that social media can bring. Even without a ban, however, we must ensure today that existing age limits are being properly and rigorously enforced, and we must engage robustly with the tech companies to ensure that they are doing all they can to protect our young people and children.