4 Leigh Ingham debates involving the Department for Science, Innovation & Technology

Online Harms

Leigh Ingham Excerpts
Thursday 19th March 2026


Commons Chamber
Ian Sollom (St Neots and Mid Cambridgeshire) (LD)

I beg to move,

That this House believes that current legislation is falling short in preventing online harms; and calls on the Government to review whether it is necessary to introduce new legislation that is centred around harm reduction in this Parliament.

I thank the Backbench Business Committee for granting this debate. Not long after my election in 2024, I visited the Internet Watch Foundation in Cambridgeshire. That organisation is on the frontline of the fight against child sexual abuse material, and is one of only a handful of non-law enforcement bodies worldwide with the legal power to proactively seek out and remove online images and videos of such abuse. During my visit, the IWF told me that, in the preceding five years alone, it had taken down more than 1 million webpages that showed at least one child sexual abuse image—often, they showed hundreds or thousands. The IWF’s annual report last year revealed that 2025 was the worst year on record for child sexual abuse material. Its analysts confirmed 312,000 reports—a 7% rise on the year before. Most starkly, in 2024 they discovered 13 AI-generated videos of child sexual abuse, but in 2025 the figure was 3,440—a rise of over 26,000%, for those who are interested in numbers. Nearly two thirds of those videos were category A material, which is the most extreme classification.

A little while after my visit, I began to work with the Molly Rose Foundation on the proposal in this motion. At the time, the Online Safety Act 2023 had been in law for nearly two years, and the protection of children codes of practice that came from it, which promised to improve user safety dramatically, had just been published and implemented. The text of those codes was heavily criticised by civil society, and even by the Children’s Commissioner, who said they would simply not be strong enough to protect children from the

“multitude of harms they are exposed to online every day.”

It seemed timely for a motion to be brought before the House so that we could scrutinise the Online Safety Act and its resultant codes, as they are now being used in practice, and highlight to the Government the need to take action in this Parliament to protect young people. After the codes were implemented in mid-2025, the Mental Health Foundation published research stating that 68% of young people had experienced harmful content online. It described the harm as one of

“the biggest looming threats to young people’s mental health”.

In October 2025, the Molly Rose Foundation found that over a third of children reported that they had been exposed to at least one type of high-risk content in the past week. In a classroom of 30 children, that is 11 who are, every day, being shown content that promotes suicide and self-harm or that romanticises depression and eating disorders. That is the exact “primary priority content” that the UK’s flagship piece of online safety legislation explicitly promised it would protect them from. Just this week, the BBC aired “Inside the Rage Machine”, which used whistleblower testimony and evidence to lay bare how social media giants such as Meta and TikTok are consistently and deliberately pushing harmful content to users, after finding that their outrage fuelled engagement.

All of that is to say that if the motion for this debate seemed appropriate at the beginning of this Parliament, when I first visited the IWF, it is now urgent. Every week, I hear from parents, young people and organisations who are fighting a losing battle against the proliferation of online harms because, despite its noble aims, the current legislation is falling short of what Parliament envisaged it would do.

Leigh Ingham (Stafford) (Lab)

Last week, I ran a supermarket surgery in my constituency. I had a flip chart that asked whether people felt that social media should be banned for under-16s. It is rare to get this level of agreement, but 78% of my constituents of all ages—older people, young people and even children—said yes. What was consistent was the fear they felt about this space and the belief that it is doing damage to young people as they grow up. I am not 100% sure on my position yet, but does the hon. Member agree that the Government are right to consult to work out the best option to protect young people from social media?

Ian Sollom

The text of the motion asks for a review, and that is certainly what I want to see.

I have not come here today to stir up panic or to imply that the wellbeing of our children, or indeed our adults, is doomed. There is hope and we should not have to accept harm as a reality of life on the internet. As the Molly Rose Foundation chief executive officer, Andy Burrows, noted this week after campaigning pushed both TikTok and Meta to row back on plans for end-to-end encryption in direct messaging,

“tech firms are not immune to pressure”.

However, pressure on its own is not enough. The Government must urgently look at strengthening the Online Safety Act to ensure that pressure has robust legislative backing behind it, and that Ofcom actually has the power to enforce the regulations that will protect us all from harm.

Online harm comes in three forms. First, there is harmful content: the outright illegal and the extreme, posted and peddled by bad actors across social media platforms. Then we have harmful interactions with bad actors, including grooming, cyber-bullying and extortion. I am sure that Members across the House will share many stories of the impact of both types of harm today; it is a tragedy just how many there are. I want to focus on the third form of online harm, which is the harm that arises from not just the type of content encountered online, but the intensity with which it is repeatedly pushed on to young people by the platforms themselves.

This week, I was pleased to participate in the Royal Society pairing scheme. I was paired with Dr Lizzy Winstone, a researcher from the University of Bristol whose work focuses on how young people use social media and its impact on their mental health. Her most recent research investigates the algorithmic recommendation of content as one of the primary mechanisms that shapes young people’s digital mental health. She and others have found that a large part of online harm is structural, arising from not just individual bad actors, but business models designed at their very core to maximise attention and to profit from provocation.

Social media is built to be addictive. Hooking users in and keeping them engaged is at the very heart of almost every platform’s business model. Algorithmic models cause harm through both overtly harmful content and content that is harmless on the face of it. There are attention deficit harms caused by passive screen watching and health harms associated with an increasingly sedentary lifestyle. Higher social media use has been directly linked to shorter sleep duration and difficulties with sleep onset. Gambling harm is often overlooked, but a recent Guardian investigation found that Meta AI was pointing vulnerable social media users to illegal online casinos and even suggesting ways to bypass UK gambling safeguards. Regulation is clearly not keeping pace with the evolving digital landscape.

Often, it is the directly harmful, even illegal, content that is caught up in these algorithms. The shock, disgust and strong emotion inevitably caused by this content creates engagement: we watch for longer, we engage more, and the algorithm takes this as permission to show us even more of it to keep us hooked. Endless scrolling functionalities allow already vulnerable users to fall into a world where there is no escape from this cycle. Members will be aware that we Liberal Democrats have long called for platforms to implement built-in caps on social media doomscrolling.

In 2017, it was concluded for the first time ever that content on social media had contributed to the death of a young person when teenager Molly Russell tragically took her own life. Before she died, she had viewed thousands of suicide and self-harm videos and images on Pinterest and Instagram, some of which were pushed to her without her asking to see them. The word used by the coroner was that Molly was able—even encouraged by platforms—to “binge” this content.

The normalisation of these recommendation mechanisms has created an awful, self-perpetuating cycle. One case study from the University of Bristol described a 17-year-old girl who was forcing herself to repeatedly watch graphic content of a gory accident on TikTok to try to desensitise herself to violence. She knew that she would be regularly exposed to this kind of content online and wanted to train herself to be able to watch it and not feel sick. We can only assume that due to her increased attention, she was shown even more of this horrific content.

Recommendation systems in and of themselves are no bad thing. They create a personalised space to explore interests and sometimes do filter out content that a user has no interest in. The problem is that a user’s engagement with content does not always indicate their actual interest in it. Another young person from the University of Bristol study—a trans man—described feeling compelled to intervene in homophobic and transphobic comments sections, to try to support his community and challenge prejudice. He was understood by the platform to have engaged, and subsequently he was bombarded with more and more of the same hateful content. The tension between knowing that his algorithm would register his intervention as interest and wanting to actively challenge hateful views was a constant source of stress online.

Problems also arise from a lack of transparency. Not only are social media platforms under no obligation to publish their algorithms, but with AI increasingly being used to build and continually iterate these algorithms, the platforms themselves are often unaware of the exact mechanisms that shape experience. Harm is occurring as a result of an unaccountable black box. Young people are not entirely passive in this system—they know it is happening—but platform tools provide very limited control over what the algorithm continues to recommend.

Looking at Ofcom’s summary of the protection of children codes of practice, we can see how a weak interpretation of the Online Safety Act is allowing such harm to be perpetuated. Volume 4, section 17 says that platforms must

“Ensure content recommender systems are designed and operated so that content indicated potentially to be PPC”—

primary priority content, which is suicide, self-harm, eating disorders and mental health content—

“is excluded from the recommender feeds of children”.

Research shows that children were most likely to report having seen harmful content through feeds with recommender systems—very few actively seek it out—so the intention behind this measure seems good. But then we see that it applies only to “child-accessible” parts of a service that are

“medium or high risk for one or more specific kinds of PPC”.

In Ofcom’s December review, not a single social media platform rated itself high risk for suicide or self-harm content. There is a clear gap between the intention of the legislation and how it is being implemented. That is because the Online Safety Act and its codes are ultimately built around compliance and not harm reduction. Rules-based legislation means that platforms can happily meet their legal duties if measures in the codes are followed, and they are under no obligation to effectively and proactively address the harms identified in their risk assessments. Putting only a moral duty on platforms to protect young people from harm is not going to work—we have seen for years that it does not work.

How can we expect the very same platforms that have been shown to deliberately and knowingly peddle harmful content to young people to essentially police themselves? Why would they bother when it is so much more profitable to tick already loosely defined boxes? A full review of the current legislation must investigate the barriers that Ofcom says are preventing it from delivering on the intentions of Parliament. That includes the safe harbour principle, which allows platforms to claim compliance and skirt enforcement action on harms about which they are already aware, and the complete lack of any obligation in the Act that platforms take active steps to reduce the risk of harm to users. In practice, that means that a platform can follow Ofcom’s codes to the letter, even while its own risk assessment shows that it is aware of serious ongoing harm, and face no enforcement consequences.

Amendments could be passed within months to introduce the robust, risk-based minimum age limits that we Liberal Democrats have been calling for. Minimum joining ages should be determined by a platform-specific assessment of age appropriateness and risk. That will incentivise the market to adopt lower-risk functionalities if platforms wish to open themselves to a wider pool of users.

We could argue that a review of sorts has already taken place: every coroner’s report, every tragic story told in the Chamber and every investigation by charities and organisations makes up that review. The evidence is plainly there, but the harm is being allowed to continue. We are here as Members of Parliament to scrutinise, and we have done that. There have been 12 debates with the words “online safety” in the title this Parliament and there have been hundreds of references to “online harm”, yet there has been little indication that the Government are addressing the core issues raised in this debate.

I hope that Members will use this debate to raise the full range of harms we hear about in our work. I ask the Minister to respond specifically to these questions: will the Government examine whether the safe harbour principle is serving Parliament’s original intentions or has become a mechanism that platforms use to avoid accountability for harms about which they are already aware? Will the Government commit to ensuring that any new legislation this Parliament brings forward is built around harm reduction and not compliance?

Rural Mobile Connectivity

Leigh Ingham Excerpts
Thursday 12th February 2026


Commons Chamber
Leigh Ingham (Stafford) (Lab)

I thank the hon. Member for North Shropshire (Helen Morgan) for bringing forward this important debate. Mobile connectivity remains a real concern for many in rural communities, especially in my constituency of Stafford, Eccleshall and the villages. It affects how people run their businesses, how they keep in touch with family and how they keep safe, as the hon. Lady said, particularly on isolated roads and farms.

I will begin by acknowledging the great progress that has already been made in this area. My experience of the shared rural network, which has extended coverage to around 280,000 homes and businesses after £500 million of investment, has been positive. I recognise the importance of that investment and the difference it has made to Staffordshire.

My constituency is about 40% rural, with 60% of people living in towns, and there is such a marked difference in the experience of connectivity. The gap is still quite stark: nearly half of rural deprived areas are not classed as 5G hotspots, compared with just 2.7% of urban deprived communities. In rural areas, only 20% of mobile masts have 5G deployed, compared with 48% in urban areas. Those figures show that although coverage may look strong on national averages, rural communities are feeling the gap and feeling left behind.

The issue is raised with me and my team regularly. Residents in Maer, Whitmore and Acton have all contacted me about unreliable mobile access, which leads to dropped calls, weak indoor signal and stretches of road with no coverage at all. That is just part of their daily lives when living in rural areas.

Sarah Edwards (Tamworth) (Lab)

My hon. Friend is making an excellent speech. Similarly, in my constituency, I surveyed residents in a number of villages such as Edingale, Clifton Campville and Harlaston to ask how bad the situation was. Some 49% said that their mobile connectivity was so bad that they could not work from home or even run their business. Does she agree that this really has to be a priority so that our villages are not left behind?

Leigh Ingham

I thank my hon. Friend. As fellow Staffordshire MPs, we experience broadly similar issues, and that echoes exactly what many people in my constituency have told me, particularly about working from home.

For farming businesses in particular, of which I have many in my constituency, the impact is even more clear. The National Farmers Union’s most recent survey found that only 22% of respondents report reliable mobile signal across their entire farm, and nearly one in 10 have no 4G or 5G access at all. At the same time, 98% said that mobile signal is important to their business. Here in this House, we regularly ask farmers to access schemes online, communicate digitally with our agencies and adopt new technology, yet many operate with poor or patchy connectivity. The gap between need and access is stark.

In the village of Church Eaton, residents endured years of very poor mobile coverage, at times unable to make 999 calls or receive NHS alerts, despite a mast having already been built under the shared rural network. The infrastructure was there, but it had not been switched on, which left the village in limbo and left residents—let’s be honest—really annoyed. Working closely with the determined residents—I pay tribute to them and the parish council that has campaigned on this for many years—I raised the issue in Parliament and VodafoneThree’s leadership got in contact directly to help get that site back into the company’s investment plan. I am pleased that following that joint effort and constructive engagement, the mast has been running since September, bringing reliable 4G coverage to the village for the first time, but it should not take an MP standing here for that to happen. The infrastructure was already there; the village was not waiting for it. The mast has meant stronger coverage not only for VodafoneThree customers in the village, but for customers of a wide variety of signal providers.

I want to place on the record my thanks again to the parish councils across my constituency that have worked on this issue for years. They regularly gather evidence, engage with providers and keep the issue alive. Their persistence is invaluable in making progress in this space, because as more and more public services move online, access to stable mobile and broadband connectivity becomes even more important.

Rural communities should not be left waiting while national averages improve on paper. We need faster delivery of the shared rural network to eliminate the total notspots, alongside support for a mix of technologies to reach hard-to-access areas. Most importantly, rural communities must have confidence that they are not an afterthought in any roll-out plans. People living in villages, on farms and down country lanes deserve the same reliable connectivity as anyone else, and closing that gap is essential to fairness and productivity, and to increasing opportunity in rural Britain. We have some wonderful businesses and local farms that want to develop, but a lack of connectivity can hold them back. I would love to hear what steps the Minister is taking to advance mobile connectivity and involve rural communities moving forward.

Neil O’Brien (Harborough, Oadby and Wigston) (Con)

On a point of order, Madam Deputy Speaker. For the last 18 months, the Government have been sitting on the guidance relating to gender-questioning children in schools—a very controversial subject—which the Government keep saying is coming. It has become apparent in the last half an hour that they plan to publish this guidance at 4 pm today, just moments before the House goes into recess for a week. It is hard to see this as anything other than a deliberate attempt to avoid the scrutiny of this House on an important issue. What can we do to put this right?

Oral Answers to Questions

Leigh Ingham Excerpts
Wednesday 8th January 2025


Commons Chamber
Peter Kyle

I think the hon. Gentleman missed the investment summit that the Government held just before Christmas, at which a record £60 billion was invested into this country, £24 billion of which was AI-related. That is almost as much going directly into AI as was committed in total at the previous Government’s investment summit. This Government are unlocking investment; the previous Administration wrecked our economy and public services, and failed to secure faith in our economy for foreign companies to invest in this country.

Leigh Ingham (Stafford) (Lab)

6. What steps his Department is taking to increase levels of innovation in Staffordshire.

The Parliamentary Under-Secretary of State for Science, Innovation and Technology (Feryal Clark)

In case the House has not heard, this Government are driving innovation, with a record £20.4 billion of research and development investment for 2025-26, powering an innovation-led economy across the UK. In Staffordshire, UK Research and Innovation is backing more than £29 million for 70 cutting-edge research and innovation projects. A stand-out example is Innovate UK’s support for the Staffordshire net zero skills for growth project, which is equipping the country to seize opportunities in the net zero transition.

Leigh Ingham

Towns such as those in my constituency are key to the economy, but can face unique challenges in accessing innovation opportunities. Please could the Minister tell me how she plans to ensure that towns such as Stafford and Eccleshall are able to access new jobs, skills, investment and growth opportunities?

Feryal Clark

The Department has a clear vision to ensure that the UK remains at the forefront of global innovation—a place where cutting-edge businesses of all sizes can start and grow, and where local people have high-quality jobs, building on local strengths. I am delighted to hear about the new multimillion-pound facility being built at Newcastle and Stafford colleges’ Stafford campus in my hon. Friend’s constituency, supported by £15 million of Government investment. It will welcome learners from September and will help to provide the technical skills that businesses need, both now and in the future, to support regional and national productivity.

Online Safety: Children and Young People

Leigh Ingham Excerpts
Tuesday 26th November 2024


Westminster Hall


Leigh Ingham (Stafford) (Lab)

Huge congratulations to my hon. Friend the Member for Darlington (Lola McEvoy) for securing this debate, which I know is of grave concern not only for my constituents in Stafford, Eccleshall and the villages, but for parents and caregivers throughout the country.

I am concerned that online harm has a disproportionate impact on girls and young women. Take, for example, the report just mentioned on exposure to harmful content: it found that 60% of girls aged 11 to 16 said that they had received negative comments about their appearance online. I am very concerned about that growing impact on young people, particularly girls and young women.

Even more troubling is the increase in severe online abuse, such as grooming. In cases where the victim’s gender was identified between 2023 and 2024, an overwhelming 81% of the children targeted were girls. I believe the increase in online harm to be directly connected to the increase in violence against women and girls.

I therefore join calls for significantly enhanced rules on social media platforms to safeguard our young people. That must tackle both the blunt and sharp ends of online harm: the insidious exposure to harmful content and the more direct and egregious abuses, such as grooming.