Digital Exploitation of Women and Girls Debate

Department: Home Office


Lola McEvoy Excerpts
Tuesday 27th January 2026


Westminster Hall

Lola McEvoy (Darlington) (Lab)

It is a pleasure to serve under your chairmanship, Ms Jardine. The argument for urgent and robust regulation to protect girls online has been won, thanks in no small part to the grit and resilience of survivors who have spoken out, and of the Minister herself, who is a formidable force in this area.

I began campaigning on this issue in defence of 14-year-old girls. Being 14 is tough: your hormones are wild, your body is changing, you take risks and you are desperate to belong. It has not been that long—although it is longer than I care to state to the House—since I was 14, so I do remember it. Some people advocate for an imaginary time gone by, filled with innocence, skipping ropes and cross-stitch, but that world was not real for most girls. Most 14-year-old girls will take risks, will keep secrets, will have a crush or 10, and will say mean things they should not say. That is part of growing up.

What we have allowed to happen to our girls, through the explosion of unsupervised stranger contact and self-published content, is utterly appalling. It is not normal, and we must take action now. I will outline as quickly as possible why I think we must take action to ban any form of stranger contact for under-16s online and why self-published content and functionalities that publish unregulated and unvetted content need to be banned for under-16s, to stop the exploitation.

The first meeting I had when I was elected was with the headteachers of secondary schools in Darlington, about online safety. I wanted to hear what the real issue was in Darlington and how severe it was, because so many parents had raised it with me. The results from the first forum, in which a thousand pupils came forward, were worse than I had feared: 60% of girls had known someone who had been bullied or blackmailed online, 53% of girls had been contacted by somebody lying about their age, 49% had been asked for pictures or personal information, compared with 28% of boys, and over 70% had been contacted by a stranger. One 14-year-old girl told me that they had been added to groups of strangers and that extreme content was then shared. That was on an app where you are not supposed to have contact from strangers.

The platforms say that adults should no longer be able to contact under-16s, but it is obvious that it is still happening. The checks are easy to get past: recent analysis of the ban in Australia showed that many children had drawn on moustaches and coloured in fake beards in order to pass the facial recognition age verification test. I urge the Minister to look into that and to offer her support for a more rigorous ban.

Obviously, strangers should not be able to contact girls under 16. The secondary point about enforcement is quite clear, but there is also a point that has not yet been made to the House, about self-publishing and online safety. Where we see self-published content, we see organised criminal networks grooming children through links and through more enticing content that might lead them into darker spaces that are even less regulated. I urge the Minister to support work to address that.

Finally, we need to address the deep imbalance in who pays the price for online extortion involving images of girls. Girls who make a single mistake are made to pay a permanent price. That is obviously wrong. Their image could be circulated at any time, and the threat of that is unbearable: it can be used anywhere and shared with any number of people throughout their life. That is brutal enough, but as one girl in the forum said to me, “It’s so wrong. The girl’s always blamed. She’s totally responsible.” Can the Minister outline how we can do more to support girls who are victims and survivors?

--- Later in debate ---
Marie Goldman (Chelmsford) (LD)

It is a pleasure to serve under your chairship, Ms Jardine. I thank the hon. Member for Preston (Sir Mark Hendrick) for securing this very important debate.

All of us here recognise that rapid technological change is creating new risks and contexts for the exploitation of women and girls. The speed, anonymity and ease of communication provided by social media, combined with increasingly sophisticated AI tools, have led to new forms of sexual abuse emerging at alarming rates.

In 2025, the Internet Watch Foundation discovered 3,440 AI videos of child sexual abuse, a vast increase on the previous year, when only 13 such videos were found. That is why this debate, and our swift action, are so important. Far more needs to be done to keep women and girls safe online, and doing so is becoming at once more difficult and more urgent by the day. If we do not act swiftly, we cannot be surprised when new technologies exploit our lack of urgency.

Lola McEvoy

Does the hon. Member have any observations on the fact that technology has always outstripped legislation, and that this is actually about accountability and the enforcement of regulation?

Marie Goldman

Absolutely, and that has always been the case. Equally, we need to learn from the fact that it has always been the case and not be surprised when these things happen. We must not wring our hands and say, “There is harm being done—what could we possibly do about it?” We need to think smarter than that and bring in legislation that is much more forward-thinking and adaptable, and enables swifter action.

As hon. Members have already pointed out in this debate, digital abuse and exploitation are overwhelmingly targeted at women and girls. Research from Internet Matters found that 99% of new deepfakes are of women and girls. Moreover, according to the Revenge Porn Helpline, 98% of intimate images reported to its service were of women and 99% of deepfake intimate image abuse depicts women. It has also been discovered that many AI nudification tools do not actually work on images of boys and men.

We have now reached a point where AI tools embedded in major platforms are capable of producing sexual abuse material, demonstrating serious failings in our current framework. X’s AI tool, Grok, is a case in point. We have talked about this many times before. Grok facilitated the illegal generation and circulation of non-consensual sexual images, yet Ofcom’s response was, I am sorry to say, woefully slow. The executive summary of the violence against women and girls strategy states that it will

“ensure that the UK has one of the most robust responses to perpetrators of VAWG in the world.”

I agree with that intention, but we must recognise that Ofcom’s response was not wholly robust. We must do something about that; we owe it to women and girls in this country to act sooner and stronger. We need more effective legislation and a regulator with the capability and confidence to take appropriate and, crucially, swift action.

I welcome the move to make the creation of non-consensual intimate AI images a priority offence under the Online Safety Act, but that will be effective only if online platforms and services are held accountable under that Act. My Liberal Democrat colleagues and I have called on the National Crime Agency to launch an urgent criminal investigation into X, which should still happen, and to treat the generation of illegal sexual abuse material with the seriousness it demands. We must act decisively when social media platforms refuse to comply with the law.

It is also time to introduce age ratings for online platforms and to limit harmful social media to over-16s. How can we expect to tackle violence against women and girls when the next generation is being drip-fed misogynistic content on social media?

--- Later in debate ---
Sarah Bool (South Northamptonshire) (Con)

It is a pleasure to serve under your chairmanship, Ms Jardine. Digital exploitation does not affect women and girls exclusively, but, given that four in five victims of online grooming are girls, it is an issue that we must focus on. As MPs, we are all aware of the risks and threats that women face in the online sphere. It is no surprise that the National Society for the Prevention of Cruelty to Children found that only 9% of girls feel safe in online spaces. The accounts of stalking given earlier are terrifying, especially those involving Ring doorbells, which are designed to keep people safe; that they would be manipulated in that way is horrific. The case of Holly in the constituency of the hon. Member for Hexham (Joe Morris), which involved Snapchat, is frankly horrifying.

There is no doubt that the complexity of the online world has resulted in significant digital exploitation. At this very moment, online content is being produced that takes advantage of women for financial gain. That is particularly worrying given that, according to Ofcom’s 2025 report on the time people spend online, women are spending more time than men across an array of websites. The issues around the digital exploitation of women and girls are particularly prominent on social media sites: over half of girls and women report receiving sexist comments about themselves online. This is a problem on an industrial scale.

The recent Grok sexual imagery debacle brought this into sharp focus. It demonstrated the dangers posed to women who had not even engaged with the technology. People merely used an existing image to take advantage of the technology and spread it using the power of social media. I welcome steps to stop it, but are we equipped to handle the changing digital landscape in the future? The Online Safety Act introduced key changes to the Sexual Offences Act 2003 and criminalised sharing intimate images of another person without their consent. The Government are now adding provisions to the Act to make it a criminal offence to create non-consensual intimate images. Do the Government believe that that will be sufficient, and that Ofcom has the necessary powers to stop this abhorrent practice?

What I have seen from the Government so far is a reactive approach to AI and how it relates to women and girls. The technology is undoubtedly here to stay, but given the uncertainty of its development, is the Minister confident that the Government’s approach is sufficiently agile to prevent people from taking advantage of the technology to exploit women and girls?

As we have heard, AI is only one part of the problem: social media is driving much of the digital exploitation of women and girls. Data from 44 forces provided to the NSPCC showed that the police recorded 7,263 “sexual communications with a child” offences in the last year, a number that has almost doubled since the offence came into force in 2017-18. Data from the Crime Survey for England and Wales showed a 6% increase in child exploitation offences compared with the previous year, on top of evidence linking these platforms to girls being twice as likely as boys to experience anxiety. Recent data shows that girls who use social media at the age of 11 report greater distrust of other people at the age of 14.

The problem is only growing. Every day that the Government delay is another day that millions of girls are left at risk. We do not need further reviews or consultations; we need a ban on social media for under-16s. It is time to grip this issue.

Lola McEvoy

Will the hon. Lady elaborate on her definitions of “social media” and “ban”?

Sarah Bool

By social media, I mean platforms such as Facebook and Snapchat; I am not talking about WhatsApp, which is a communication platform that many families use, although we have to be careful how it is used, because images can be shared on it.

A ban is about ensuring that children cannot access these platforms. The issue has been raised at different levels. The problem is the content that children can see, and especially the way the algorithms are used. I recognise that the companies also need to take responsibility for what is being accessed and how people are accessing it, because this is going on at a scale larger than any parent could imagine. This is not the social media that we grew up on, where we used to post a little note on a wall for our friend’s birthday or upload photos from a night out—that is definitely not what children are seeing.

Lola McEvoy

My problem with the hon. Lady’s argument is that we have constantly said that our legislation is lagging behind technological advances, but the proposed solution is to name a number of platforms where there is evidence of exploitation, crime and damage. I agree that we need to do that, but is it not better to make evergreen legislation, as some Members have argued, than to have a list of examples that somebody else has come up with?

Sarah Bool

I agree, but we need to take action now on the platforms that we are aware of. Our legislation absolutely needs to be much more agile for the future, and I am not saying that a ban will be a silver bullet, but it will protect many girls from digital exploitation. That is why I am asking the Minister to follow the policy set out by the Conservative party, which was accepted in the House of Lords, and prohibit those under the age of 16 from using social media. If we do not put our children into those arenas, they will be far less likely to be exposed to the exploitation that stems from the internet and targets the young and the vulnerable.

If the Government support those measures, they could act without delay. Let me be clear: the challenges posed by digital exploitation will not vanish if we prohibit the use of social media, but a ban would be a bulwark against the dangers that social media poses, particularly to young people. If we allow people to access these platforms only when they are more mature and better educated, we can hopefully reduce exploitation.