Digital Exploitation of Women and Girls Debate
Westminster Hall
Marie Goldman (Chelmsford) (LD)
It is a pleasure to serve under your chairship, Ms Jardine. I thank the hon. Member for Preston (Sir Mark Hendrick) for securing this very important debate.
All of us here recognise that rapid technological change is creating new risks and contexts for the exploitation of women and girls. The speed, anonymity and ease of communication provided by social media, combined with increasingly sophisticated AI tools, have led to new forms of sexual abuse emerging at alarming rates.
In 2025, the Internet Watch Foundation discovered 3,440 AI videos of child sexual abuse, a vast increase on the previous year, when only 13 such videos were found. That is why this debate, and our swift action, are so important. Far more needs to be done to keep women and girls safe online, and doing so is becoming at once more difficult and more urgent by the day. If we do not act swiftly, we cannot be surprised when perpetrators exploit new technologies while we fail to keep pace.
Lola McEvoy
Does the hon. Member have any observations on the fact that technology has always outstripped legislation, and that this is actually about accountability and the enforcement of regulation?
Marie Goldman
Absolutely, and that has always been the case. Equally, we need to learn from the fact that it has always been the case and not be surprised when these things happen. We must not wring our hands and say, “There is harm being done—what could we possibly do about it?” We need to think smarter than that and bring in legislation that is much more forward-thinking and adaptable, and enables swifter action.
As hon. Members have already pointed out in this debate, digital abuse and exploitation are overwhelmingly targeted at women and girls. Research from Internet Matters found that 99% of new deepfakes are of women and girls. Moreover, according to the Revenge Porn Helpline, 98% of intimate images reported to its service are of women, and 99% of deepfake intimate image abuse depicts women. It has also been discovered that many AI nudification tools do not actually work on images of boys and men.
We have now reached a point where AI tools embedded in major platforms are capable of producing sexual abuse material, demonstrating serious failings in our current framework. X’s AI tool, Grok, is a case in point. We have talked about this many times before. Grok facilitated the illegal generation and circulation of non-consensual sexual images, yet Ofcom’s response was, I am sorry to say, woefully slow. The executive summary of the violence against women and girls strategy states that it will
“ensure that the UK has one of the most robust responses to perpetrators of VAWG in the world.”
I agree with that intention, but we must recognise that Ofcom's response was not wholly robust. We must do something about that; we owe it to women and girls in this country to act more quickly and more forcefully. We need more effective legislation and a regulator with the capability and confidence to take appropriate and, crucially, swift action.
I welcome the move to make the creation of non-consensual intimate AI images a priority offence under the Online Safety Act, but that will be effective only if online platforms and services are held accountable under that Act. My Liberal Democrat colleagues and I have called on the National Crime Agency to launch an urgent criminal investigation into X, which should still happen, and to treat the generation of illegal sexual abuse material with the seriousness it demands. We must act decisively when social media platforms refuse to comply with the law.
It is also time that we introduce age ratings for online platforms and limit harmful social media to over-16s. How can we expect to tackle violence against women and girls when the next generation is being drip-fed misogynistic content on social media?
The hon. Member is right. Does she agree that online pornography remains an issue that needs to be tackled? The statistics show that more than 50% of young boys aged 11 to 13 have already seen pornography, and that it is shaping their understanding of what consent means.
Marie Goldman
There are so many aspects to this problem. What we, the parents, saw in the fledgling days of social media is not at all what our children are seeing now. We need to recognise that and act against it. What our children see online is already affecting their worldview. Internet Matters research from 2023 found that 42% of children aged nine to 16 had a favourable or neutral view of the well-known misogynistic influencer Andrew Tate, and that older teenage boys were particularly susceptible. That is incredibly worrying. Decisive action to tackle the digital exploitation of women and girls is needed across the board. Online harm is genuine harm, and we must treat it as such. There is a lot of work to do, but I am keen to work cross-party to get it done. I hope the Minister is too.