Westminster Hall
Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.
Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.
This information is provided by Parallel Parliament and does not comprise part of the official record
Mr Richard Quigley (Isle of Wight West) (Lab)
It is a beyond fantastic pleasure to serve under your chairship, Ms Jardine. I thank my hon. Friend the Member for Preston (Sir Mark Hendrick) for securing such an important and extremely timely debate. As we have seen over the past weeks, months and years, the exploitation of women and girls comes in many forms; the digital landscape does not automatically protect them from physical harm. We have seen recently how easily predators can sexually exploit women and girls through deepfakes on social media, and how these actions are actively promoted by platforms. That exploitation can also translate into the exploitation of physical insecurities, which can lead women and girls down a dangerous and unattainable path.
Unrealistic body standards in advertising are not a new phenomenon, but AI has deepened the problem. It is beyond the pale: girls are now exposed to imagery promoting body types that quite literally defy the laws of physics. Such content is not something that they stumble on; it is actively pushed at them through targeted advertising. Once a social media platform identifies a user as a girl or a young woman, the algorithm begins promoting harmful material directly into her feed.
A recent Government report found that 19% of girls aged between 14 and 16 had been exposed to harmful content promoting extreme thinness—nearly three times the figure reported by their male classmates. A leaked internal report from Meta went further, indicating that its eating disorder-related algorithm is more likely to target users who have previously engaged with content about body dissatisfaction. That is a malicious example of how these companies, which by the way are fully aware of these risks, continue to exploit the insecurities of women and girls.
Some of the most distressing cases of deepfakes and technology-facilitated sexual abuse have resulted in women losing their jobs, their reputations and even access to their children. These incidents send an appalling message that perpetrators can wield overwhelming power over their victims. They confirm the very fears that many survivors live with every day.
If social media and AI companies are willing to stand by while their algorithms amplify content about dangerous eating disorders and give abusers free rein to harm women and their children, we must act. This is a significant fight, but one that I and many of my colleagues are absolutely committed to continuing. If we are not vigilant, we risk allowing our digital world to be shaped by the misogynistic impulses of figures like Elon Musk and the wider manosphere, rather than building an online world that safeguards and protects women and girls.