Lords Chamber
To ask His Majesty’s Government whether they plan to prohibit “nudify” apps which create intimate images of other people using artificial intelligence without their consent.
My Lords, the Online Safety Act introduced new offences which criminalised the sharing of, or threatening to share, intimate images, including deepfakes, without consent. Where individuals create these images using any kind of technology and share or threaten to share them online, they may be committing an offence. The Act will additionally give online platforms new duties to tackle this content by removing it, including where it has been created via AI apps.
I thank my noble friend the Minister for his Answer. There has been a huge increase in the use of nudify apps and the creation of deepfake pornography since the Law Commission stated that it was less sure that the level of harm caused by the making of these images and videos was serious enough to criminalise. Does my noble friend agree that the making of these images and videos without a person’s consent does in fact cause serious harm, regardless of whether that person is aware of it, and that, if allowed to continue, it represents a real threat to all women?
I start by acknowledging that the creation of intimate image deepfakes using AI or other means is abusive and deeply distressing to anyone concerned and very disturbing to all of us. The Law Commission consulted widely on this, looking at the process of taking, making, possessing and sharing deepfakes, and its conclusion was that the focus of legislative effort ought to be on sharing, which it now is. That said, this is a fast-moving space. The capabilities of these tools are growing rapidly and, sadly, the number of users is growing rapidly, so we will continue to monitor that.