(3 weeks, 4 days ago)
Grand Committee
My Lords, I thank the Minister for her introduction. I endorse everything she said about intimate image abuse and the importance of legislation to make sure that the perpetrators are penalised and that social media outlets have additional duties under Schedule 7 for priority offences. I am absolutely on the same page as the Minister on this, and I very much welcome what she said. It is interesting that we are dealing with another 2003 Act that, again, is showing itself fit for purpose and able to be amended; perhaps there is some cause to take comfort from our legislative process.
I was interested to hear what the Minister said about the coverage of the offences introduced by the Online Safety Act. She considered that the sharing of sexually explicit material included deepfakes. There was a promise—the noble Viscount will remember it—that the Criminal Justice Bill, which was not passed in the end, would cover that element. It included intent, like the current offence—the one that has been incorporated into Schedule 7. The Private Member’s Bill of the noble Baroness, Lady Owen—I have it in my hand—explicitly introduces an offence that does not require intent, and I very much support that.
I do not believe that this is the last word to be said on the kinds of IIA offence that need to be incorporated as priority offences under Schedule 7. I would very much like to hear what the noble Baroness has to say about why we require intent when, quite frankly, the creation of these deepfakes requires activity that is clearly harmful. We clearly should make sure that the perpetrators are caught. Given the history of this, I am slightly surprised that the Government’s current interpretation of the new offence in the Online Safety Act includes deepfakes. It is gratifying, but the Government nevertheless need to go further.
My Lords, I welcome the Minister’s remarks and the Government’s step to introduce this SI. However, I am concerned that it misses wider problems. The powers given to Ofcom in the Online Safety Act require a lengthy process to implement and are not able to respond quickly. They also do not provide individuals with any redress. Therefore, this SI adding to the list of priority offences, while necessary, does not give victims the recourse they need.
My concern is that Ofcom is approaching this digital problem in an analogue way. It has the power to fine and even disrupt business but, in a digital space—where, when one website is blocked, another can open immediately—Ofcom would, in this scenario, have to restart its process all over again. These powers are not nimble or rapid enough, and they do not reflect the nature of the online space. They leave victims open and exposed to continuing distress. I would be grateful if the Government offered some assurances in this area.
The changes miss the wider problem of non-compliance by host websites outside the UK. As I have previously discussed in your Lordships’ House, the Revenge Porn Helpline has a removal rate of 90% of reported non-consensual sexually explicit content, both real and deepfake. However, in 10% of cases, the host website will not comply with the removal of the content. These sites are often hosted in countries such as Russia or those in Latin America. In cases of non-compliance by host websites, the victims continue to suffer, even where there has been a successful conviction.
If we take the example of a man who was convicted in the UK of blackmailing 200 women, the Revenge Porn Helpline successfully removed 161,000 images but 4,000 still remain online three years later, with platforms continuing to ignore the take-down requests. I would be grateful if the Government could outline how they are seeking to tackle the removal of this content, featuring British citizens, hosted in jurisdictions where host sites are not complying with removal.
(1 month, 1 week ago)
Lords Chamber
I thank my noble friend for her question. We are absolutely determined to keep children safe online and to use the Online Safety Act to provide protection across all the categories under its jurisdiction. Ofcom’s draft guidance lays out which technologies could constitute, for example, highly effective age assurance to protect children, and it will have a full range of enforcement powers to take action against companies that do not follow the duties, including substantial fines. I absolutely agree with my noble friend that robustness is key here. I think some people are frustrated that some of the duties in the Online Safety Act are taking time to be rolled out, but it was a feature of the Act that it would be done on that basis. We are very keen, as everybody in the House is, to see it enacted in full as soon as it can be.
My Lords, the Revenge Porn Helpline has a removal rate of 90% of non-consensually shared intimate content, including deepfakes. However, in 10% of cases, the host site will not comply with its removal, even where there has been a successful conviction. These sites are often hosted in Russia and Latin America, and are unlikely to come under Ofcom’s scope, even with the changes that make sharing a priority offence. Can the Minister inform the House what action the Government are taking to address non-compliance, and does she agree that it would be better to adopt a rapid and wide-ranging approach—favoured by victims—to deem NCII content illegal, thus giving internet service providers the power to block it?
I thank the noble Baroness for her continuing interest in this issue and her campaigning work. The Government have already put forward secondary legislation to ensure that the new intimate image abuse offence is made a priority under the Online Safety Act, and all other acts of deepfake portrayal will come under the Act if they are illegal. Going back to the earlier question about robustness, we absolutely expect Ofcom to implement those protections in a robust way.
(2 months, 1 week ago)
Lords Chamber
My noble friend makes the important point that international co-operation is absolutely vital. We continue to talk to all our friends across the globe, exchanging information and making sure that best practice arises from those discussions.
My Lords, research by Vodafone found that algorithms are pushing content to boys related to misogyny and violence following innocent and unrelated searches. Can the Minister say whether the Government are looking into how these algorithms have been used not only to push misinformation and disinformation but to push people towards and reinforce more extreme views?
My Lords, deepfakes and other forms of manipulated media are captured by the Online Safety Act where they constitute illegal content or harmful content to children in scope of the regulatory framework. Under the Act, all companies will be forced to take action against illegal content online, including illegal misinformation and disinformation, and they will be required to remove in-scope content. These duties will also apply to in-scope AI-generated content and AI-powered features.