Grand Committee

My Lords, these regulations were laid before the House on 12 September this year. The Government stated in their manifesto that they would
“use every government tool available to target perpetrators and address the root causes of abuse and violence”
in order to achieve their
“landmark mission to halve violence against women and girls in a decade”.
Through this statutory instrument, we are broadening online platforms’ and search engines’ responsibilities for tackling intimate image abuse under the Online Safety Act. More than one in three women have experienced abuse online. The rise in intimate image abuse is not only devastating for victims but also spreads misogyny on social media that can develop into potentially dangerous relationships offline. One in 14 adults in England and Wales has experienced threats to share intimate images, rising to one in seven among young women aged 18 to 34.
It is crucial that we tackle these crimes from every angle, including online, and ensure that tech companies step up and play their part. That is why we are laying this statutory instrument. As noble Lords will know, the Online Safety Act received Royal Assent on 26 October 2023. It places strong new duties on online user-to-user platforms and search services to protect their users from harm.
As part of this, the Act gives service providers new “illegal content duties”. Under these duties, online platforms need to assess the risk that their services will allow users to encounter illegal content or be
“used for the commission or facilitation of a priority offence”.
They then need to take steps to mitigate identified risks. These will include implementing safety-by-design measures to reduce risks and content moderation systems to remove illegal content where it appears.
The Online Safety Act sets out a list of priority offences for the purposes of providers’ illegal content duties. These offences reflect the most serious and prevalent online illegal content and activity. They are set out in schedules to the Act. Platforms will need to take additional steps to tackle these kinds of illegal activities under their illegal content duties.
The priority offences list currently includes certain intimate image abuse offences. Through this statutory instrument, we are adding new intimate image abuse offences to the priority list. This replaces an old intimate image abuse offence, which has now been repealed. These new offences are in the Sexual Offences Act 2003. They took effect earlier this year. The older offence was in the Criminal Justice and Courts Act 2015. The repealed offence covered sharing intimate images where the intent was to cause distress. The new offences are broader; they criminalise sharing intimate images without having a reasonable belief that the subject would consent to sharing the images. These offences include the sharing of manufactured or manipulated images, including so-called deepfakes.
Since these new offences are more expansive, adding them as priority offences means online platforms will be required to tackle more intimate image abuse on their services. This means that we are broadening the scope of what constitutes illegal intimate image content in the Online Safety Act. It also makes it clear that platforms’ priority illegal content duties extend to AI-generated deepfakes and other manufactured intimate images. This is because the new offences that we are adding explicitly cover this content.
As I have set out, these changes affect the illegal content duties in the Online Safety Act. They will ensure that tech companies play their part in kicking this content off social media. They are just one part of a range of wider protections coming into force next spring through the Online Safety Act, which will mean that social media companies have to remove the most harmful illegal content, much of which disproportionately affects women and girls, such as harassment and controlling or coercive behaviour.
Ofcom will set out the specific steps that providers can take to fulfil their illegal content duties for intimate image abuse and other illegal content in codes of practice and guidance, which it is currently producing. We anticipate that the new duties will start to be enforced from spring next year, once Ofcom has issued these codes of practice and they have come into force. Providers will also need to have completed their illegal content risk assessments by then. We anticipate that Ofcom will recommend that providers take action in a number of areas, including content moderation, reporting and complaints procedures, and safety-by-design steps, such as testing their algorithmic systems to see whether illegal content is being recommended to users. We are committed to working with Ofcom to get these protections in place as quickly as possible. We are focused on delivering.
Where companies are not removing and proactively stopping this vile material appearing on their platforms, Ofcom will have robust powers to take enforcement action against them. This includes imposing fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is higher.
In conclusion, through this statutory instrument we are broadening providers’ duties for intimate image abuse content. Service providers will need to take proactive steps to search for, remove and limit people’s exposure to this harmful kind of illegal content, including where it has been manufactured or manipulated. I hope noble Lords will agree that these measures take the provisions in the Online Safety Act a useful step forward. I commend these regulations to the Committee, and I beg to move.
My Lords, I thank the Minister for her introduction. I endorse everything she said about intimate image abuse and the importance of legislation to make sure that the perpetrators are penalised and that social media outlets have additional duties under Schedule 7 for priority offences. I am absolutely on the same page as the Minister on this, and I very much welcome what she said. It is interesting that we are dealing with another 2003 Act that, again, is showing itself fit for purpose and able to be amended; perhaps there is some cause to take comfort from our legislative process.
I was interested to hear what the Minister said about the coverage of the offences introduced by the Online Safety Act. She considered that the offence of sharing sexually explicit material covers deepfakes. There was a promise—the noble Viscount will remember it—that the Criminal Justice Bill, which was not passed in the end, would cover that element. That proposed offence included intent, like the current offence—the one that has been incorporated into Schedule 7. The Private Member’s Bill of the noble Baroness, Lady Owen—I have it in my hand—explicitly introduces an offence that does not require intent, and I very much support that.
I do not believe that this is the last word to be said on the kinds of intimate image abuse offence that need to be incorporated as priority offences under Schedule 7. I would very much like to hear what the noble Baroness has to say about why we require intent when, quite frankly, the creation of these deepfakes requires activity that is clearly harmful. We clearly should make sure that the perpetrators are caught. Given the history of this, I am slightly surprised that the Government’s current interpretation of the new offence in the Online Safety Act includes deepfakes. It is gratifying, but the Government nevertheless need to go further.
My Lords, I welcome the Minister’s remarks and the Government’s step in introducing this SI. However, I have concerns that it misses wider problems. The powers given to Ofcom in the Online Safety Act require a lengthy process to implement and are not able to respond quickly. They also do not provide individuals with any redress. Therefore, this SI adding to the list of priority offences, while necessary, does not give victims the recourse they need.
My concern is that Ofcom is approaching this digital problem in an analogue way. It has the power to fine and even disrupt business but, in a digital space—where, when one website is blocked, another can open immediately—Ofcom would, in this scenario, have to restart its process all over again. These powers are not nimble or rapid enough, and they do not reflect the nature of the online space. They leave victims open and exposed to continuing distress. I would be grateful if the Government offered some assurances in this area.
The changes miss the wider problem of non-compliance by host websites outside the UK. As I have previously discussed in your Lordships’ House, the Revenge Porn Helpline has a removal rate of 90% of reported non-consensual sexually explicit content, both real and deepfake. However, in 10% of cases, the host website will not comply with the removal of the content. These sites are often hosted in countries such as Russia or those in Latin America. In cases of non-compliance by host websites, the victims continue to suffer, even where there has been a successful conviction.
Take the example of a man who was convicted in the UK of blackmailing 200 women: the Revenge Porn Helpline successfully removed 161,000 images, but 4,000 still remain online three years later, with platforms continuing to ignore takedown requests. I would be grateful if the Government could outline how they are seeking to tackle the removal of this content, featuring British citizens, hosted in jurisdictions where host sites are not complying with removal requests.
Grand Committee

My Lords, this order was laid before the House on 9 September this year. The Online Safety Act lays the foundations of strong protection for children and adults online. I am grateful to noble Lords for their continued interest in the Act and its implementation. It is critical that the Act is made fully operational as soon as possible, and the Government are committed to delivering its protections without delay. This statutory instrument will further support the implementation of the Act by Ofcom.
This statutory instrument concerns Ofcom’s ability, under Section 393 of the Communications Act 2003, to share business information with Ministers for the purpose of fulfilling functions under the Online Safety Act 2023. It corrects an oversight in the original Online Safety Act that was identified following its passage.
Section 393 of the Communications Act 2003 contains a general restriction on Ofcom disclosing information about particular businesses without consent from the affected businesses, subject to exemptions, including where disclosure facilitates Ofcom in carrying out its regulatory functions or facilitates other specified persons in carrying out specific functions. However, this section does not currently enable Ofcom to share information with Ministers for the purpose of fulfilling functions under the Online Safety Act. This means that, were Ofcom to disclose information about businesses to the Secretary of State, it could be in breach of the law.
It is important that a gateway exists for sharing information for these purposes so that the Secretary of State can carry out functions under the Online Safety Act, such as setting the fee threshold for the online safety regime in 2025 or carrying out post-implementation reviews of the Act required under Section 178. This statutory instrument will therefore amend the Communications Act 2003 to allow Ofcom to share information with the Secretary of State and other Ministers, strictly for the purpose of fulfilling functions under the Online Safety Act 2023.
There are strong legislative safeguards and limitations on the disclosure of this information, and Ofcom is experienced in handling confidential and sensitive information obtained from the services it regulates. Ofcom must comply with UK data protection law and would need to show that the processing of any personal data was necessary for a lawful purpose. As a public body, Ofcom is also required to act compatibly with the Article 8 right of privacy under the European Convention on Human Rights.
We will therefore continue to review the Online Safety Act, so that Ofcom is able to support the delivery of functions under the Act where it is appropriate. That is a brief but detailed summary of why this instrument is necessary. I should stress that it contains a technical amendment to deal with a very small legal aspect. Nevertheless, I will be interested to hear noble Lords’ comments on the SI. I beg to move.
My Lords, I thank the Minister for her introduction and for explaining the essence of the SI. We all have a bit of pride of creation in the Online Safety Act; there are one or two of us around today who clearly have a continuing interest in it. This is one of the smaller outcomes of the Act and, as the Minister says, it is essentially an oversight. I would say that a tidying-up operation is involved here. It is rather gratifying to see that the Communications Act still has such importance, 21 years after it was passed. It is somewhat extraordinary for legislation to be invoked after that period of time in an area such as communications, which is so fast-moving.
My question for the Minister is whether the examples that she gave, or which were contained in the Explanatory Memorandum, regarding the need for information to be obtained by the Secretary of State in respect of Section 178, on reviewing the regulatory framework, and Section 86, on the threshold for payment of fees, are exhaustive. Are there other aspects of the Online Safety Act for which the Secretary of State requires such information?
We are always wary of the powers given to Secretaries of State, as the noble Viscount, Lord Camrose, will probably remember to his cost. But at every point, the tyres on legislation need to be kicked to make sure that the Secretary of State has just the powers that they need—and that we do not go further than we need to or have a skeleton Bill, et cetera—so the usual mantra will apply: we want to make sure that the Secretary of State’s powers are proportionate.
It would be very useful to hear from the Minister what other powers are involved. Is it quite a number? Were these two just the most plausible, or are there six other sets of powers which might not be so attractive? That is the only caveat I would make in this respect.
Lords Chamber

The noble Lord is absolutely right. The scale of violent images featuring women and girls in our country is intolerable, and this Government will treat it as the national emergency it is. The noble Lord will be pleased to hear that the Government have set out an unprecedented mission to halve violence against women and girls within a decade. We are using every government tool we have to target the perpetrators and address the root causes of violence. That involves many legislative and non-legislative measures, as the noble Lord will appreciate, including tackling the education issue. However, ultimately, we have to make sure that the legislation is robust and that we take action, which we intend to do.
My Lords, as the Minister and others have mentioned, there is considerable and increasing concern about deepfake pornographic material, particularly the so-called nudification apps, which can be easily accessed by users of any age. What action will the Government be taking against this unacceptable technology, and will an offence be included in the forthcoming crime and policing Bill?
The noble Lord raises an important point. Where nudification apps and other material do not come under the remit of the Online Safety Act, we will look at other legislative tools to make sure that all new forms of technology—including AI and its implications for online images—are included in robust legislation, in whatever form it takes. Our priority is to implement the Online Safety Act, but we are also looking at what other tools might be necessary going forward. As the Secretary of State has said, this is an iterative process; the Online Safety Act is not the end of the game. We are looking at what further steps we need to take, and I hope the noble Lord will bear with us.
Lords Chamber

My Lords, it is a pleasure to follow the noble Lord, Lord Hunt, and a particular pleasure to follow so closely the comprehensive introduction by our excellent former chair, the noble Lord, Lord Hollick.
As the noble Lord alluded to, the Grenfell report and today’s Statement have been an extremely sobering reminder of the importance of effective regulation and the effective oversight of regulators. The principal job of regulation is to ensure societal safety and benefit—in essence, mitigating risk. In that context, the performance of the UK regulators, as well as the nature of regulation, is crucial.
In the early part of this year, the spotlight was on regulation and the effectiveness of our regulators. Our report was followed by a major contribution to the debate from the Institute for Government. We then had the Government’s own White Paper, Smarter Regulation, which seemed designed principally to take the growth duty established in 2015 even further with a more permissive approach to risk and a “service mindset”, and risked creating less clarity with yet another set of regulatory principles going beyond those in the Better Regulation Framework and the Regulators’ Code.
Our report was, however, described as excellent by the Minister for Investment and Regulatory Reform in the Department for Business and Trade under the previous Government, the noble Lord, Lord Johnson of Lainston, whom I am pleased to see taking part in the debate today. I hope that the new Government will agree with that assessment and take our recommendations further forward.
Both we and the Institute for Government identified a worrying lack of scrutiny of our regulators—indeed, a worrying lack of even identifying who our regulators are. The NAO puts the number of regulators at around 90 and the Institute for Government at 116, but some believe that there are as many as 200 that we need to take account of. So it is welcome that the previous Government’s response said that a register of regulators, detailing all UK regulators, their roles, duties and sponsor departments, was in the offing. Is this ready to be launched?
The crux of our report was to address performance, strategic independence and oversight of UK regulators. In exploring existing oversight, accountability measures and the effectiveness of parliamentary oversight, it was clear that we needed to improve self-reporting by regulators. However, a growth duty performance framework, as proposed in the White Paper, does not fit the bill.
Regulators should also be subject to regular performance evaluations, as we recommended; these reviews should be made public to ensure transparency and accountability. To ensure that these are effective, we recommended, as the noble Lord, Lord Hollick, mentioned, establishing a new office for regulatory performance—an independent statutory body analogous to the National Audit Office—to undertake regular performance reviews of regulators and to report to Parliament. It was good to see that, similar to our proposal, the Institute for Government called for a regulatory oversight support unit in its subsequent report, Parliament and Regulators.
As regards independence, we had concerns about the potential politicisation of regulatory appointments. Appointment processes for regulators should be transparent and merit-based, with greater parliamentary scrutiny to avoid politicisation. Although strategic guidance from the Government is necessary, it should not compromise the operational independence of regulators.
What is the new Government’s approach to this? Labour’s general election manifesto emphasised fostering innovation and improving regulation to support economic growth, with a key proposal to establish a regulatory innovation office in order to streamline regulatory processes for new technologies and set targets for tech regulators. I hope that that does not take us down the same trajectory as the previous Government. Regulation is not the enemy of innovation, or indeed growth, but can in fact, by providing certainty of standards, be the platform for it.
At the time of our report, the IfG rightly said:
“It would be a mistake for the committee to consider its work complete … new members can build on its agenda in their future work, including by fleshing out its proposals for how ‘Ofreg’ would work in practice”.
We should take that to heart. There is still a great deal of work to do to make sure that our regulators are clearly independent of government, are able to work effectively, and are properly resourced and scrutinised. I hope that the new Government will engage closely with the committee in their work.
Lords Chamber

I thank my noble friend for those good wishes. Of course, he is raising a really important issue of great concern to all of us. During the last election, we felt that the Government were well prepared to ensure the democratic integrity of our UK elections. We did have robust systems in place to protect against interference, through the Defending Democracy Taskforce and the Joint Election Security and Preparedness unit. We continue to work with the Home Office and the security services to assess the impact of that work. Going forward, the Online Safety Act goes further by putting new requirements on social media platforms to swiftly remove illegal misinformation and disinformation, including where it is AI-generated, as soon as it appears. We are still assessing the need for further legislation in the light of the latest intelligence, but I assure my noble friend that we take this issue extremely seriously. It affects the future of our democratic process, which I know is vital to all of us.
My Lords, I welcome the creation of an AI opportunities plan, announced by the Government, but, as the noble Lord, Lord Knight, says, we must also tackle the risks. In other jurisdictions across the world, including the EU, AI-driven live facial recognition technology is considered to seriously infringe the right to privacy and have issues with accuracy and bias, and is being banned or restricted for both law enforcement and business use. Will the Government, in their planned AI legislation, provide equivalent safeguards for UK citizens and ensure their trust in new technology?
I thank the noble Lord for that question and for all the work he has done on the AI issue, including his new book, which I am sure is essential reading over the summer for everybody. I should say that several noble Lords in this Chamber have written books on AI, so noble Lords might want to consider that for their holiday reading.
The noble Lord will know that the use and regulation of live facial recognition is for each country to decide. In the UK it is already governed by data protection, equality and human rights legislation, supplemented by specific police guidance. It is absolutely vital that it is used only where necessary, proportionate and fair. We will continue to look at the legislation and at whether privacy is being sufficiently protected. That is an issue that will come forward when the future legislation is being prepared.