(1 year, 5 months ago)
Lords Chamber

My Lords, Amendments 238A and 238D seek to change the parliamentary process for laying—oh, I am skipping ahead with final day of Report enthusiasm.
As noble Lords know, companies will fund the costs of Ofcom’s online safety functions through annual fees. This means that the regime which the Bill ushers in will be cost neutral to the taxpayer. Once the fee regime is operational, regulated providers with revenue at or above a set threshold will be required to notify Ofcom and to pay a proportionate fee. Ofcom will calculate fees with reference to the provider’s qualifying worldwide revenue.
The Delegated Powers and Regulatory Reform Committee of your Lordships’ House has made two recommendations relating to the fee regime, which we have accepted; the amendments we are discussing in this group reflect that. In addition, we are making a further change to definitions to ensure that Ofcom can collect proportionate fees.
A number of the amendments in my name relate to qualifying worldwide revenue. Presently, the Bill outlines that this should be defined in a published statement laid before Parliament. Your Lordships’ committee advised that it should be defined through regulations subject to the affirmative procedure. We have agreed with this and are proposing changes to Clause 76 so that Ofcom can make provisions about qualifying worldwide revenue by regulations which, as per the committee’s recommendations, will be subject to the affirmative procedure.
Secondly, the committee recommended that we change the method by which the revenue threshold is defined. Presently, as set out in the Bill, it is set by the Secretary of State in a published statement laid before Parliament. The committee recommended that the threshold be set through regulations subject to the negative procedure and we are amending Clause 77 to make the recommended change.
Other amendments seek to make a further change to enable Ofcom to collect proportionate fees from providers. A provider of a regulated service whose qualifying worldwide revenue is equal to, or greater than, the financial threshold will be required to notify Ofcom and pay an annual fee, calculated by reference to its qualifying worldwide revenue. Currently, this means that the fee calculation can be based only on the revenue of the regulated provider. The structure of some technology companies, however, means that how they accrue revenue is not always straightforward. The entity which meets the definition of a provider may therefore not be the entity which generates revenue referable to the regulated service.
Regulations to be made by Ofcom about the qualifying worldwide revenue will therefore be able to provide that the revenue accruing to certain entities in the same group as a provider of a regulated service can be taken into account for the purposes of determining qualifying worldwide revenue. This will enable Ofcom, when making such regulations, to make provision, if necessary, for instances where a provider has a complex group structure; for example, where the regulated provider might accrue only a portion of the revenue referable to the regulated service, the rest of which might be accrued by other entities in the group structure. These amendments to Clause 76 address these issues by allowing Ofcom to make regulations which provide that the revenue from certain other entities within the provider’s group structure can be taken into account. I beg to move.
My Lords, we have not talked much about fees in our consideration of the Bill, and I will not talk much about them today, but there are some important questions. We should not skip too lightly over the fact that we will be levying fees on online providers; that might have a significant impact on the markets. I have some specific questions about the proposed worldwide revenue method, but I welcome these amendments and the fact that we will now have a better procedure. This will also allow the Minister to say, “All these detailed points can be addressed when these instruments come before Parliament”. That is a good development. However, there are three questions that are worth putting on the record now so that we have time to think about them.
First, what consideration will be given to the impact on services that do not follow a classic revenue model but instead rely on donations and other sorts of support? I know that we will come back to this question in a later group but there are some very large internet service providers that are not the classic advertising-funded model, instead relying on foundations and other things. They will have significant questions about what we would judge their qualifying worldwide revenue to be, given that they operate to these very different models.
The second question concerns the impact on services that may have a very large footprint outside the UK, and significant worldwide revenues, but which do very little business within the UK. The amendment that the Minister has tabled about group revenues is also relevant here. You can imagine an entity which is part of a very large worldwide group making very significant revenues around the world, with a relatively small subsidiary offering a service in the UK with relatively low revenues. There are some important questions there around the potential impact of the fees on decision-making within that group. We have discussed how we do not want to end up with less choice for consumers of services in the UK. There is an interesting question there as to whether getting the fee level wrong might lead to worldwide entities saying, “If you’re going to ask me to pay a fee based on my qualifying worldwide revenue, the UK market is just not worth it”. That may be particularly true if, for example, the European Union and other markets are also levying a fee. You can see a rational business choice of, “We’re happy to pay the fee to the EU but not to Ofcom if it is levied at a rate that is disproportionate to the business that we do here”.
The third and very topical question is about the Government’s thinking about services with declining revenues but whose safety needs are not reducing and may even be increasing. I hope as I say this that people have Twitter in mind, which has very publicly told us that its revenue is going down significantly. It has also very publicly fired most of its trust and safety staff. You can imagine a model within which, because its revenue is declining, it is paying less to Ofcom precisely when Ofcom needs to do more supervision of it.
I hope that we can get some clarity around the Government’s intentions in these circumstances. I have referenced three areas where the qualifying worldwide revenue calculation may go a little awry. The first is where the revenue is not classic commercial income but comes from other sources. The second is where the footprint in the UK is very small but the company is otherwise a large global one which we might worry will withdraw from the market. The third, and perhaps most important, is where a company’s revenue is declining while it is managing its platform less well and its need for Ofcom supervision is increasing; what would the Government intend to happen to the fee level in those circumstances?
I appreciate the tone of the Minister’s comments very much, but they are not entirely reassuring me. There is a debate going on out there: there are people saying, “We’ve got these fabulous technologies that we would like Ofcom to order companies to install”, and there are companies saying, “That would be disastrous and break encryption if we had to install them”. That is a dualistic situation in which a contest is going on. My amendment seeks to make sure that the conflict can be properly resolved. I do not think Ofcom on its own can ever do that, because Ofcom will always be defending what it is doing and saying, “This is fine”. So there has to be some other mechanism whereby people can say it is not fine and contest that. As I say, in this debate we are ignoring the fact that these disputes are already out there: people saying, “We think you should deploy this”, and companies saying, “It would be disastrous if we did”. We cannot resolve that by just saying, “Trust Ofcom”.
To meet the expectation the noble Lord voiced earlier, I will indeed point out that Ofcom can consult the ICO as a skilled person if it wishes to. It is important that we square the circle and look at these issues, and the ICO will be able to be involved in that way.
Before I conclude, I want to address my noble friend Lady Harding’s questions on skilled persons. Given that notices will be issued on a case-by-case basis, and Ofcom will need to look at the specific service design and existing systems of a provider to work out how a particular technology would interact with them, a skilled person’s report better fits this process by requiring Ofcom to obtain tailored advice rather than general technical advice from an advisory board. The skilled person’s report will be largely focused on the technical side of Ofcom’s assessment: that is to say, how the technology would interact with the service’s design and existing systems. In this way, it offers something similar to, but more tailored than, a technical advisory board. Ofcom already has a large and expert technology group, whose role it is to advise policy teams on new and existing technologies, to anticipate the impact of technologies and so on. It already has strong links with academia and with external researchers. A technical advisory board would duplicate that function. I hope that reassures my noble friend that the points she raised have been taken into account.
So I hope the noble Lord, Lord Allan, will not feel the need to divide—
My Lords, in moving Amendment 262A, I will speak also to the other government amendments in the group. These amendments address the Bill’s enforcement powers. Government Amendments 262A, 262B, 262C, 264A and 266A, Amendments 265, 266 and 267, tabled by my noble friend Lord Bethell, and Amendment 268 tabled by the noble Lord, Lord Stevenson of Balmacara, relate to senior management liability. Amendment 268C from the noble Lord, Lord Weir of Ballyholme, addresses interim service restriction orders.
In Committee, we amended the Bill to create an offence of non-compliance with steps set out in confirmation decisions that relate to specific children’s online safety duties, to ensure that providers and individuals can be held to account where their non-compliance risks serious harm to children. Since then, we have listened to concerns raised by noble Lords and others, in particular that the confirmation decision offence would not tackle child sexual exploitation and abuse. That is why the government amendments in this group will create a new offence of a failure to comply with a child sexual exploitation and abuse requirement imposed by a confirmation decision. This will mean that providers and senior managers can be held liable if they fail to comply with requirements to take specific steps as set out in Ofcom’s confirmation decision in relation to child sexual exploitation and abuse on their service.
Ofcom must designate a step in a confirmation decision as a child sexual exploitation and abuse requirement, where that step relates, whether or not exclusively, to a failure to comply with specific safety duties in respect of child sexual exploitation and abuse content. Failure to comply with such a requirement will be an offence. This approach is necessary, given that steps may relate to multiple or specific kinds of illegal content, or systems and process failures more generally. This approach will ensure that services know from the confirmation decision when they risk criminal liability, while providing sufficient legal certainty via the specified steps to ensure that the offence can be prosecuted effectively.
The penalty for this offence is up to two years in prison, a fine or both. Through Clause 182, where an offence is committed with the consent or connivance of a senior manager, or is attributable to his or her neglect, the senior manager, as well as the entity, will have committed the offence and can face up to two years in prison, a fine or both.
I thank my noble friend Lord Bethell, as well as our honourable friends Miriam Cates and Sir William Cash in another place, for their important work in raising this issue and their collaborative approach as we have worked to strengthen the Bill in this area. I am glad that we have reached a position that will help to keep children safe online and drive a change in culture in technology companies. I hope this amendment reassures them and noble Lords that the confirmation decision offence will tackle harms to children effectively by ensuring that technology executives take the necessary steps to keep children safe online. I beg to move.
My Lords, I will briefly comment positively on the Minister’s explanation of how these offences might work, particularly the association of the liability with the failure to enforce a confirmation decision, which seems entirely sensible. In an earlier stage of the debate, there was a sense that we might associate liability with more general failures to enforce a duty of care. That would have been problematic, because the duty of care is very broad and requires a lot of pieces to be put in place. Associating the offences with the confirmation decision makes absolute sense. Having been in that position, if, as an executive in a tech company, I received a confirmation decision that said, “You must do these things”, and I chose wilfully to ignore that decision, it would be entirely reasonable for me to be held potentially criminally liable for that. That association is a good step forward.
Yes, even if the content is not harmful. We keep saying “content” because it is the way the content is disseminated, as the Bill sets out, but the features and functionalities can increase the risks of harm as well. We have addressed this through looking at the cumulative effects and in other ways.
This is the key question. For example, let us take a feature that is pushing something at you constantly; if it was pushing poison at you then it would obviously be harmful, but if it was pushing marshmallows then they would be individually not harmful but cumulatively harmful. Is the Minister saying that the second scenario is still a problem and that the surfeit of marshmallows is problematic and will still be captured, even if each individual marshmallow is not harmful?
Yes, because the cumulative harm—the accumulation of marshmallows in that example—has been addressed.
Noble Lords should also be aware that the drafting of Amendment 281FA has the effect of saying that harm can arise from proposed new paragraphs (a), (b) and (c)—for example, from the
“age or characteristics of the likely user group”.
In effect, being a child or possessing a particular characteristic may be harmful. This may not be the intention of the noble Baronesses who tabled the amendment, but it highlights the important distinction between something being a risk factor that influences the risk of harm occurring and something being harmful.
The Government are clear that these aspects should properly be treated as risk factors. Other parts of the Bill already make it clear that the ways in which a service is designed and used may impact on the risk of harm suffered by users. I point again to paragraphs (e) to (h) of Clause 10(6): paragraph (e) talks about the level of risk of functionalities of the service, paragraph (f) talks about the different ways in which the service is used, and so on.
We have addressed these points in the Bill, though clearly not to the satisfaction of my noble friend, the noble Baroness, Lady Kidron, and others. As we conclude Report, I recognise that we have not yet convinced everyone that our approach achieves what we all seek, though I am grateful for my noble friend’s recognition that we all share the same aim in this endeavour. As I explained to the noble Baroness, Lady Kidron, on her Amendment 35, I was asking her not to press it because, if she did, the matter would have been dealt with on Report and we would not be able to return to it at Third Reading.
As the Bill heads towards another place with this philosophical disagreement still bubbling away, I am very happy to commit to continuing to talk to your Lordships—particularly when the Bill is in another place, so that noble Lords can follow the debates there. I am conscious that my right honourable friend Michelle Donelan, who has had a busy maternity leave and has spoken to a number of your Lordships while on leave, returns tomorrow in preparation for the Bill heading to her House. I am sure she will be very happy to speak even more when she is back fully at work, but we will both be happy to continue to do so.
I think it is appropriate, in some ways, that we end on this issue, which remains an area of difference. With that promise to continue these discussions as the Bill moves towards another place, I hope that my noble friend will be content not to press these amendments, recognising particularly that the noble Baroness, Lady Kidron, has already inserted this thinking into the Bill for consideration in the other House.
My Lords, transparency and accountability are at the heart of the regulatory framework that the Bill seeks to establish. It is vital that Ofcom has the powers it needs to require companies to publish online safety information and to scrutinise their systems and processes, particularly their algorithms. The Government agree about the importance of improving data sharing with independent researchers while recognising the nascent evidence base and the complexities of this issue, which we explored in Committee. We are pleased to be bringing forward a number of amendments to strengthen platforms’ transparency, which confer on Ofcom new powers to assess how providers’ algorithms work, which accelerate the development of the evidence base regarding researchers’ access to information and which require Ofcom to produce guidance on this issue.
Amendment 187 in my name makes changes to Clause 65 on category 1 providers’ duties to create clear and accessible terms of service and to apply them consistently and transparently. The amendment tightens the clause to ensure that all the terms through which a provider might indicate that a certain kind of content is not allowed on its service are captured by these duties.
Amendment 252G is a drafting change, removing a redundant paragraph from the Bill in relation to exceptions to the legislative definition of an enforceable requirement in Schedule 12.
In relation to transparency, government Amendments 195, 196, 198 and 199 expand the types of information that Ofcom can require category 1, 2A and 2B providers to publish in their transparency reports. With thanks to the noble Lord, Lord Stevenson of Balmacara, for his engagement on this issue, we are pleased to table these amendments, which will allow Ofcom to require providers to publish information relating to the formulation, development and scope of user-to-user service providers’ terms of service and search service providers’ public statements of policies and procedures. This is in addition to the existing transparency provision regarding their application.
Amendments 196 and 199 would enable Ofcom to require providers to publish more information in relation to algorithms, specifically information about the design and operation of algorithms that affect the display, promotion, restriction, discovery or recommendation of content subject to the duties in the Bill. These changes will enable greater public scrutiny of providers’ terms of service and their algorithms, providing valuable information to users about the platforms that they are using.
As well as publicly holding platforms to account, the regulator must be able to get under the bonnet and scrutinise the algorithms’ functionalities and the other systems and processes that they use. Empirical tests are a standard method for understanding the performance of an algorithmic system. They involve taking a test data set, running it through an algorithmic system and observing the output. These tests may be relevant for assessing the efficacy and wider impacts of content moderation technology, age-verification systems and recommender systems.
Government Amendments 247A, 250A, 252A, 252B, 252C, 252D, 252E and 252F will ensure that Ofcom has the powers to enable it to direct and observe such tests remotely. This will significantly bolster Ofcom’s ability to assess how a provider’s algorithms work, and therefore to assess its compliance with the duties in the Bill. I understand that certain technology companies have voiced some concerns about these powers, but I reassure your Lordships that they are necessary and proportionate.
The powers will be subject to a number of safeguards. First, they are limited to viewing information. Ofcom will be unable to remotely access or interfere with the service for any other purpose when exercising the power. These tests would be performed offline, meaning that they would not affect the services’ provision or the experience of users. Assessing systems, processes, features and functionalities is the focus of the powers. As such, individual user data and content are unlikely to be the focus of any remote access to view information.
Additionally, the power can be used only where it is proportionate to use in the exercise of Ofcom’s functions—for example, when investigating whether a regulated service has complied with relevant safety duties. A provider would have a right to bring a legal challenge against Ofcom if it considered that a particular exercise of the power was unlawful. Furthermore, Ofcom will be under a legal obligation to ensure that the information gathered from services is protected from disclosure, unless clearly defined exemptions apply.
The Bill contains no restriction on services making the existence and detail of the information notice public. Should a regulated service wish to challenge an information notice served on it by Ofcom, it would be able to do so through judicial review. In addition, the amendments create no restriction on the use of this power being made known to members of the public through a request, such as one under the Freedom of Information Act—noting that, under Section 393 of the Communications Act, Ofcom will not be able to disclose information it has obtained through its exercise of these powers without the provider’s consent, unless permitted for specific, defined purposes. These powers are necessary and proportionate and will ensure that Ofcom has the tools to understand features and functionalities and the risks associated with them, and therefore the tools to assess companies’ compliance with the Bill.
Finally, I turn to researchers’ access to data. We recognise the valuable work of researchers in improving our collective understanding of the issues we have debated throughout our scrutiny of the Bill. However, we are also aware that we need to develop the evidence base to ensure that any sharing of sensitive information between companies and researchers can be done safely and securely. To this end, we are pleased to table government Amendments 272B, 272C and 272D.
Government Amendment 272B would require Ofcom to publish its report into researcher access to information within 18 months, rather than two years. This report will provide the evidence base for government Amendments 272C and 272D, which would require Ofcom to publish guidance on this issue. This will provide valuable, evidence-based guidance on how to improve access for researchers safely and securely.
That said, we understand the calls for further action in this area. The Government will explore this issue further and report back to your Lordships’ House on whether further measures to support researchers’ access to data are required—and if so, whether they could be implemented through other legislation, such as the Data Protection and Digital Information Bill. I beg to move.
My Lords, Amendment 247B in my name was triggered by government Amendment 247A, which the Minister just introduced. I want to explain it, because the government amendment is quite late—it has arrived on Report—so we need to look in some detail at what the Government have proposed. The phrasing that has caused so much concern, which the Minister has acknowledged, is that Ofcom will be able to
“remotely access the service provided by the person”.
It is those words—“remotely access”—which are trigger words for anyone who lived through the Snowden disclosures, where everyone was so concerned about remote access by government agencies to precisely the same services we are talking about today: social media services.
My Lords, I am grateful to noble Lords for their contributions in this group. On the point made by the noble Lord, Lord Knight of Weymouth, on why we are bringing in some of these powers now, I say that the power to direct and observe algorithms was previously implicit within Ofcom’s information powers and, where a provider has UK premises, under powers of entry, inspection and audit under Schedule 12. However, the Digital Markets, Competition and Consumers Bill, which is set to confer similar powers on the Competition and Markets Authority and its digital markets unit, makes these powers explicit. We wanted to ensure that there was no ambiguity over whether Ofcom had equivalent powers in the light of that. Furthermore, the changes we are making ensure that Ofcom can direct and observe algorithmic assessments even if a provider does not have relevant premises or equipment in the UK.
I am grateful to the noble Lord, Lord Allan of Hallam, for inviting me to re-emphasise points and allay the concerns that have been triggered, as his noble friend Lord Clement-Jones put it. I am happy to set out again a bit of what I said in opening this debate. The powers will be subject to a number of safeguards. First, they are limited to “viewing information”. They can be used only where they are proportionate in the exercise of Ofcom’s functions, and a provider would have the right to bring a legal challenge against Ofcom if it considered that a particular exercise of the power was done unlawfully. Furthermore, Ofcom will be under a legal obligation to ensure that the information gathered from services is protected from disclosure, unless clearly defined exemptions apply.
These are not secret powers, as the noble Lord rightly noted. The Bill contains no restriction on services making the existence and detail of the information notice public. If a regulated service wished to challenge an information notice served to it by Ofcom, it would be able to do so through judicial review. I also mentioned the recourse that people have through existing legislation, such as the Freedom of Information Act, to give them safeguards, noting that, under Section 393 of the Communications Act, Ofcom will not be able to disclose information that it has obtained through its exercise of these powers without the provider’s consent unless that is permitted for specific, defined purposes.
The noble Lord’s Amendment 247B seeks to place further safeguards on Ofcom’s use of its new power to access providers’ systems remotely to observe tests. While I largely agree with the intention behind it, there are already a number of safeguards in place for the use of that power, including in relation to data protection, legally privileged material and the disclosure of information, as I have outlined. Ofcom will not be able to gain remote access simply for exploratory or fishing purposes, and indeed Ofcom expects to have conversations with services about how to provide the information requested.
Furthermore, before exercising the power, Ofcom will be required to issue an information notice specifying the information to be provided, setting out the parameters of access and why Ofcom requires the information, among other things. Following the receipt of an information notice, a notice requiring an inspection, or an audit notice, if a company has identified an obvious security risk in Ofcom exercising the power as set out in the notice, it may not be proportionate for Ofcom to do so. As set out in Ofcom’s duties, Ofcom must have regard to the principles under which regulatory activities should be proportionate and targeted only at cases where action is needed.
In line with current practice, we anticipate Ofcom will issue information notice requests in draft form to identify and address any issues, including in relation to security, before the information notice is issued formally. Ofcom will have a legal duty to exercise its remote access powers in a way that is proportionate, ensuring that undue burdens are not placed on businesses. In assessing proportionality in line with this requirement, Ofcom would need to consider the size and resource capacity of a service when choosing the most appropriate way of gathering information, and whether there was a less onerous method of obtaining the necessary information to ensure that the use of this power is proportionate. As I said, the remote access power is limited to “viewing information”. Under this power, Ofcom will be unable to interfere or access the service for any other purpose.
In practice, Ofcom will work with services during the process. It is required to specify, among other things, the information to be provided, which will set the parameters of its access, and why it requires the information, which will explain the link between the information it seeks and the online safety function that it is exercising or deciding whether to exercise.
As noble Lords know, Ofcom must comply with the UK’s data protection law. As we have discussed in relation to other issues, it is required to act compatibly with the European Convention on Human Rights, including Article 8 privacy rights. In addition, under Clause 91(7), Ofcom is explicitly prohibited from requiring the provision of legally privileged information. It will also be under a legal obligation to ensure that the information gathered from services is protected from disclosure unless clearly defined exemptions apply, such as those under Section 393(2) of the Communications Act 2003—for example, the carrying out of any of Ofcom’s functions. I hope that provides reassurance to the noble Lord, Lord Allan, and the noble Baroness, Lady Fox, who raised these questions.
I am grateful to the Minister. That was helpful, particularly the description of the process and the fact that drafts have to be issued early on. However, it still leaves open a couple of questions, one of which was very helpfully raised by the noble Lord, Lord Knight. We have in Schedule 12 this other set of protections that could be applied. There is a genuine question as to why this has been put in this place and not there.
The second question is to dig a little more into the question of what happens when there is a dispute. The noble Lord, Lord Moylan, pointed out that if you have created a backdoor then you have created a backdoor, and it is dangerous. If we end up in a situation where a company believes that what it is being asked to do by Ofcom is fundamentally problematic and would create a security risk, it will not be good enough to open up the backdoor and then have a judicial review. It needs to be able to say no at that stage, yet the Bill says that it could be committing a serious criminal offence by failing to comply with an information notice. We want some more assurances, in some form, about what would happen in a scenario where a company genuinely and sincerely believes that what Ofcom is asking for is inappropriate and/or dangerous and it wants not to have to offer it unless and until its challenge has been looked at, rather than having to offer it and then later judicially review a decision. The damage would already have been done by opening up an inappropriate backdoor.
A provider would have a right to bring a legal challenge against Ofcom if it considered that a particular exercise of the remote access power was unlawful. Given the serious nature of the issues under consideration, I am sure that would be looked at swiftly, but I will write to the noble Lord on the anticipated timelines while such a judicial review was pending.
We do not think that six weeks is enough time for the evidence base to develop sufficiently; our assessment is that to endow the Secretary of State with that power at this point would be premature.
Amendment 262AA would require Ofcom to consider whether it is appropriate to require providers to take steps to comply with Ofcom’s researcher access guidance when including a requirement to take steps in a confirmation decision. This would be inappropriate because the researcher access provisions are not enforceable requirements; as such, compliance with them should not be subject to enforcement by the regulator. Furthermore, enforcement action may relate to a wide variety of very important issues, and the steps needed should be sufficient to address a failure to comply with an enforceable requirement. Singling out compliance with researcher access guidance alone risks implying that this will be adequate to address core failures.
Amendment 272AB would require Ofcom to give consideration to whether greater access to data could be achieved through legal requirements or incentives for regulated services. I reassure noble Lords that the scope of Ofcom’s report will already cover how greater access to data could be achieved, including through enforceable requirements on providers.
Amendment 272E would require Ofcom to take a provider’s compliance with Ofcom’s guidance on researcher access to data into account when assessing risks from regulated services and determining whether to take enforcement action and what enforcement action to take. However, we do not believe that this is a relevant factor for consideration of these issues. I hope noble Lords will agree that whether or not a company has enabled researcher access to its data should not be a mitigating factor against Ofcom requiring companies to deal with terrorism or child sexual exploitation or abuse content, for example.
On my noble friend Lord Bethell’s remaining Amendments 272BA, 273A and 273B, the first of these would require Ofcom to publish its report on researchers’ access to information within six months. While six months would not be deliverable given other priorities and the complexity of this issue, the government amendment to which I have spoken would reduce the timelines from two years to 18 months. That recognises the importance of the issue while ensuring that Ofcom can deliver the key priorities in establishing the core parts of the regulatory framework; for example, the illegal content and child safety duties.
Just on the timescale, one of the issues that we talked about in Committee was the fact that there needs to be some kind of mechanism created, with a code of practice with reference to data protection law and an approving body to approve researchers as suitable to take information; the noble Baroness, Lady Kidron, referred to the DSA process, which the European Union has been working on. I hope the Minister can confirm that Ofcom might get moving on establishing that. It is not dependent on there being a report in 18 months; in fact, you need to have it in place when you report in 18 months, which means you need to start building it now. I hope the Minister would want Ofcom, within its existing framework, to be encouraging the creation of that researcher approval body and code of practice, not waiting to start that process in 18 months’ time.
I will continue my train of thought on my noble friend’s amendments, which I hope will cover that and more.
My noble friend’s Amendment 273A would allow Ofcom to appoint approved independent researchers to access information. Again, given the nascent evidence base here, it is important to focus on understanding these issues before we commit to a researcher access framework.
Under the skilled persons provisions, Ofcom will already have the powers to appoint a skilled person to assess compliance with the regulatory framework; that includes the ability to leverage the expertise of independent researchers. My noble friend’s Amendment 273B would require Ofcom to produce a code of practice on access to data by researchers. The government amendments I spoke to earlier will require Ofcom to produce guidance on that issue, which will help to promote information sharing in a safe and secure way.
To the question asked by the noble Lord, Lord Allan: yes, Ofcom can start the process and do it quickly. The question here is really about the timeframe in which it does so. As I said in opening, we understand the calls for further action in this area.
I am happy to say to my noble friend Lord Bethell, to whom we are grateful for his work on this and the conversations we have had, that we will explore the issue further and report back on whether further measures to support researchers’ access to data are required and, if so, whether they can be implemented through other legislation, such as the Data Protection and Digital Information (No.2) Bill.
My Lords, I am grateful for the opportunity to set out the need for Clauses 158 and 159. The amendments in this group consider the role of government in two specific areas: the power for the Secretary of State to direct Ofcom about its media literacy functions in special circumstances and the power for the Secretary of State to issue non-binding guidance to Ofcom. I will take each in turn.
Amendment 219 relates to Clause 158, on the Secretary of State’s power to direct Ofcom in special circumstances. These include where there is a significant threat to public safety, public health or national security. This is a limited power to enable the Secretary of State to set specific objectives for Ofcom’s media literacy activity in such circumstances. It allows the Secretary of State to direct Ofcom to issue public statement notices to regulated service providers, requiring providers to set out the steps they are taking to address the threat. The regulator and online platforms are thereby compelled to take essential and transparent actions to keep the public sufficiently informed during crises. The powers ensure that the regulatory framework is future-proofed and well equipped to respond in such circumstances.
As the noble Lord, Lord Clement-Jones, outlined, I corresponded with him very shortly before today’s debate and am happy to set out a bit more detail for the benefit of the rest of the House. As I said to him by email, we expect the media literacy powers to be used only in exceptional circumstances, where it is right that the Secretary of State should have the power to direct Ofcom. The Government see the need for an agile response to risk in times of acute crisis, such as we saw during the Covid-19 pandemic or in relation to the war in Ukraine. There may be a situation in which the Government have access to information, through the work of the security services or otherwise, which Ofcom does not. This power enables the Secretary of State to make quick decisions when the public are at risk.
Our expectation is that, in exceptional circumstances, Ofcom would already be taking steps to address harm arising from the provision of regulated services through its existing media literacy functions. However, these powers will allow the Secretary of State to step in if necessary to ensure that the regulator is responding effectively to these sudden threats. It is important to note that, for transparency, the Secretary of State will be required to publish the reasons for issuing a direction to Ofcom in these circumstances. This requirement does not apply should the circumstances relate to national security, to protect sensitive information.
The noble Lord asked why we have the powers under Clause 158 when they do not exist in relation to broadcast media. We believe that these powers are needed with respect to social media because, as we have seen during international crises such as the Covid-19 pandemic, social media platforms can sadly serve as hubs for low-quality, user-generated information that is not required to meet journalistic standards, and that can pose a direct threat to public health. By contrast, Ofcom’s Broadcasting Code ensures that broadcast news, in whatever form, is reported with due accuracy and presented with due impartiality. Ofcom can fine, or ultimately revoke a licence to broadcast in the most extreme cases, if that code is breached. This means that regulated broadcasters can be trusted to strive to communicate credible, authoritative information to their audiences in a way that social media cannot.
We established in our last debate that the notion of a recognised news publisher will go much broader than a broadcaster. I put it to the Minister that we could end up in an interesting situation where one bit of the Bill says, “You have to protect content from these people because they are recognised news publishers”. Another bit, however, will be a direction to the Secretary of State saying that, to deal with this crisis, we are going to give a media literacy direction that says, “Please get rid of all the content from this same news publisher”. That is an anomaly that we risk setting up with these different provisions.
My Lords, this is indeed an apposite day to be discussing ongoing ping-pong. I am very happy to speak enthusiastically and more slowly about my noble friend Lady Stowell of Beeston’s Amendments 139 and 140. We are happy to support those, subject to some tidying up at Third Reading. We agree with the points that she has made and are keen to bring something forward which would mean broadly that a statement would be laid before Parliament when the power to direct had been used. My noble friend Lady Harding characterised them as the infinite ping-pong question and the secretive ping-pong question; I hope that deals with the secretive ping-pong point.
My noble friend Lady Stowell’s other amendments focus on the infinite ping-pong question, and the power to direct Ofcom to modify a code. Her Amendments 139, 140, 144 and 145 seek to address those concerns: that the Secretary of State could enter into a private form of ping-pong with Ofcom, making an unlimited number of directions on a code to prevent it from ever coming before Parliament. Let me first be clear that we do not foresee that happening. As the amendments I have spoken to today show, the power can be used only when specific exceptional reasons apply. In that sense, we agree with the intent of the amendments tabled by my noble friend Lady Stowell. However, we cannot accept them as drafted because they rely on concepts— such as the “objective” of a direction—which are not consistent with the procedure for making a direction set out in the Bill.
The amendments I have brought forward mean that private ping-pong between the Secretary of State and Ofcom on a code is very unlikely to happen. Let me set out for my noble friend and other noble Lords why that is. The Secretary of State would need exceptional reasons for making any direction, and the Bill then requires that the code be laid before Parliament as soon as is reasonably practicable once the Secretary of State is satisfied that no further modifications to the draft are required. That does not leave room for the power to be used inappropriately. A code could be delayed in this way and in the way that noble Lords have set out only if the Secretary of State could show that there remained exceptional reasons once a code had been modified. This test, which is a very high bar, would need to be met each time. Under the amendments in my name, Parliament would also be made aware straightaway each time a direction was made, and when the modified code came before Parliament, it would now come under greater scrutiny using the affirmative procedure.
I certainly agree with the points that the noble Lord, Lord Allan, and others made that any directions should be made in as transparent a way as possible, which is why we have tabled these amendments. There may be some circumstances where the Secretary of State has access to information—for example, from the security services—the disclosure of which would have an adverse effect on national security. In our amendments, we have sought to retain the existing provisions in the Bill to make sure that we strike the right balance between transparency and protecting national security.
As the noble Lord mentioned, the Freedom of Information Act provides an additional route to transparency while also containing existing safeguards in relation to national security and other important areas. He asked me to think of an example of something that would be exceptional but not require that level of secrecy. Having dropped economic policy and burden to business as grounds, I would have pointed him to examples in those areas, but a concrete example evades me this afternoon. Those are the areas to which I would turn his attention.
Can the Minister confirm that the fact that a direction has been made will always be known to the public, even if the substance of it is not because it is withheld under the secrecy provision? In other words, will the public always have a before and after knowledge of the fact of the direction, even if its substance is absent?
Yes; that is right.
I hope noble Lords will agree that the changes we have made and that I have outlined today as a package mean that we have reached the right balance in this area. I am very grateful to my noble friend Lady Stowell —who I see wants to come in—for the time that she too has given this issue, along with members of her committee.
My Lords, child sexual exploitation or abuse is an abhorrent crime. Reporting allows victims to be identified and offenders apprehended. It is vital that in-scope companies retain the data included in reports made to the National Crime Agency. This will enable effective prosecutions and ensure that children can be protected.
The amendments in my name in this group will enable the Secretary of State to include in the regulations about the reporting of child sexual exploitation or abuse content a requirement for providers to retain data. This requirement will be triggered only by a provider making a report of suspected child sexual exploitation or abuse to the National Crime Agency. The provider will need to retain the data included in the report, along with any associated account data. This is vital to enabling prosecutions and to ensuring that children can be protected, because data in reports cannot be used as evidence. Law enforcement agencies request this data only when they have determined that the content is in fact illegal and that it is necessary to progress investigations.
Details such as the types of data and the period of time for which providers must retain this data will be specified in regulations. This will ensure that the requirement is future-proofed against new types of data and will prevent companies retaining types of data that may have become obsolete. The amendments will also enable regulations to include any necessary safeguards in relation to data protection. However, providers will be expected to store, process and share this personal data within the UK GDPR framework.
Regulations about child sexual exploitation or abuse reporting will undergo a robust consultation with relevant parties and will be subject to parliamentary scrutiny. This process will ensure that the regulations about retaining data will be well-informed, effective and fit for purpose. These amendments bring the child sexual exploitation and abuse reporting requirements into line with international standards. I beg to move.
My Lords, these seem very sensible amendments. I am curious about why they have arrived only at this stage, given this was a known problem and that the Bill has been drafted over a long period. I am genuinely curious as to why this issue has been raised only now.
On the substance of the amendments, it seems entirely sensible that, given that we are now going to have 20,000 to 25,000 regulated entities in scope, some of which will never have encountered child sexual exploitation or abuse material or understood that they have a legal duty in relation to it, it will be helpful for them to have a clear set of regulations that tell them how to treat their material.
Child sexual exploitation or abuse material is toxic in both a moral and a legal sense. It needs to be treated almost literally as toxic material inside a company, and sometimes that is not well understood. People feel that they can forward material to someone else, not understanding that in doing so they will break the law. I have had experiences where well-meaning people acting in a vigilante capacity sent material to me, and at that point you have to report them to the police. There are no ifs or buts. They have committed an offence in doing so. As somebody who works inside a company, your computer has to be quarantined and taken off and cleaned, just as it would be for any other toxic material, because we framed the law, quite correctly, to say that we do not want to offer people the defence of saying “I was forwarding this material because I’m a good guy”. Forwarding the material is a strict liability offence, so to have regulations that explain, particularly to organisations that have never dealt with this material, exactly how they have to deal with it in order to be legally compliant will be extremely helpful.
One thing I want to flag is that there are going to be some really fundamental cross-border issues that have to be addressed. In many instances of child sexual exploitation or abuse material, the material has been shared between people in different jurisdictions. The provider may not be in a UK jurisdiction, and we have got to avoid any conflicts of laws. I am sure the Government are thinking about this, but in drafting those regulations, what we cannot do, for example, is order a provider to retain data in a way that would be illegal in the jurisdiction from which it originates or in which it has its headquarters. The same would apply vice versa. We would not expect a foreign Government to order a UK company to act in a way that was against UK law in dealing with child sexual exploitation or abuse material. This all has to be worked out. I hope the Government are conscious of that.
I think the public interest is best served if the United Kingdom, the United States and the European Union, in particular, adopt common standards around this. I do not think there is anything between us in terms of how we would want to approach child sexual exploitation or abuse material, so the extent to which we end up having common legal standards will be extraordinarily helpful.
As a general matter, to have regulations that help companies with their compliance is going to be very helpful. I am curious as to how we have got there with the amendment only at this very late stage.
I just realised I forgot to thank the Government for Amendment 271, which reflected something I raised in Committee. I will reflect back to the Minister that, as is reinforced by his response now, it goes precisely where I wanted it to. That is to make sure—I have raised this many times—that we are not implementing another cookie banner, but are implementing something and then going back to say, “Did it work as we intended? Were the costs proportionate to what we achieved?” I want to put on the record that I appreciate Amendment 271.
I appreciate the noble Lord’s interjection and, indeed, his engagement on this issue, which has informed the amendments that we have tabled.
In relation to the amendment of the noble Baroness, Lady Fox, as I set out, there are already robust safeguards for user privacy in the Bill. I have already mentioned Amendment 124, which puts age-assurance principles in the Bill. These require Ofcom to have regard, when producing its codes of practice on the use of age assurance, to the principle of protecting the privacy of users, including data protection. We think that the noble Baroness’s amendment is also unnecessary. I hope that she and the noble Baroness, Lady Kidron, will be willing to not move their amendments and to support the government amendments in the group.
As I set out, I think my noble friend and the noble Baroness, Lady Fox, are not right to point to the European Convention on Human Rights here. That concerns individuals’ and entities’ rights
“to receive and impart ideas without undue interference”
by public authorities, not private entities. We do not see how a service provider deciding not to allow certain types of content on its platform would engage the Article 10 rights of the user, but I would be very happy to discuss this further with my noble friend and the noble Baroness in case we are talking at cross-purposes.
On that point specifically, having worked inside one of the companies, they fear legal action under all sorts of laws, but not under the European Convention on Human Rights. As the Minister explained, it is for public bodies; if people are going to take a case on Article 10 grounds, they will be taking it against a public body. There are lots of other grounds to go after a private company but not ECHR compliance.
My Lords, I am grateful for the broad, if not universal, support for the amendments that we have brought forward following the points raised in Committee. I apologise for anticipating noble Lords’ arguments, but I am happy to expand on my remarks in light of what they have said.
My noble friend Lord Moylan raised the question of non-verified user duties and crowdsourced platforms. The Government recognise concerns about how the non-verified user duties will work with different functionalities and platforms, and we have engaged extensively on this issue. These duties are applicable only to category 1 platforms, those with the largest reach and influence over public discourse. It is therefore right that such platforms have additional duties to empower their adult users. We anticipate that these features will be used in circumstances where vulnerable adults wish to shield themselves from anonymous abuse. If users decide that these features are restricting their experience on a particular platform, they can simply choose not to use them. In addition, before these duties come into force, Ofcom will be required to consult affected providers regarding the codes of practice, at which point it will consider how these duties might interact with various functionalities.
My noble friend and the noble Lord, Lord Allan of Hallam, raised the potential for being bombarded with pop-ups because of the forced-choice approach that we have taken. These amendments have been carefully drafted to minimise unnecessary prompts or pop-ups. That is why we have specified that the requirement to proactively ask users how they want these tools to be applied is applicable only to registered users. This approach ensures that users will be prompted to make a decision only once, unless they choose to ignore it. After a decision has been made, the provider should save this preference and the user should not be prompted to make the choice again.
The noble Lord, Lord Clement-Jones, talked further about his amendments on the cost of user empowerment tools as a core safety duty in the Bill. Category 1 providers will not be able to put the user empowerment tools in Clause 12 behind a pay wall and still be compliant with their duties. That is because they will need to offer them to users at the first possible opportunity, which they will be unable to do if they are behind a pay wall. The wording of Clause 12(2) makes it clear that providers have a duty to include user empowerment features that an adult user may use or apply.
The Minister may not have the information today, but I would be happy to get it in writing. Can he clarify exactly what will be expected of a service that already prohibits all the Clause 12 bad stuff in their terms of service?
I will happily write to the noble Lord on that.
Clause 12(4) further sets out that all such user empowerment content tools must be made available to all adult users and be easy to access.
The noble Lord, Lord Clement-Jones, on behalf of the noble Baroness, Lady Finlay, talked about people who will seek out suicide, self-harm or eating-disorder content. While the Bill will not prevent adults from seeking out legal content, it will introduce significant protections for adults from some of the most harmful content. The duties relating to category 1 services’ terms of service are expected hugely to improve companies’ own policing of their sites. Where this content is legal but in breach of the company’s terms of service, the Bill will force the company to take it down.
We are going even further by introducing a new user empowerment content-assessment duty. This will mean that where content relates to eating disorders, for instance, but is not illegal, category 1 providers need fully to assess the incidence of this content on their service. They will need clearly to publish this information in accessible terms of service, so users will be able to find out what they can expect on a particular service. Alternatively, if they choose to allow suicide, self-harm or eating-disorder content which falls into the definition set out in Clause 12, they will need proactively to ask users how they would like the user empowerment content features to be applied.
My noble friend Lady Morgan was right to raise the impact on vulnerable people or people with disabilities. While we anticipate that the changes we have made will benefit all adult users, we expect them particularly to benefit those who may otherwise have found it difficult to find and use the user empowerment content features independently—for instance, some users with types of disabilities. That is because the onus will now be on category 1 providers proactively to ask their registered adult users whether they would like these tools to be applied at the first possible opportunity. The requirement also remains to ensure that the tools are easy to access and to set out clearly what tools are on offer and how users can take advantage of them.
I am grateful to noble Lords for their contributions during this debate. I am sympathetic to arguments that we must avoid imposing disproportionate burdens on regulated services, and particularly that the Bill should not inhibit services from providing valuable information which is of benefit to the public. However, I want to be clear that that is why the Bill has been designed in the way that it has. It has a broad scope in order to capture a range of services, but it has exemptions and categorisations built into it. The alternative would be a narrow scope, which would be more likely inadvertently to exempt risky sites or to displace harm on to services which we would find are out of scope of the Bill. I will disappoint noble Lords by saying that I cannot accept their amendments in this group but will seek to address the concerns that they have raised through them.
The noble Lord, Lord Allan, asked me helpfully at the outset three questions, to which the answers are yes, no and maybe. Yes, Wikipedia and OpenStreetMap will be in scope of the Bill because they allow users to interact online; no, we do not believe that they would fall under any of the current exemptions in the Bill; and the maybe is that Ofcom does not have the discretion to exempt services but the Secretary of State can create additional exemptions for further categories of services if she sees fit.
I must also say maybe to my noble friend Lord Moylan on his point about Wikipedia—and with good reason. Wikipedia, as I have just explained, is in scope of the Bill and is not subject to any of its exemptions. I cannot say how it will be categorised, because that is based on an assessment made by the independent regulator, but I reassure my noble friend that it is not the regulator but the Secretary of State who will set the categorisation thresholds through secondary legislation; that is to say, a member of the democratically elected Government, accountable to Parliament, through legislation laid before that Parliament. It will then be for Ofcom to designate services based on whether or not they meet those thresholds.
It would be wrong—indeed, nigh on impossible—for me to second-guess that designation process from the Dispatch Box. In many cases it is inherently a complex and nuanced matter since, as my noble friend Lady Harding said, many services change over time. We want to keep the Bill’s provisions flexible as services change what they do and new services are invented.
I would just like to finish my thought on Wikipedia. Noble Lords are right to mention it and to highlight the great work that it does. My honourable friend the Minister for Technology and the Digital Economy, Paul Scully, met Wikipedia yesterday to discuss its concerns about the Bill. He explained that the requirements for platforms in this legislation will be proportionate to the risk of harm, and that as such we do not expect the requirements for Wikipedia to be unduly burdensome.
I am computing the various pieces of information that have just been given, and I hope the Minister can clarify whether I have understood them correctly. These services will be in scope as user-to-user services and do not have an exemption, as he said. The Secretary of State will write a piece of secondary legislation that will say, “This will make you a category 1 service”—or a category 2 or 2B service—but, within that, there could be text that has the effect that Wikipedia is in none of those categories. So it and services like it could be entirely exempt from the framework by virtue of that secondary legislation. Is that a correct interpretation of what he said?
The Secretary of State could create further exemptions but would have to bring those before Parliament for it to scrutinise. That is why there is a “maybe” in answer to his third question in relation to any service. It is important for the legislation to be future-proofed that the Secretary of State has the power to bring further categorisations before Parliament for it to discuss and scrutinise.
My Lords, I will keep pressing this point because it is quite important, particularly in the context of the point made by the noble Baroness, Lady Kidron, about categorisation, which we will debate later. There is a big difference when it comes to Schedule 11, which defines the categorisation scheme: whether in the normal run of business we might create an exemption in the categorisation secondary legislation, or whether it would be the Secretary of State coming back with one of those exceptional powers that the Minister knows we do not like. He could almost be making a case for why the Secretary of State has to have these exceptional powers. We would be much less comfortable with that than if the Schedule 11 categorisation piece effectively allowed another class to be created, rather than it being an exceptional Secretary of State power.
I will check what I said but I hope that I have set out why we have taken the approach that we have with the broad scope and the exemptions and categorisations that are contained in it. With that, I urge the noble Lord to withdraw his amendment.
My Lords, that was a very useful debate. I appreciate the Minister’s response and his “yes, no, maybe” succinctness, but I think he has left us all more worried than when the debate started. My noble friend Lord Clement-Jones tied it together nicely. What we want is for the regulator to be focused on the greatest areas of citizen risk. If there are risks that are missing, or things that we will be asking the regulator to do that are a complete waste of time because they are low risk, then we have a problem. We highlighted both those areas. The noble Lord, Lord Russell, rightly highlighted that we are not content with just “content” as the primary focus of the legislation; it is about a lot more than content. In my amendment and those by the noble Lord, Lord Moylan, we are extremely worried—and remain so—that the Bill creates a framework that will trap Wikipedia and services like it, without that being our primary intention. We certainly will come back to this in later groups; I will not seek to press the amendment now, because there is a lot we all need to digest. However, at the end of this process, we want to get to point where the regulator is focused on things that are high risk to the citizen and not wasting time on services that are very low risk. With that, I beg leave to withdraw my amendment.
(1 year, 5 months ago)
Lords Chamber

I appreciate the Minister’s response. Could he also respond to my suggestion that it would be helpful for some of the people working on the front line to meet officials to go through their concerns in more detail?
I am very happy to make that commitment. It would be useful to have their continued engagement, as we have had throughout the drafting of the Bill.
The noble Baroness, Lady Burt of Solihull, has tabled a number of amendments related to the new offence of cyberflashing. I will start with her Amendment 6. We believe that this amendment reduces the threshold of the new offence to too great an extent. It could, for example, criminalise a person sending a picture of naked performance art to a group of people, where one person might be alarmed by the image but the sender sends it anyway because he or she believes that it would be well received. That may be incorrect, unwise and insensitive, but we do not think it should carry the risk of being convicted of a serious sexual offence.
Crucially, the noble Baroness’s amendment requires that the harm against the victim be proven in court. Not only does this add an extra step for the prosecution to prove in order for the perpetrator to be convicted, it creates an undue burden on the victim, who would be cross-examined about his or her—usually her—experience of harm. For example, she might have to explain why she felt humiliated; this in itself could be retraumatising and humiliating for the victim. By contrast, Clause 170 as drafted means that the prosecution has only to prove and focus on the perpetrator’s intent.
My Lords, I am grateful for the opportunity to continue some of the themes we touched on in the last group and the debate we have had throughout the passage of the Bill on the importance of tackling intimate image abuse. I shall introduce the government amendments in this group that will make a real difference to victims of this abhorrent behaviour.
Before starting, I take the opportunity again to thank the Law Commission for the work it has done in its review of the criminal law relating to the non-consensual taking, making and sharing of intimate images. I also thank my right honourable friend Dame Maria Miller, who has long campaigned for and championed the victims of online abuse. Her sterling efforts have contributed greatly to the Government’s approach and to the formulation of policy in this sensitive area, as well as to the reform of criminal law.
As we announced last November, we intend to bring forward a more expansive package of measures based on the Law Commission’s recommendations as soon as parliamentary time allows, but the Government agree with the need to take swift action. That is why we are bringing forward these amendments now, to deliver on the recommendations which fall within the scope of the Bill, thereby ensuring justice for victims sooner.
These amendments repeal the offence of disclosing private sexual photographs and films with intent to cause distress and replace it with four new sexual offences in the Sexual Offences Act 2003. The first is a base offence of sharing an intimate photograph or film without consent or reasonable belief in consent. This recognises that the sharing of such images, whatever the intent of the perpetrator, should be considered a criminal violation of the victim’s bodily autonomy.
The amendments create two more serious offences of sharing an intimate photograph or film without consent with intent to cause alarm, distress or humiliation, or for the purpose of obtaining sexual gratification. Offenders committing the latter offence may also be subject to notification requirements, commonly referred to as being on the sex offenders register. The amendments also create an offence of threatening to share an intimate image. These new sharing offences are based on the Law Commission’s recommended approach, broadening the definition of intimate photographs or films to include images which show or appear to show a person nude or partially nude, or which depict sexual or toileting activity. This will protect more victims than the current Section 33 offence, which protects only images of a private and sexual nature.
Finally, these clauses will, for the first time, make it a criminal offence to share a manufactured or so-called deepfake image of another person without his or her consent. This form of intimate image abuse is becoming more prevalent, and we want to send a clear message that it will not be tolerated.
By virtue of placing these offences in the Sexual Offences Act 2003, we are also extending to them the current special measures, so that victims can benefit from those in court, and from the anonymity provisions which are so important when something so intimate has been shared without consent. This is only the first stage in our reform of the law in this area. We are committed to introducing additional changes, giving effect to further recommendations of the Law Commission’s report which are beyond the scope of the Bill, when parliamentary time allows.
I hope that noble Lords from across your Lordships’ House will agree that these amendments represent an important step forward in tackling intimate image abuse and protecting victims. I commend them to the House, and I beg to move.
My Lords, I welcome these new offences. From my professional experience, I know that what came to be known as “sextortion” created some of the most distressing cases you could experience, where an individual would obtain intimate images, often by deception, and then use them to make threats. This is where a social network is particularly challenging; it enables people to access a network of all the family and friends of an individual whose photo they now hold and to threaten to distribute it to their nearest and dearest. This affects men and women; many of the victims were men who were honey-potted into sharing intimate images and in the worst cases it led to suicide. It was not uncommon that people would feel that there was no way out; the threat was so severe that they would take their own lives. It is extremely welcome that we are doing something about it, and making it more obvious to anyone who is thinking about committing this kind of offence that they run the risk of criminal prosecution.
I have a few specific questions. The first is on the definitions in proposed new Section 66D, inserted by government Amendment 8, where the Government are trying to define what “intimate” or “nudity” represents. This takes me back again to my professional experience of going through slide decks and trying to decide what was on the right or wrong side of a nudity policy line. I will not go into the detail of everything it said, not least because I keep noticing younger people in the audience here, but I will leave you with the thought that you ended up looking at images that involved typically fishnets, in the case of women, and socks, in the case of men—I will leave the rest to your Lordships’ imaginations to determine at what point someone has gone from being clothed to nude. I can see in this amendment that the courts are going to have to deal with the same issues.
The serious point is that, where there is alignment between platform policies, definitions and what we do not want to be distributed, that is extremely helpful, because it then means that if someone does try to put an intimate image out across one of the major platforms, the platform does not have to ask whether there was consent. They can just say that it is in breach of their policy and take it down. It actually has quite a beneficial effect on slowing transmission.
The other point that comes out of that is that some of these questions of intimacy are quite culturally subjective. In some cultures, even a swimsuit photo could be used to cause humiliation and distress. I know this is extremely difficult; we do not want to be overly censorious but, at the same time, we do not want to leave people exposed to threats, and if you come from a culture where a swimsuit photo would be a threat, the definitions may not work for you. So I hope that, as we go through this, there will be a continued dialogue between experts in the platforms who have to deal with these questions and people working on the criminal offence side. To the extent that we can achieve it, there should be alignment and the message should go out that if you are thinking of distributing an image like this, you run the risk of being censored by the platforms but also of running into a criminal prosecution. That is on the mechanics of making it work.
(1 year, 6 months ago)
Lords Chamber

We are very aware that we are not the only jurisdiction looking at the important issues the Bill addresses. The Government and, I am sure, academic researchers will observe the implementation of the European Union’s Digital Services Act with interest, including the provisions about researchers’ access. We will carefully consider any implications of our own online safety regime. As noble Lords know, the Secretary of State will be required to undertake a review of the framework between two and five years after the Bill comes into force. We expect that to include an assessment of how the Bill’s existing transparency provisions facilitate researcher access.
I do not expect the Minister to have an answer to this today, but it will be useful to get this on the record as it is quite important. Can he let us know the Government’s thinking on the other piece of the equation? We are getting the platforms to disclose the data, and an important regulatory element is the research organisations that receive it. In the EU, that is being addressed with a code of conduct, which is a mechanism enabled by the general data protection regulation that has been approved by the European Data Protection Board and creates this legal framework. I am not aware of equivalent work having been done in the UK, but that is an essential element. We do not want to find that we have the teeth to persuade the companies to disclose the data, but not the other piece we need—probably overseen by the Information Commissioner’s Office rather than Ofcom—which is a mechanism for approving researchers to receive and then use the data.
We are watching with interest what is happening in other jurisdictions. If I can furnish the Committee with any information in the area the noble Lord mentions, I will certainly follow up in writing.
(1 year, 6 months ago)
Lords Chamber

I am reminded by my noble friend Lord Foster of Bath, particularly relating to the gambling sector, that some of these issues may run across various regulators that are all seeking business disruption measures. He reminded me that if you type into a search engine, which would be regulated and subject to business disruption measures here, “Casinos not regulated by GAMSTOP”, you will get a bunch of people who are evading GAMSTOP’s regulation. Noble Lords can imagine similar for financial services—something that I know the noble Baroness, Lady Morgan of Cotes, is also very interested in. It may not be for answer now, but I would be interested to understand what thinking the Government have on how all the different business disruption regimes—financial, gambling, Ofcom-regulated search services, et cetera—will mesh together. They could all come before the courts under slightly different legal regimes.
When I saw the noble Lord, Lord Foster of Bath, and the noble Baroness, Lady Armstrong of Hill Top, in their places, I wondered whether they were intending to raise these points. I will certainly take on board what the noble Lord says and, if there is further information I can furnish your Lordships with, I certainly will.
The noble Baroness, Lady Kidron, asked whether the powers can be used on out-of-scope services. “No” is the direct answer to her direct question. The powers can be used only in relation to regulated services, but if sites not regulated by the Bill are publishing illegal content, existing law enforcement powers—such as those frequently deployed in cases of copyright infringement—can be used. I could set out a bit more in writing if that would be helpful.
My noble friend Lord Bethell’s amendments seek to set out in the Bill that Ofcom will be able to make a single application to the courts for an order enabling business disruption measures that apply against multiple platforms and operators. I must repeat, as he anticipated, the point made by my right honourable friend Chris Philp that the civil procedure rules allow for a multi-party claim to be made. These rules permit any number of claimants or defendants and any number of claims to be covered by one claim form. The overriding objective of the civil procedure rules is that cases are dealt with justly and proportionately. I want to reassure my noble friend that the Government are confident that the civil procedure rules will provide the necessary flexibility to ensure that services can be blocked or restricted.
The amendment in the name of the noble Lord, Lord Allan of Hallam, seeks to clarify what services might be subject to access restriction orders by removing the two examples provided in the Bill: internet access services and application stores. I would like to reassure him that these are simply indicative examples, highlighting two kinds of service on which access restriction requirements may be imposed. It is not an exhaustive list. Orders could be imposed on any services that meet the definition—that is, any person who provides a facility which they are able to withdraw, adapt or manipulate in such a way as to impede access to the regulated service in question. This provides Ofcom with the flexibility to identify where business disruption measures should be targeted, and it future-proofs the Bill by ensuring that the power remains functional and effective as technologies develop.
As the noble Lord highlighted, these are significant powers that can require that services be blocked in the UK. Clearly, limiting access to services in this way substantially affects the business interests of the service in question and the interests of the relevant third-party service, and it could affect users’ freedom of expression. It is therefore essential that appropriate safeguards are included and that due process is followed. That is why Ofcom will be required to seek a court order to be able to use these powers, ensuring that the courts have proper oversight.
To ensure that due process is upheld, an application by the regulator for a court order will have to specify the non-compliant provider, the grounds of the order and the steps that Ofcom considers should be imposed on the third parties in order to withdraw services and block users’ access. These requirements will ensure that the need to act quickly to tackle harm is appropriately balanced against upholding fundamental rights.
It might be useful to say a little about how blocking works—
Yes; he made a helpful point, and I will come back on it.
We share a common interest in understanding whether it would be used against VPNs, but we may not necessarily have the same view about whether it should be. Do not take that as an encouragement—take it as a request for information.
(1 year, 6 months ago)
Lords Chamber

Will the review also cover an understanding of what has been happening in criminal cases where, in some of the examples that have been described, people have tried to take online activity to court? We will at that point understand whether the judges believe that existing offences cover some of these novel forms of activity. I hope the review will also extend not just to what Ofcom does as a regulator but to understanding what the courts are doing in terms of the definitions of criminal activity and whether they are being effective in the new online spaces.
I believe it will. Certainly, both government and Parliament will take into account judgments in the court on this Bill and in related areas of law, and will, I am sure, want to respond.
(1 year, 7 months ago)
Lords Chamber

I think it is actually quite important that there is—to use the language of the Bill—a risk assessment around the notion that people might game it. I thought the noble Baroness, Lady Gohir, made a very good point. People are very inventive and, if you have ever engaged with the people who run some of those big US misinformation sites—let us just call them that—you will know that they have very inventive, very clever people. They will be looking at this legislation and if they figure out that by opening a UK office and ticking all the boxes they will now get some sorts of privileges in terms of distributing their misinformation around the world, they will do it. They will try it, so I certainly think it is worth there being at least some kind of risk assessment against that happening.
In two years’ time we will be able to see whether the bad thing happened but, even if it is just the Minister having a conversation with Ofcom now, I just think that forewarned is forearmed. We know that that is a possibility, and it would be helpful for some work to be done now to make sure that this does not become a loophole, which I think none of us wants.
I am mindful of the examples the noble Lord gave in his speech. Looking at some of the provisions set out in subsection (2) about a body being
“subject to a standards code”
or having
“policies and procedures for handling and resolving complaints”,
I think on first response that those examples he gave would be covered. But I will certainly take on board the comments he made and those the noble Baroness, Lady Gohir, made as well and reflect on them. I hope—
(1 year, 7 months ago)
Lords Chamber

My Lords, like everyone who spoke, I and the Government recognise the tragic consequences of suicide and self-harm, and how so many lives and families have been devastated by it. I am grateful to the noble Baroness and all noble Lords, as well as the bereaved families who campaigned so bravely and for so long to spare others that heartache and to create a safer online environment for everyone. I am grateful to the noble Baroness, Lady Finlay of Llandaff, who raised these issues in her Private Member’s Bill, on which we had exchanges. My noble friend Lady Morgan is right to raise the case of Frankie Thomas and her parents, and to call that to mind as we debate these issues.
Amendments 96 and 296, tabled by the noble Baroness, Lady Finlay, would, in effect, reintroduce the former adult safety duties whereby category 1 companies were required to assess the risk of harm associated with legal content accessed by adults, and to set and enforce terms of service in relation to it. As noble Lords will know, those duties were removed in another place after extensive consideration. Those provisions risked creating incentives for the excessive removal of legal content, which would unduly interfere with adults’ free expression.
However, the new transparency, accountability and freedom of expression duties in Part 4, combined with the illegal and child safety duties in Part 3, will provide a robust approach that will hold companies to account for the way they deal with this content. Under the Part 4 duties, category 1 services will need to have appropriate systems and processes in place to deal with content or activity that is banned or restricted by their terms of service.
Many platforms—such as Twitter, Facebook and TikTok, which the noble Baroness raised—say in their terms of service that they restrict suicide and self-harm content, but they do not always enforce these policies effectively. The Bill will require category 1 companies—the largest platforms—fully to enforce their terms of service for this content, which will be a significant improvement for users’ safety. Where companies allow this content, the user-empowerment duties will give adults tools to limit their exposure to it, if they wish to do so.
The noble Baroness is right to raise the issue of algorithms. As the noble Lord, Lord Stevenson, said, amplification lies at the heart of many cases. The Bill will require providers specifically to consider as part of their risk assessments how algorithms could affect children’s and adults’ exposure to illegal content, and content that is harmful to children, on their services. Providers will need to take steps to mitigate and effectively manage any risks, and to consider the design of functionalities, algorithms and other features to meet the illegal content and child safety duties in the Bill.
Following our earlier discussion, in which we touched on super-complaints, I am curious to understand: if there were a pattern of complaints—such as those the noble Baroness, Lady Kidron, and others received—about a platform saying, under its terms of service, that it would remove suicide and self-harm content but failing to do so, does the Minister think that is precisely the kind of thing that could be substantive material for an organisation to bring as a super-complaint to Ofcom?
My initial response is, yes, I think so, but it is the role of Ofcom to look at whether those terms of service are enforced and to act on behalf of internet users. The noble Lord is right to point to the complexity of some marginal cases with which companies have to deal, but the whole framework of the Bill is to make sure that terms of service are being enforced. If they are not, people can turn to Ofcom.
I will plant a flag in reference to the new offences, which I know we will come back to again. It is always helpful to look at real-world examples. There is a lot of meme-based self-harm content. Two examples are the Tide Pods challenge—the eating of detergent capsules—and choking games, both of which have been very common and widespread. It would be helpful, ahead of our debate on the new offences, to understand whether they are below or above the threshold of serious self-harm and what the Government’s intention is. There are arguments both ways: obviously, criminalising children for being foolish carries certain consequences, but we also want to stop the spread of the content. So, when we come to that offence, it would be helpful if the Minister could use specific examples, such as the meme-based self-harm content, which is quite common.
I thank the noble Lord for the advance notice to think about that; it is helpful. It is difficult to talk in general terms about this issue, so, if I can, I will give examples that do, and do not, meet the threshold.
The Bill goes even further for children than it does for adults. In addition to the protections from illegal material, the Government have indicated, as I said, that we plan to designate content promoting suicide, self-harm or eating disorders as categories of primary priority content. That means that providers will need to put in place systems designed to prevent children of any age encountering this type of content. Providers will also need specifically to assess the risk of children encountering it. Platforms will no longer be able to recommend such material to children through harmful algorithms. If they do, Ofcom will hold them accountable and will take enforcement action if they break their promises.
It is right that the Bill takes a different approach for children than for adults, but it does not mean that the Bill does not recognise that young adults are at risk or that it does not have protections for them. My noble friend Lady Morgan was right to raise the issue of young adults once they turn 18. The triple shield of protection in the Bill will significantly improve the status quo by protecting adults, including young adults, from illegal suicide content and legal suicide or self-harm content that is prohibited in major platforms’ terms and conditions. Platforms also have strong commercial incentives, as we discussed in previous groups, to address harmful content that the majority of their users do not want to see, such as legal suicide, eating disorder or self-harm content. That is why they currently claim to prohibit it in their terms and conditions, and why we want to make sure that those terms and conditions are transparently and accountably enforced. So, while I sympathise with the intention from the noble Baroness, Lady Finlay, her amendments raise some wider concerns about mandating how providers should deal with legal material, which would interfere with the careful balance the Bill seeks to strike in ensuring that users are safer online without compromising their right to free expression.
The noble Baroness’s Amendment 240, alongside Amendment 225 in the name of the noble Lord, Lord Stevenson, would place new duties on Ofcom in relation to suicide and self-harm content. The Bill already has provisions to provide Ofcom with broad and effective information-gathering powers to understand how this content affects users and how providers are dealing with it. For example, under Clause 147, Ofcom can already publish reports about suicide and self-harm content, and Clauses 68 and 69 empower Ofcom to require the largest providers to publish annual transparency reports.
Ofcom may require those reports to include information on the systems and processes that providers use to deal with illegal suicide or self-harm content, with content that is harmful to children, or with content which providers’ own terms of service prohibit. Those measures sit alongside Ofcom’s extensive information-gathering powers. It will have the ability to access the information it needs to understand how companies are fulfilling their duties, particularly in taking action against this type of content. Furthermore, the Bill is designed to provide Ofcom with the flexibility it needs to respond to harms—including in the areas of suicide, self-harm and eating disorders—as they develop over time, in the way that the noble Baroness envisaged in her remarks about the metaverse and new emerging threats. So we are confident that these provisions will enable Ofcom to assess this type of content and ensure that platforms deal with it appropriately. I hope that this has provided sufficient reassurance for the noble Baroness not to move her amendment.
(1 year, 7 months ago)
Lords Chamber

I thank the noble Lord.
I was pleased to hear about Wicipedia Cymraeg—there being no “k” in Welsh. As the noble Lord, Lord Stevenson, said, there has been a very good conversational discussion in this debate, as befits Committee and a self-regulating House. My noble friend Lady Stowell is right to point out matters of procedure, although we were grateful to know why the noble Viscount, Lord Colville, supports the amendments in question.
Again, I think that that is clear. I understood from the Bill that, if an American says something that would be illegal were they to be in the United Kingdom, we would still want to exclude that content. But that still leaves it open, and I just ask the question again, for confirmation. If all of the activities are outside the United Kingdom—Americans talking to each other, as it were—and a British person objects, at what point would the platform be required to restrict the content of the Americans talking to each other? Is it pre-emptively or only as and when somebody in the United Kingdom objects to it? We should flesh out that kind of practical detail before this becomes law.
If it has been committed in the UK and is viewed by a UK user, it can be treated as illegal. I will follow up on the noble Lord’s further points ahead of the next stage.
Amendment 272 explicitly provides that relevant information that is reasonably available to a provider includes information submitted by users in complaints. Providers will already need to do this when making judgments about content, as it will be both relevant and reasonably available.
My noble friend Lord Moylan returned to the question that arose on day 2 in Committee, querying the distinction between “protect” and “prevent”, and suggesting that a duty to protect would or could lead to the excessive removal of content. To be clear, the duty requires platforms to put in place proportionate systems and processes designed to prevent users encountering content. I draw my noble friend’s attention to the focus on systems and processes in that. This requires platforms to design their services to achieve the outcome of preventing users encountering such content. That could include upstream design measures, as well as content identification measures, once content appears on a service. By contrast, a duty to protect is a less stringent duty and would undermine the proactive nature of the illegal content duties for priority offences.
(1 year, 7 months ago)
Lords Chamber

My Lords, I must first apologise for my slightly dishevelled appearance as I managed to spill coffee down my shirt on my way to the Chamber. I apologise for that—as the fumes from the dried coffee suffuse the air around me. It will certainly keep me caffeinated for the day ahead.
Search services play a critical role in users’ online experience, allowing them easily to find and access a broad range of information online. Their gateway function, as we have discussed previously, means that they also play an important role in keeping users safe online because they have significant influence over the content people encounter. The Bill therefore imposes stringent requirements on search services to tackle the risks from illegal content and to protect children.
Amendments 13, 15, 66 to 69 and 73 tabled by my noble friend Lord Moylan seek to narrow the scope of the Bill so that its search safety duties apply only to the largest search services—categorised in the Bill as category 2A services—rather than to all search services. Narrowing the scope in this way would have an adverse impact on the safety of people using search services, including children. Search services, including combined services, below the category 2A threshold would no longer have a duty to minimise the risk of users encountering illegal content or children encountering harmful content in or via search results. This would increase the likelihood of users, including children, accessing illegal content and children accessing harmful content through these services.
The Bill already takes a targeted approach and the duties on search services will be proportionate to the risk of harm and the capacity of companies. This means that services which are smaller and lower-risk will have a lighter regulatory burden than those which are larger and higher-risk. All search services will be required to conduct regular illegal content risk assessments and, where relevant, children’s risk assessments, and then implement proportionate mitigations to protect users, including children. Ofcom will set out in its codes of practice specific steps search services can take to ensure compliance and must ensure that these are proportionate to the size and capacity of the service.
The noble Baroness, Lady Kidron, and my noble friend Lady Harding of Winscombe asked how search services should conduct their risk assessments. Regulated search services will have a duty to conduct regular illegal content risk assessments, and where a service is likely to be accessed by children it will have a duty to conduct regular children’s risk assessments, as I say. They will be required to assess the level and nature of the risk of individuals encountering illegal content on their service, to implement proportionate mitigations to protect people from illegal content, and to monitor them for effectiveness. Services likely to be accessed by children will also be required to assess the nature and level of risk of their service specifically for children to identify and implement proportionate mitigations to keep children safe, and to monitor them for effectiveness as well.
Companies will also need to assess how the design and operation of the service may increase or reduce the risks identified and Ofcom will have a duty to issue guidance to assist providers in carrying out their risk assessments. That will ensure that providers have, for instance, sufficient clarity about what an appropriate risk assessment looks like for their type of service.
The noble Lord, Lord Allan, and others asked about definitions and I congratulate noble Lords on avoiding the obvious
“To be, or not to be”
pun in the debate we have just had. The noble Lord, Lord Allan, is right in the definition he set out. On the rationale for it, it is simply that we have designated as category 1 the largest and riskiest services and as category 2 the smaller and less risky ones, splitting them between 2A, search services, and 2B, user-to-user services. We think that is a clear framework. The definitions are set out a bit more in the Explanatory Notes but that is the rationale.
I am grateful to the Minister for that clarification. I take it then that the Government’s working assumption is that all search services, including the biggest ones, are by definition less risky than the larger user-to-user services. It is just a clarification that that is their thinking that has informed this.
As I said, the largest and riskiest services may include some which have search functions, so the test of being largest and most risky applies. Smaller and less risky search services are captured in category 2A.
Amendment 157 in the name of my noble friend Lord Pickles, and spoken to by the noble Baroness, Lady Deech, seeks to apply new duties on the largest search services. I agree with the objectives in my noble friend’s amendment of increasing transparency about the search services’ operations and enabling users to hold them to account. It is not, however, an amendment I can accept because it would duplicate existing duties while imposing new duties which we do not think are appropriate for search services.
As I say, the Bill will already require search services to set out how they are fulfilling their illegal content and child safety duties in publicly available statements. The largest search services—category 2A—will also be obliged to publish a summary of their risk assessments and to share this with Ofcom. That will ensure that users know what to expect on those search services. In addition, they will be subject to the Bill’s requirements relating to user reporting and redress. These will ensure that search services put in place effective and accessible mechanisms for users to report illegal content and content which is harmful to children.
My noble friend’s amendment would ensure that the requirements to comply with its publicly available statements applied to all actions taken by a search service to prevent harm, not just those relating to illegal content and child safety. This would be a significant expansion of the duties, resulting in Ofcom overseeing how search services treat legal content which is accessed by adults. That runs counter to the Government’s stated desire to avoid labelling legal content which is accessed by adults as harmful. It is for adult users themselves to determine what legal content they consider harmful. It is not for us to put in place measures which could limit their access to legal content, however distasteful. That is not to say, of course, that where material becomes illegal in its nature we do not share the determination of the noble Baroness, my noble friend and others to make sure that it is properly tackled. The Secretary of State and Ministers have had extensive meetings with groups making representations on this point and I am very happy to continue speaking to my noble friend, the noble Baroness and others if they would welcome it.
I hope that that provides enough reassurance for the amendment to be withdrawn at this stage.
Yes, I think it is right. The Investigatory Powers Act is a tool for law enforcement and intelligence agencies, whereas the Bill is designed to regulate technology companies—an important high-level distinction. As such, the Bill does not grant investigatory powers to state bodies. It does not allow the Government or the regulator to access private messages. Instead, it requires companies to implement proportionate systems and processes to tackle illegal content on their platforms. I will come on to say a little about legal redress and the role of the courts in looking at Ofcom’s decisions so, if I may, I will respond to that in a moment.
The Investigatory Powers Act includes a different form of technical notice, which is to put in place surveillance equipment. The noble Lord, Lord Stevenson, has a good point: we need to ensure that we do not have two regimes, both requiring companies to put in place technical equipment but with quite different standards applying.
I will certainly take that point away and I understand, of course, that different Acts require different duties of the same platforms. I will take that away and discuss it with colleagues in other departments who lead on investigatory powers.
I am about to talk about the safeguards for journalists in the context of the Bill and the questions posed by the noble Baroness, Lady Bennett. However, I take my noble friend’s point about the implications of other Acts that are already on the statute book in that context as well.
Just to finish the train of thought of what I was saying on Amendment 202, making a reference to encryption, as it suggests, would be out of step with the wider approach of the Bill, which is to remain technology-neutral.
I come to the safeguards for journalistic protections, as touched on by the noble Baroness, Lady Bennett. The Government are fully committed to protecting the integrity of journalistic sources, and there is no intention or expectation that the tools required to be used under this power would result in a compromising of those sources. Any tools required on private communications must be accredited by Ofcom as highly accurate only in detecting child sexual abuse and exploitation content. These minimum standards of accuracy will be approved and published by the Secretary of State, following advice from Ofcom. We therefore expect it to be very unlikely that journalistic content will be falsely detected by the tools being required.
Under Clause 59, companies are obliged to report child sexual abuse material which is detected on their service to the National Crime Agency; this echoes a point made by the noble Lord, Lord Allan, in an earlier contribution. That would include child sexual abuse and exploitation material identified through tools required by a notice and, even in this event, the appropriate protections in relation to journalistic sources would be applied by the National Crime Agency if it were necessary to identify individuals involved in sharing illegal material.
I want to flag that in the context of terrorist content, this is quite high risk for journalists. It is quite common for them, for example, to be circulating a horrific ISIS video not because they support ISIS but because it is part of a news article they are putting together. We should flag that terrorist content in particular is commonly distributed by journalists and it could be picked up by any system that is not sufficiently sophisticated.
I see that my noble friend Lord Murray of Blidworth has joined the Front Bench in anticipation of the lunch-break business for the Home Office. That gives me the opportunity to say that I will discuss some of these points with him, my noble friend Lord Sharpe of Epsom and others at the Home Office.
Amendment 246 aims to ensure that there is no requirement for a provider to comply with a notice until the High Court has determined the appeal. The Government have ensured that, in addition to judicial review through the High Court, there is an accessible and relatively affordable alternative means of appealing Ofcom’s decisions via the Upper Tribunal. We cannot accept amendments such as this, which could unacceptably delay Ofcom’s ability to issue a notice, because that would leave children vulnerable.
To ensure that Ofcom’s use of its powers under Clause 110, and the technology that underpins it, are transparent, Ofcom will produce an annual report about the exercise of its functions using these powers. This must be submitted to the Secretary of State and laid before Parliament. The report must also provide the details of technology that has been assessed as meeting minimum standards of accuracy, and Ofcom may also consider other factors, including the impact of technologies on privacy. That will be separate to Ofcom’s annual report to allow for full scrutiny of this power.
The legislation also places a statutory requirement on Ofcom to publish guidance before its functions with regard to Clause 110 come into force. This will be after Royal Assent, given that the legislation is subject to change until that point. Before producing the guidance, Ofcom must consult the Information Commissioner. As I said, there are already strong safeguards regarding Ofcom’s use of these powers, so we think that this additional oversight is unnecessary.
Amendments 203 and 204, tabled by the noble Lord, Lord Clement-Jones, seek to probe the privacy implications of Ofcom’s powers to require technology under Clause 110. I reiterate that the Bill will not ban or weaken any design, including end-to-end encryption. But, given the scale of child sexual abuse and exploitation taking place on private communications, it is important that Ofcom has effective powers to require companies to tackle this abhorrent activity. Data from the Office for National Statistics show that in nearly three-quarters of cases where children are contacted online by someone they do not know, this takes place by private message. This highlights the scale of the threat and the importance of technology providers taking steps to safeguard children in private spaces online.
As already set out, there are already strong safeguards regarding the use of this power, and these will prevent Ofcom from requiring the use of any technology that would undermine a platform’s security and put users’ privacy at risk. These safeguards will also ensure that platforms will not be required to conduct mass scanning of private communications by default.
Until the regime comes into force, it is of course not possible to say with certainty which tools would be accredited. However, some illustrative examples of the kinds of current tools we might expect to be used—providing that they are highly accurate and compatible with a service’s design—are machine learning or artificial intelligence, which assess content to determine whether it is illegal, and hashing technology, which works by assigning a unique number to an image that has been identified as illegal.
Given the particularly abhorrent nature of the crimes we are discussing, it is important that services giving rise to a risk of child sexual abuse and exploitation in the UK are covered, wherever they are based. The Bill, including Ofcom’s ability to issue notices in relation to this or to terrorism, will therefore have extraterritorial effect. The Bill will apply to any relevant service that is linked to the UK. A service is linked to the UK if it has a significant number of UK users, if UK users form a target market or if the service is capable of being used in the UK and there is a material risk of significant harm to individuals in the UK arising from the service. I hope that that reassures the noble Lord, on behalf of his noble friend, about why that amendment is not needed.
Amendments 209 to 214 seek to place additional requirements on Ofcom to consider the effect on user privacy when using its powers under Clause 110. I agree that tackling online harm needs to take place while protecting privacy and security online, which is why Ofcom already has to consider user privacy before issuing notices under Clause 110, among the other stringent safeguards I have set out. Amendment 202A would impose a duty on Ofcom to issue a notice under Clause 110, where it is satisfied that it is necessary and proportionate to do so—this will have involved ensuring that the safeguards have been met.
Ofcom will have access to a wide range of information and must have the discretion to decide the most appropriate course of action in any particular scenario, including where this action lies outside the powers and procedures conferred by Clause 110; for instance, an initial period of voluntary engagement. This is an in extremis power. It is essential that we balance users’ rights with the need to enable a strong response, so Ofcom must be able to assess whether any alternative, less intrusive measures would effectively reduce the level of child sexual exploitation and abuse or terrorist content occurring on a service before issuing a notice.
I hope that that provides reassurance to noble Lords on the amendments in this group, and I invite the noble Lord to withdraw Amendment 14.
My Lords, this has been a very useful debate and serves as a good appetite builder for lunch, which I understand we will be able to take shortly.
I am grateful to the Minister for his response and to all noble Lords who have taken part in the debate. As always, the noble Baroness, Lady Kidron, gave us a balanced view of digital rights—the right to privacy and to security—and the fact that we should be trying to advance these two things simultaneously. She was right again to remind us that this is a real problem and there is a lot we can do. I know she has worked on this through things such as metadata—understanding who is communicating with whom—which might strike that nice balance where we are not infringing on people’s privacy too grossly but are still able to identify those who wish harm on our society and in particular on our children.
The noble Baroness, Lady Bennett, was right to pick up this tension between everything, everywhere, all at once and targeted surveillance. Again, that is really interesting to tease out. I am personally quite comfortable with quite intrusive targeted surveillance. I do not know whether noble Lords have been reading the Pegasus spyware stories: I am not comfortable with some Governments placing such spyware on the phones of human rights defenders but I would be much more relaxed about the British authorities placing something similar on the phones of people who are going to plant bombs in Manchester. We need to be really honest about where we are drawing our red lines if we want to go in the direction of targeted surveillance.
The noble Lord, Lord Moylan, was right again to remind us about the importance of private conversations. I cited the example of police officers whose conversations have been exposed. Although it is hard, we should remember that if ordinary citizens want to exchange horrible racist jokes with each other and so on in private groups, that is not a matter for the state; but it is when it is somebody in a position of public authority, and we have a right to intervene there. Again, we have to remember that as long as it is not illegal people can say horrible things in private, and we should not encourage any situation where we suggest that the state would interfere unless there are legitimate grounds—for example, it is a police officer or somebody is doing something that crosses the line of legality.
The noble Baroness, Lady Fox, reminded us that it is either encrypted or it is not. That is really helpful, as things cannot be half encrypted. If a service provider makes a commitment it is critical that it is truthful. That is what our privacy law tells us. If I say, “This service is encrypted between you and the person you send the message to”, and I know that there is somebody in between who could access it, I am lying. I cannot say it is a private service unless it is truly private. We have to bear that in mind. Historically, people might have been more comfortable with fudging it, but not in 2023, when we have this raft of privacy legislation.
The noble Baroness is also right to remind us that privacy can be safety. There is almost nothing more devastating than the leaking of intimate images. When services such as iCloud move to encrypted storage that dramatically reduces the risk that somebody will get access to your intimate images if you store them there, which you are legally entitled to do. Privacy can be a critical part of an individual maintaining their own security and we should not lose that.
The noble Baroness, Lady Stowell, was right again to talk about general monitoring. I am pleased that she found the WhatsApp briefing useful. I was unable to attend but I know from previous contact that there are people doing good work and it is sad that that often does not come out. We end up with this very polarised debate, which my noble friend Lord McNally was right to remind us is unhelpful. The people south of the river are often working very closely in the public interest with people in tech companies. Public rhetoric tends to focus on why more is not being done; there are very few thanks for what is being done. I would like to see the debate move a little more in that direction.
The noble Lord, Lord Knight, opened up a whole new world of pain with VPNs, which I am sure we will come back to. I say simply that if we get the regulatory frameworks right, most people in Britain will continue to use mainstream services as long as they are allowed to be offered. If those services are regulated by the European Union under its Digital Services Act, and in the UK and the US in a similar way, they will in effect have global standards, so it will not matter where you VPN from. The scenario the noble Lord painted, which I worry about, is where those mainstream services are not available and we drive people into small, new services that are not regulated by anyone. We would then end up inadvertently driving people back to the wild west that we complain about, when most of them would prefer to use mainstream services that are properly regulated by Ofcom, the European Commission and the US authorities.
(1 year, 7 months ago)
Lords Chamber
Services already have to comply with their duties to keep children safe. If they do not comply, Ofcom has enforcement powers, set out in the Bill, which can require app stores to remove applications that are harmful to children. We think this already addresses the point, but I am happy to continue discussing it offline with the noble Lord, my noble friend and others who want to explore how. As I say, we think this is already covered. A more general duty here would risk distracting from Ofcom’s existing priorities.
My Lords, on that point, my reading of Clauses 131 to 135, where the Bill sets out the business disruption measures, is that they could be used precisely in that way. It would be helpful for the Minister responding later to clarify that Ofcom would use those business disruption measures, as the Government explicitly anticipate, were an app store, in a rogue way, to continue to list a service that Ofcom has said should not be made available to people in the United Kingdom.
I will be very happy to set that out in more detail.
Amendments 33A and 217A in the name of the noble Lord, Lord Storey, would place a new duty on user-to-user services that predominantly enable online gaming. Specifically, they would require them to have a classification certificate stating the age group for which they are suitable. We do not think that is necessary, given that there is already widespread, voluntary uptake of approval classification systems in online gaming.
My Lords, this group of government amendments relates to risk assessments; it may be helpful if I speak to them now as the final group before the dinner break.
Risk management is at the heart of the Bill’s regulatory framework. Ofcom and services’ risk assessments will form the foundation for protecting users from illegal content and content which is harmful to children. They will ensure that providers thoroughly identify the risks on their own websites, enabling them to manage and mitigate the potential harms arising from them. Ofcom will set out the risks across the sector and issue guidance to companies on how to conduct their assessments effectively. All providers will be required to carry out risk assessments, keep them up to date and update them before making a significant change to the design or operation of their service which could put their users at risk. Providers will then need to put in place measures to manage and mitigate the risks they identify in their risk assessments, including any emerging risks.
Given how crucial the risk assessments are to this framework, it is essential that we enable them to be properly scrutinised by the public. The government amendments in this group will place new duties on providers of the largest services—that is, category 1 and 2A services—to publish summaries of their illegal and child safety risk assessments. Through these amendments, providers of these services will also have a new duty to send full records of their risk assessments to Ofcom. This will increase transparency about the risk of harm on the largest platforms, clearly showing how risk is affected by factors such as the design, user base or functionality of their services. These amendments will further ensure that the risk assessments can be properly assessed by internet users, including by children and their parents and guardians, by ensuring that summaries of the assessments are publicly available. This will empower users to make informed decisions when choosing whether and how to use these services.
It is also important that Ofcom is fully appraised of the risks identified by service providers. That is why these amendments introduce duties for both category 1 and 2A services to send their records of these risk assessments, in full, to Ofcom. This will make it easier for Ofcom to supervise compliance with the risk assessment duties, as well as other duties linked to the findings of the risk assessments, rather than having to request the assessments from companies under its information-gathering powers.
These amendments also clarify that companies must keep a record of all aspects of their risk assessments, which strengthens the existing record-keeping duties on services. I hope that noble Lords will welcome these amendments. I beg to move.
My Lords, it is risky to stand between people and their dinner, but I rise very briefly to welcome these amendments. We should celebrate the good stuff that happens in Committee as well as the challenging stuff. The risk assessments are, I think, the single most positive part of this legislation. Online platforms already do a lot of work trying to understand what risks are taking place on their platforms, which never sees the light of day except when it is leaked by a whistleblower and we then have a very imperfect debate around it.
The fact that platforms will have to do a formal risk assessment and share it with a third-party regulator is huge progress; it will create a very positive dynamic. The fact that the public will be able to see those risk assessments and make their own judgments about which services to use—according to how well they have done them—is, again, a massive public benefit. We should welcome the fact that risk assessments are there and the improvements that this group of amendments makes to them. I hope that was short enough.
The UK holds the number one spot for life sciences investment in Europe, second only to the United States globally. However, the noble Lord is right about ensuring that we have the skilled talent pool across the industry and from academia and our health service to continue that growth. The Life Sciences Vision sets out our commitment to developing a strong talent pool across all those areas and the Government have developed several skills programmes that are delivering against our commitments by developing a pipeline of onshore talent, including through supporting apprenticeships and improving access to talent from overseas.
My Lords, the report referred to in the Question highlights the incredible success of Ireland in establishing itself as a global pharmaceutical manufacturing hub. Can the Minister explain what steps the Government are taking to learn from Ireland’s success and apply some of those lessons to the United Kingdom?
Of course, we keep an eye on what is happening around the world to ensure that we maintain our competitive edge but, as I said, we are second only to the United States for life sciences investment. We are supported by a mature and sophisticated capital market and the second biggest hub for private equity and venture capital after the US, so we have many advantages to be proud of as well.
(1 year, 10 months ago)
Lords Chamber
Ofcom does have an important role to play here as the independent regulator, but, as I say, mindful of the particular challenges that households are facing, my right honourable friend the Secretary of State spoke directly to companies, asking them to consider very carefully the decisions they are making and the impact on their customers.
My Lords, was the Minister struck, as I was, by the observation in Ofcom’s December pricing trends report that there are millions of consumers who are out of contract, and so free to switch, but have not yet done so? Does he agree that these people could make significant savings, often without having to switch at all, as many providers will drop their prices as soon as you ring and threaten to leave? What are the Government doing to make this group aware that they can do this?
Yes, it is very striking. Many people could be saving money and are not aware of it. That is why it is important that contracts are clear, but it also highlights the importance of consumer advice groups and, indeed, debates such as this, to draw the attention of people to the contracts they have signed.