Online Safety Bill Debate
My Lords, I will make some arguments in favour of Amendment 191A, in the name of the noble Baroness, Lady Kidron, and inject some notes of caution around Amendment 186A.
On Amendment 191A, it has been my experience that people who frequently investigate things that have happened on online services do it well, and that well-formed requests are critical to making this work effectively. This was the case with law enforcement: when an individual police officer is investigating something online for the first time, they often ask the wrong questions. They do not understand what they can and cannot get. It is like everything in life: the more you do it, the better you get at it.
Fortunately, in a sense, most coroners will only very occasionally have to deal with these awful circumstances where they need data related to the death of a child. At that point, they are going to be very dependent on Ofcom—which will be dealing with the companies day in and day out across a range of issues—for its expertise. Therefore, it makes absolute sense that Ofcom’s expertise should be distributed widely and that coroners—at the point where they need to access this information—should be able to rely on that. So Amendment 191A is very well intended and, from a practical point of view, very necessary if we are going to make this new system work as I know the noble Baroness, Lady Kidron, and I would like to see it work.
On Amendment 186A around consumer law, I can see the attraction of this, as well as some of the read-across from the United States. A lot of the enforcement against online platforms in the US takes place through the Federal Trade Commission precisely in this area of consumer law and looking at unfair and deceptive practices. I can see the attraction of seeking to align with European Union law, as the noble Lord, Lord Moylan, argued we should be doing with respect to consumer law. However, I think this would be much better dealt with in the context of the digital markets Bill and it would be a mistake to squeeze it in here. My reasons for this are about both process and substance.
In terms of process, we have not done the impact assessment on this. It is quite a major change, for two reasons. First, it could potentially have a huge impact in terms of legal costs and the way businesses will have to deal with that—although I know nobody is going to get too upset if the impact assessment says there will be a significant increase in legal costs for category 1 companies. However, we should at least flesh these things out when we are making regulations and have them in an impact assessment before going ahead and doing something that would have a material impact.
Secondly, in process terms, there are some really interesting questions about the way this might affect the market. The consumer law we have does exclude services that are offered for free, because so much of consumer law is about saying, “If the goods are not delivered correctly, you get your money back”. With free services, we are clearly dealing with a different model, so the notion that we have a law that is geared towards making sure you either get the goods or you get the money may not be the best fit. To try to shoehorn in these free-at-the-point-of-use services may not be the best way to do it, even from a markets and consumer point of view. Taking our time to think about how to get this right would make sense.
More fundamentally, in terms of the substance, we need to recognise that, as a result of the Online Safety Bill, Ofcom will be requiring regulated services to rewrite their terms of service in quite a lot of detail. We see this throughout the Bill. We are going to have to do all sorts of things—we will debate other amendments in this area today—to make sure that their terms of service are conformant with what we want from them in this Bill. They are going to have to redo their complaints and redress mechanisms. All of this is going to have to change and Ofcom is going to be the regulator that tells them how to do it; that is what we are asking Ofcom to tell them to do.
My fundamental concern here, if we introduce another element, is that there is a whole different structure under consumer law where you might go to local trading standards or the CMA, or you might launch a private action. In many cases, this may overlap. The overlap is where consumer law states that services must be provided with reasonable care and skill and within a reasonable time. That sounds great, but it is also what the Online Safety Bill is going to be doing. We do not want consumer law saying, “You need to write your terms of service this way and handle complaints this way”, and then Ofcom coming along and saying, “No, you must write your terms of service that way and handle complaints that way”. We will end up in a mess. So I just think that, from a practical point of view, we should be very focused in this Bill on getting all of this right from an Online Safety Bill point of view, and very cautious about introducing another element.
Perhaps one of the attractions of the consumer law point for those who support the amendment is that it says, “Your terms must be fair”. It is the US model; you cannot have unfair terms. Again, I can imagine a scenario in which somebody goes to court and tries to get the terms struck down because they are unfair but the platform says, “They’re the terms Ofcom told me to write. Sort this out, please, because Ofcom is saying I need to do this but the courts are now saying the thing I did was unfair because somebody feels that they were badly treated”.
Does the noble Lord accept that that is already a possibility? You can bring an action in contract law against them on the grounds that it is an unfair contract. This could happen already. It is as if the noble Lord is not aware that the possibility of individual action for breach of contract is already built into Clause 65. This measure simply supplements it.
I am certainly aware that it is there but, again, the noble Lord has just made the point himself: this supplements it. The intent of the amendment is to give consumers more rights under this additional piece of legislation; otherwise, why bother with the amendment at all? The noble Lord may be arguing against himself in saying that this is unnecessary and, at the same time, that we need to make the change. If we make the change, it is, in a sense, a material change to open the door to more claims being made under consumer law that terms are unfair. As I say, we may want this outcome to happen eventually, but I find it potentially conflicting to do it precisely at a time when we are getting Ofcom to intervene much more closely in setting those terms. I am simply arguing, “Let’s let that regime settle down”.
The net result and rational outcome—again, I am speaking to my noble friend’s Amendment 253 here—may be that other regulators end up deferring to Ofcom. If Ofcom is the primary regulator and we have told it, under the terms of the Online Safety Bill, “You must require platforms to operate in this way, handle complaints in this way and have terms that do these things, such as excluding particular forms of language and in effect outlawing them on platforms”, the other regulators will eventually end up deferring to it. All I am arguing is that, at this stage, it is premature to try to introduce a second, parallel route for people to seek changes to terms or different forms of redress, however tempting that may be. So I am suggesting a note of caution. It is not that we are starting from ground zero—people have routes to go forward today—but I worry about introducing something that I think people will see as material at this late stage, having not looked at the full impact of it and potentially running in conflict with everything else that we are trying to do in this legislation.
My Lords, transparency and accountability are at the heart of the regulatory framework that the Bill seeks to establish. It is vital that Ofcom has the powers it needs to require companies to publish online safety information and to scrutinise their systems and processes, particularly their algorithms. The Government agree about the importance of improving data sharing with independent researchers while recognising the nascent evidence base and the complexities of this issue, which we explored in Committee. We are pleased to be bringing forward a number of amendments that strengthen platforms’ transparency, confer on Ofcom new powers to assess how providers’ algorithms work, accelerate the development of the evidence base regarding researchers’ access to information, and require Ofcom to produce guidance on this issue.
Amendment 187 in my name makes changes to Clause 65 on category 1 providers’ duties to create clear and accessible terms of service and apply them consistently and transparently. The amendment tightens the clause to ensure that all the providers’ terms through which they might indicate that a certain kind of content is not allowed on their services are captured by these duties.
Amendment 252G is a drafting change, removing a redundant paragraph from the Bill in relation to exceptions to the legislative definition of an enforceable requirement in Schedule 12.
In relation to transparency, government Amendments 195, 196, 198 and 199 expand the types of information that Ofcom can require category 1, 2A and 2B providers to publish in their transparency reports. With thanks to the noble Lord, Lord Stevenson of Balmacara, for his engagement on this issue, we are pleased to table these amendments, which will allow Ofcom to require providers to publish information relating to the formulation, development and scope of user-to-user service providers’ terms of service and search service providers’ public statements of policies and procedures. This is in addition to the existing transparency provision regarding their application.
Amendments 196 and 199 would enable Ofcom to require providers to publish more information in relation to algorithms, specifically information about the design and operation of algorithms that affect the display, promotion, restriction, discovery or recommendation of content subject to the duties in the Bill. These changes will enable greater public scrutiny of providers’ terms of service and their algorithms, providing valuable information to users about the platforms that they are using.
As well as publicly holding platforms to account, the regulator must be able to get under the bonnet and scrutinise the algorithms’ functionalities and the other systems and processes that they use. Empirical tests are a standard method for understanding the performance of an algorithmic system. They involve taking a test data set, running it through an algorithmic system and observing the output. These tests may be relevant for assessing the efficacy and wider impacts of content moderation technology, age-verification systems and recommender systems.
Government Amendments 247A, 250A, 252A, 252B, 252C, 252D, 252E and 252F will ensure that Ofcom has the powers to enable it to direct and observe such tests remotely. This will significantly bolster Ofcom’s ability to assess how a provider’s algorithms work, and therefore to assess its compliance with the duties in the Bill. I understand that certain technology companies have voiced some concerns about these powers, but I reassure your Lordships that they are necessary and proportionate.
The powers will be subject to a number of safeguards. First, they are limited to viewing information. Ofcom will be unable to remotely access or interfere with the service for any other purpose when exercising the power. These tests would be performed offline, meaning that they would not affect the services’ provision or the experience of users. Assessing systems, processes, features and functionalities is the focus of the powers. As such, individual user data and content are unlikely to be the focus of any remote access to view information.
Additionally, the power can be used only where it is proportionate in the exercise of Ofcom’s functions—for example, when investigating whether a regulated service has complied with relevant safety duties. A provider would have a right to bring a legal challenge against Ofcom if it considered that a particular exercise of the power was unlawful. Furthermore, Ofcom will be under a legal obligation to ensure that the information gathered from services is protected from disclosure, unless clearly defined exemptions apply.
The Bill contains no restriction on services making the existence and detail of the information notice public. Should a regulated service wish to challenge an information notice served to it by Ofcom, it would be able to do so through judicial review. In addition, the amendments place no restriction on the use of this power being made known to members of the public through a request, such as one under the Freedom of Information Act—noting that, under Section 393 of the Communications Act, Ofcom will not be able to disclose information it has obtained through its exercise of these powers without the provider’s consent, unless permitted for specific, defined purposes. These powers are necessary and proportionate and will ensure that Ofcom has the tools to understand features and functionalities and the risks associated with them, and therefore the tools to assess companies’ compliance with the Bill.
Finally, I turn to researchers’ access to data. We recognise the valuable work of researchers in improving our collective understanding of the issues we have debated throughout our scrutiny of the Bill. However, we are also aware that we need to develop the evidence base to ensure that any sharing of sensitive information between companies and researchers can be done safely and securely. To this end, we are pleased to table government Amendments 272B, 272C and 272D.
Government Amendment 272B would require Ofcom to publish its report into researcher access to information within 18 months, rather than two years. This report will provide the evidence base for government Amendments 272C and 272D, which would require Ofcom to publish guidance on this issue. This will provide valuable, evidence-based guidance on how to improve access for researchers safely and securely.
That said, we understand the calls for further action in this area. The Government will explore this issue further and report back to your Lordships’ House on whether further measures to support researchers’ access to data are required—and if so, whether they could be implemented through other legislation, such as the Data Protection and Digital Information Bill. I beg to move.
My Lords, Amendment 247B in my name was triggered by government Amendment 247A, which the Minister just introduced. I want to explain it, because the government amendment is quite late—it has arrived on Report—so we need to look in some detail at what the Government have proposed. The phrasing that has caused so much concern, which the Minister has acknowledged, is that Ofcom will be able to
“remotely access the service provided by the person”.
It is those words—“remotely access”—which are trigger words for anyone who lived through the Snowden disclosures, where everyone was so concerned about remote access by government agencies to precisely the same services we are talking about today: social media services.
My Lords, I am grateful to noble Lords for their contributions in this group. On the point made by the noble Lord, Lord Knight of Weymouth, on why we are bringing in some of these powers now, I say that the power to direct and observe algorithms was previously implicit within Ofcom’s information powers and, where a provider has UK premises, under powers of entry, inspection and audit under Schedule 12. However, the Digital Markets, Competition and Consumers Bill, which is set to confer similar powers on the Competition and Markets Authority and its digital markets unit, makes these powers explicit. We wanted to ensure that there was no ambiguity over whether Ofcom had equivalent powers in the light of that. Furthermore, the changes we are making ensure that Ofcom can direct and observe algorithmic assessments even if a provider does not have relevant premises or equipment in the UK.
I am grateful to the noble Lord, Lord Allan of Hallam, for inviting me to re-emphasise points and allay the concerns that have been triggered, as his noble friend Lord Clement-Jones put it. I am happy to set out again a bit of what I said in opening this debate. The powers will be subject to a number of safeguards. First, they are limited to “viewing information”. They can be used only where they are proportionate in the exercise of Ofcom’s functions, and a provider would have the right to bring a legal challenge against Ofcom if it considered that a particular exercise of the power was done unlawfully. Furthermore, Ofcom will be under a legal obligation to ensure that the information gathered from services is protected from disclosure, unless clearly defined exemptions apply.
These are not secret powers, as the noble Lord rightly noted. The Bill contains no restriction on services making the existence and detail of the information notice public. If a regulated service wished to challenge an information notice served to it by Ofcom, it would be able to do so through judicial review. I also mentioned the recourse that people have through existing legislation, such as the Freedom of Information Act, to give them safeguards, noting that, under Section 393 of the Communications Act, Ofcom will not be able to disclose information that it has obtained through its exercise of these powers without the provider’s consent unless that is permitted for specific, defined purposes.
The noble Lord’s Amendment 247B seeks to place further safeguards on Ofcom’s use of its new power to access providers’ systems remotely to observe tests. While I largely agree with the intention behind it, there are already a number of safeguards in place for the use of that power, including in relation to data protection, legally privileged material and the disclosure of information, as I have outlined. Ofcom will not be able to gain remote access simply for exploratory or fishing purposes, and indeed Ofcom expects to have conversations with services about how to provide the information requested.
Furthermore, before exercising the power, Ofcom will be required to issue an information notice specifying the information to be provided, setting out the parameters of access and why Ofcom requires the information, among other things. Following the receipt of an information notice, a notice requiring an inspection, or an audit notice, if a company has identified an obvious security risk in Ofcom exercising the power as set out in the notice, it may not be proportionate for Ofcom to do so. As set out in Ofcom’s duties, Ofcom must have regard to the principles under which regulatory activities should be proportionate and targeted only at cases where action is needed.
In line with current practice, we anticipate Ofcom will issue information notice requests in draft form to identify and address any issues, including in relation to security, before the information notice is issued formally. Ofcom will have a legal duty to exercise its remote access powers in a way that is proportionate, ensuring that undue burdens are not placed on businesses. In assessing proportionality in line with this requirement, Ofcom would need to consider the size and resource capacity of a service when choosing the most appropriate way of gathering information, and whether there was a less onerous method of obtaining the necessary information to ensure that the use of this power is proportionate. As I said, the remote access power is limited to “viewing information”. Under this power, Ofcom will be unable to interfere or access the service for any other purpose.
In practice, Ofcom will work with services during the process. It is required to specify, among other things, the information to be provided, which will set the parameters of its access, and why it requires the information, which will explain the link between the information it seeks and the online safety function that it is exercising or deciding whether to exercise.
As noble Lords know, Ofcom must comply with the UK’s data protection law. As we have discussed in relation to other issues, it is required to act compatibly with the European Convention on Human Rights, including Article 8 privacy rights. In addition, under Clause 91(7), Ofcom is explicitly prohibited from requiring the provision of legally privileged information. It will also be under a legal obligation to ensure that the information gathered from services is protected from disclosure unless clearly defined exemptions apply, such as those under Section 393(2) of the Communications Act 2003—for example, the carrying out of any of Ofcom’s functions. I hope that provides reassurance to the noble Lord, Lord Allan, and the noble Baroness, Lady Fox, who raised these questions.
I am grateful to the Minister. That was helpful, particularly the description of the process and the fact that drafts have to be issued early on. However, it still leaves open a couple of questions, one of which was very helpfully raised by the noble Lord, Lord Knight. We have in Schedule 12 this other set of protections that could be applied. There is a genuine question as to why this has been put in this place and not there.
The second question is to dig a little more into the question of what happens when there is a dispute. The noble Lord, Lord Moylan, pointed out that if you have created a backdoor then you have created a backdoor, and it is dangerous. If we end up in a situation where a company believes that what it is being asked to do by Ofcom is fundamentally problematic and would create a security risk, it will not be good enough to open up the backdoor and then have a judicial review. It needs to be able to say no at that stage, yet the Bill says that it could be committing a serious criminal offence by failing to comply with an information notice. We want some more assurances, in some form, about what would happen in a scenario where a company genuinely and sincerely believes that what Ofcom is asking for is inappropriate and/or dangerous and it wants not to have to offer it unless and until its challenge has been looked at, rather than having to offer it and then later judicially review a decision. The damage would already have been done by opening up an inappropriate backdoor.
A provider would have a right to bring a legal challenge against Ofcom if it considered that a particular exercise of the remote access power was unlawful. Given the serious nature of the issues under consideration, I am sure that would be looked at swiftly, but I will write to the noble Lord on the anticipated timelines while such a judicial review was pending.
We do not think that six weeks is enough time for the evidence base to develop sufficiently, our assessment being that to endow the Secretary of State with that power at this point is premature.
Amendment 262AA would require Ofcom to consider whether it is appropriate to require providers to take steps to comply with Ofcom’s researcher access guidance when including a requirement to take steps in a confirmation decision. This would be inappropriate because the researcher access provisions are not enforceable requirements; as such, compliance with them should not be subject to enforcement by the regulator. Furthermore, enforcement action may relate to a wide variety of very important issues, and the steps needed should be sufficient to address a failure to comply with an enforceable requirement. Singling out compliance with researcher access guidance alone risks implying that this will be adequate to address core failures.
Amendment 272AB would require Ofcom to give consideration to whether greater access to data could be achieved through legal requirements or incentives for regulated services. I reassure noble Lords that the scope of Ofcom’s report will already cover how greater access to data could be achieved, including through enforceable requirements on providers.
Amendment 272E would require Ofcom to take a provider’s compliance with Ofcom’s guidance on researcher access to data into account when assessing risks from regulated services and determining whether to take enforcement action and what enforcement action to take. However, we do not believe that this is a relevant factor for consideration of these issues. I hope noble Lords will agree that whether or not a company has enabled researcher access to its data should not be a mitigating factor against Ofcom requiring companies to deal with terrorism or child sexual exploitation or abuse content, for example.
On my noble friend Lord Bethell’s remaining Amendments 272BA, 273A and 273B, the first of these would require Ofcom to publish its report on researchers’ access to information within six months. While six months would not be deliverable given other priorities and the complexity of this issue, the government amendment to which I have spoken would reduce the timelines from two years to 18 months. That recognises the importance of the issue while ensuring that Ofcom can deliver the key priorities in establishing the core parts of the regulatory framework; for example, the illegal content and child safety duties.
Just on the timescale, one of the issues that we talked about in Committee was the fact that there needs to be some kind of mechanism created, with a code of practice with reference to data protection law and an approving body to approve researchers as suitable to take information; the noble Baroness, Lady Kidron, referred to the DSA process, which the European Union has been working on. I hope the Minister can confirm that Ofcom might get moving on establishing that. It is not dependent on there being a report in 18 months; in fact, you need to have it in place when you report in 18 months, which means you need to start building it now. I hope the Minister would want Ofcom, within its existing framework, to be encouraging the creation of that researcher approval body and code of practice, not waiting to start that process in 18 months’ time.
I will continue my train of thought on my noble friend’s amendments, which I hope will cover that and more.
My noble friend’s Amendment 273A would allow Ofcom to appoint approved independent researchers to access information. Again, given the nascent evidence base here, it is important to focus on understanding these issues before we commit to a researcher access framework.
Under the skilled persons provisions, Ofcom will already have the powers to appoint a skilled person to assess compliance with the regulatory framework; that includes the ability to leverage the expertise of independent researchers. My noble friend’s Amendment 273B would require Ofcom to produce a code of practice on access to data by researchers. The government amendments I spoke to earlier will require Ofcom to produce guidance on that issue, which will help to promote information sharing in a safe and secure way.
To the question asked by the noble Lord, Lord Allan: yes, Ofcom can start the process and do it quickly. The question here is really about the timeframe in which it does so. As I said in opening, we understand the calls for further action in this area.
I am happy to say to my noble friend Lord Bethell, to whom we are grateful for his work on this and the conversations we have had, that we will explore the issue further and report back on whether further measures to support researchers’ access to data are required and, if so, whether they can be implemented through other legislation, such as the Data Protection and Digital Information (No.2) Bill.
My Lords, Clause 158 is one of the more mysterious clauses in the Bill and it would greatly benefit from a clear elucidation by the Minister of how it is intended to work to reduce harm. I thank him for the email he sent me this afternoon as we started on the Bill; I had only a short time to consider it but I very much hope that he will put its content on the record.
My amendment is designed to ask how the Minister envisages using the power to direct if, say, there is a new contagious disease or riots, and social media is a major factor in the spread of the problem. I am trying to construct some kind of hypothetical situation through which the Minister can say how the power will be used. Is the intention, for example, to set Ofcom the objective of preventing the spread of information on regulated services injurious to public health or safety on a particular network for six months? The direction then forces the regulator and the social media companies to confront the issue and perhaps publicly shame an individual company into using its tools to slow the spread of disinformation. The direction might give Ofcom powers to gather sufficient information from the company to make directions to it to tackle the problem.
If that is envisaged, which of Ofcom’s media literacy powers does the Minister envisage being used? Might it be Section 11(1)(e) of the Communications Act 2003, which talks about encouraging
“the development and use of technologies and systems for regulating access to such material, and for facilitating control over what material is received, that are both effective and easy to use”?
By this means, Ofcom might encourage a social media company to regulate access to and control over the material that is a threat.
Perhaps the Minister could set out clearly how he intends all this to work, because on a straight reading of Clause 158, we on these Benches have considerable concerns. The threshold for direction is low—merely having
“reasonable grounds for believing that circumstances exist”—
and there is no sense here of the emergency that the then Minister, Mr Philp, cited in the Commons Public Bill Committee on 26 May 2022, nor even of the exceptional circumstances in Amendment 138 to Clause 39, which the Minister tabled recently. The Minister is not compelled by the clause to consult experts in public health, safety or national security. The Minister can set any objectives for Ofcom, it seems. There is no time limit for the effect of the direction and it seems that the direction can be repeatedly extended with no limit. If the Minister directs because they believe there is a threat to national security, we will have the curious situation of a public process being initiated for reasons the Minister is not obliged to explain.
Against this background, there does not seem to be a case for breaching the international convention of the Government not directing a media regulator. Independence of media regulators is the norm in developed democracies, and the UK has signed many international statements in this vein. As recently as April 2022, the Council of Europe stated:
“Media and communication governance should be independent and impartial to avoid undue influence on policymaking or”
the discriminatory and
“preferential treatment of powerful groups”,
including those with significant political or economic power. The Secretary of State, by contrast, has no powers over Ofcom regarding the content of broadcast regulation and has limited powers to direct over radio spectrum and wireless, but not content. Ofcom’s independence in day-to-day decision-making is paramount to preserving freedom of expression. There are insufficient safeguards in this clause, which is why I argue that it should not stand part of the Bill.
I will be brief about Clause 159 because, by and large, we went through it in our debate on a previous group. Now that we can see the final shape of the Bill, it really does behove us to stand back and see where the balance has settled on Ofcom’s independence and whether this clause needs to stand part of the Bill. The Secretary of State has extensive powers under various other provisions in the Bill. The Minister has tabled welcome amendments to Clause 39, which have been incorporated into the Bill, but Clause 155 still allows the Secretary of State to issue a “statement of strategic priorities”, including specific outcomes, every five years.
Clause 159 is in addition to this comprehensive list, but the approach in the clause is incredibly broad. We have discussed this, and the noble Lord, Lord Moylan, has tabled an amendment that would require parliamentary scrutiny. The Secretary of State can issue guidance to Ofcom on more or less anything encompassed by the exercise of its functions under this Act, with no consultation of the public or Parliament prior to making such guidance. The time limit for producing strategic guidance is three years rather than five. Even if it is merely “have regard” guidance, it represents an unwelcome intervention in Ofcom going about its business. If the Minister responds that the guidance is merely “to have regard”, I will ask him to consider this: why have it at all, then, when there are so many other opportunities for the Government to intervene? For the regulated companies, it represents a regulatory hazard of interference in independent regulation and a lack of stability. As the noble Lord, Lord Bethell, said in Committee, a clear benefit of regulatory independence is that it reduces lobbying of the Minister by powerful corporate interests.
Now that we can see it in context, I very much hope that the Minister will agree that Clause 159 is a set of guidance too many that compromises Ofcom’s independence and should not stand part of the Bill.
My Lords, I will add to my noble friend’s call for us to consider whether Clause 158 should be struck from the Bill as an unnecessary power for the Secretary of State to take. We have discussed powers for the Secretary of State throughout the Bill, with some helpful improvements led by the noble Baroness, Lady Stowell. This one jars in particular because it is about media literacy; some of the other powers related to whether the Secretary of State could intervene on the codes of practice that Ofcom would issue. The core question is whether we trust Ofcom’s discretion in delivering media literacy and whether we need the Secretary of State to have any kind of power to intervene.
I single out media literacy because the clue is in the name: literacy is a generic skill that you acquire for dealing with the online world; it is not about any specific text. Literacy is a broader set of skills, yet Clause 158 suggests that, in response to specific forms of content or a specific crisis happening in the world, the Secretary of State would want to take this power to direct media literacy efforts. To take something specific and immediate to direct something that is generic and long-term jars and seems inappropriate.
I have a series of questions for the Minister to elucidate why this power should exist at all. It would be helpful to have an example of what kind of “public statement notice”—to use the language in the clause—the Government might want to issue that Ofcom would not come up with on its own. Part of the argument we have been presented with is that, somehow, the Government might have additional information, but it seems quite a stretch that they could come up with that. In an area such as national security, my experience has been that companies often have a better idea of what is going on than anybody in government.
Thousands of people out there in the industry are familiar with APT 28 and APT 29 which, as I am sure all noble Lords know, are better known by their names Fancy Bear and Cozy Bear. These are agents of the Russian state that put out misinformation. There is nothing that UK agencies or the Secretary of State might know about them that is not already widely known. I remember talking about the famous troll factory run by Prigozhin, the Internet Research Agency, with people in government in the context of Russian interference—they would say “Who?” and have to go off and find out. In dealing with threats such as that, you certainly want a media literacy campaign, produced between the people in the companies and Ofcom, which tells you about these troll agencies and how they operate and gives warnings to the public, but I struggle to see why you need the Secretary of State to intervene as opposed to allowing Ofcom’s experts to work with company experts and come up with a strategy to deal with those kinds of threat.
The other example cited of an area where the Secretary of State might want to intervene is public health and safety. It would be helpful to be specific; had they had it, how would the Government have used this power during the pandemic in 2020 and 2021? Does the Minister have examples of what they were frustrated about and would have done with these powers that Ofcom would not do anyway in working with the companies directly? I do not see that they would have had secret information which would have meant that they had to intervene rather than trusting Ofcom and the companies to do it.
Perhaps there has been an interdepartmental workshop between DHSC, DCMS and others to cook up this provision. I assume that Clause 158 did not come from nowhere. Someone must have thought, “We need these powers in Clause 158 because we were missing them previously”. Are there specific examples of media literacy campaigns that could not be run, where people in government were frustrated and therefore wanted the power to direct them in future? It would be really helpful to hear about them so that we can understand exactly how the Clause 158 powers will be used before we allow this additional power on to the statute book.
In the view of most people in this Chamber, the Bill as a whole quite rightly grants the Government and Ofcom, the independent regulator, a wide range of powers. Here we are looking specifically at where the Government will, in a sense, overrule the independent regulator by giving it orders to do something it had not thought of doing itself. It is incumbent on the Government to flesh that out with some concrete examples so that we can understand why they need this power. At the moment, as noble Lords may be able to tell, these Benches are not convinced that they do.
My Lords, I will be very brief. The danger with Clause 158 is that it discredits media literacy as something benign or anodyne; it will become a political plaything. I am already sceptical, but if ever there was anything to add to this debate then it is that.
My Lords, I am grateful for the opportunity to set out the need for Clauses 158 and 159. The amendments in this group consider the role of government in two specific areas: the power for the Secretary of State to direct Ofcom about its media literacy functions in special circumstances and the power for the Secretary of State to issue non-binding guidance to Ofcom. I will take each in turn.
Amendment 219 relates to Clause 158, on the Secretary of State’s power to direct Ofcom in special circumstances. These include where there is a significant threat to public safety, public health or national security. This is a limited power to enable the Secretary of State to set specific objectives for Ofcom’s media literacy activity in such circumstances. It allows the Secretary of State to direct Ofcom to issue public statement notices to regulated service providers, requiring providers to set out the steps they are taking to address the threat. The regulator and online platforms are thereby compelled to take essential and transparent actions to keep the public sufficiently informed during crises. The powers ensure that the regulatory framework is future-proofed and well equipped to respond in such circumstances.
As the noble Lord, Lord Clement-Jones, outlined, I corresponded with him very shortly before today’s debate and am happy to set out a bit more detail for the benefit of the rest of the House. As I said to him by email, we expect the media literacy powers to be used only in exceptional circumstances, where it is right that the Secretary of State should have the power to direct Ofcom. The Government see the need for an agile response to risk in times of acute crisis, such as we saw during the Covid-19 pandemic or in relation to the war in Ukraine. There may be a situation in which the Government have access to information, through the work of the security services or otherwise, which Ofcom does not. This power enables the Secretary of State to make quick decisions when the public are at risk.
Our expectation is that, in exceptional circumstances, Ofcom would already be taking steps to address harm arising from the provision of regulated services through its existing media literacy functions. However, these powers will allow the Secretary of State to step in if necessary to ensure that the regulator is responding effectively to these sudden threats. It is important to note that, for transparency, the Secretary of State will be required to publish the reasons for issuing a direction to Ofcom in these circumstances. This requirement does not apply should the circumstances relate to national security, to protect sensitive information.
The noble Lord asked why we have the powers under Clause 158 when they do not exist in relation to broadcast media. We believe that these powers are needed with respect to social media because, as we have seen during international crises such as the Covid-19 pandemic, social media platforms can sadly serve as hubs for low-quality, user-generated information that is not required to meet journalistic standards, and that can pose a direct threat to public health. By contrast, Ofcom’s Broadcasting Code ensures that broadcast news, in whatever form, is reported with due accuracy and presented with due impartiality. Ofcom can fine, or ultimately revoke a licence to broadcast in the most extreme cases, if that code is breached. This means that regulated broadcasters can be trusted to strive to communicate credible, authoritative information to their audiences in a way that social media cannot.
We established in our last debate that the notion of a recognised news publisher will go much broader than a broadcaster. I put it to the Minister that we could end up in an interesting situation where one bit of the Bill says, “You have to protect content from these people because they are recognised news publishers”. Another bit, however, will be a direction from the Secretary of State saying that, to deal with this crisis, we are going to give a media literacy direction that says, “Please get rid of all the content from this same news publisher”. That is an anomaly that we risk setting up with these different provisions.
On the previous group, I raised the issue of legal speech that was labelled as misinformation and removed in the extreme situation of a public health panic. This was seemingly because the Government were keen that particular public health information was made available. Subsequently, we discovered that those things were not necessarily untrue and should not have been removed. Is the Minister arguing that this power is necessary for the Government to direct that certain things are removed on the basis that they are misinformation—in which case, that is a direct attempt at censorship? After we have had a public health emergency in which “facts” have been contested and shown to not be as black and white or true as the Government claimed, saying that the power will be used only in extreme circumstances does not fill me with great confidence.
My Lords, the noble Lord, Lord Moylan, and the noble Baroness, Lady Fox, have a very strong point to make with this amendment. I have tried in our discussions to bring some colour to the debate from my own experience so I will tell your Lordships that in my former professional life I received representations from many Ministers in many countries about the content we should allow or disallow on the Facebook platform that I worked for.
That was a frequent occurrence in the United Kingdom and extended to Governments of all parties. Almost as soon as I moved into the job, we had a Labour Home Secretary come in and suggest that we should deal with particular forms of content. It happened through the coalition years. Indeed, I remember meeting the Minister’s former boss at No. 10 in Davos, of all places, to receive some lobbying about what the UK Government thought should be on or off the platform at that time. In that case it was to do with terrorist content; there was nothing between us in terms of wanting to see that content gone. I recognise that this amendment is about misinformation and disinformation, which is perhaps a more contentious area.
As we have discussed throughout the debate, transparency is good. It keeps everybody on the straight and narrow. I do not see any reason why the Government should not be forthcoming. My experience was that the Government would often want to go to the Daily Telegraph, the Daily Mail or some other upright publication and tell it how they had been leaning on the internet companies—it was part of their communications strategy and they were extremely proud of it—but there will be other circumstances where they are doing it more behind the scenes. Those are the ones we should be worried about.
If those in government have good reason to lean on an internet company, fine—but knowing that they have to be transparent about it, as in this amendment, will instil a certain level of discipline that would be quite healthy.
My Lords, clearly, there is a limited number of speakers in this debate. We should thank the noble Lord, Lord Moylan, for tabling this amendment because it raises a very interesting point about the transparency—or not—of the Counter Disinformation Unit. Of course, it is subject to an Oral Question tomorrow as well, which I am sure the noble Viscount will be answering.
There is some concern about the transparency of the activities of the Counter Disinformation Unit. In its report, Ministry of Truth, which deals at some length with the activities of the Counter Disinformation Unit, Big Brother Watch says:
“Giving officials an unaccountable hotline to flag lawful speech for removal from the digital public square is a worrying threat to free speech”.
Its complaint is not only about oversight; it is about the activities. Others such as Full Fact have stressed the fact that there is little or no parliamentary scrutiny. For instance, freedom of information requests have been turned down and Written Questions which try to probe what the activities of the Counter Disinformation Unit are have had very little response. As it says, when the Government
“lobby internet companies about content on their platforms … this is a threat to freedom of expression”.
We need proper oversight, so I am interested to hear the Minister’s response.
My Lords, we are coming to some critical amendments on a very important issue relatively late in the Bill, having had relatively little discussion on it. It is not often that committees of this House sit around and say, “We need more lawyers”, but this is one of those areas where that was true.
Notwithstanding the blushes of my noble friend on the Front Bench here, interestingly we have not had in our debate significant input from people who understand the law of freedom of expression and wish to contribute to our discussions on how online platforms should deal with questions of the legality of content. These questions are crucial to the Bill, which, if it does nothing else, tells online platforms that they have to be really robust in taking action against content that is deemed to be illegal under a broad swathe of law in the United Kingdom that criminalises certain forms of speech.
We are bearing down heavily on providers, and we are saying to them, “If you fail at this, you’re in big trouble”. The pressure to deal with illegal content will be huge, yet illegality itself covers a broad spectrum, from child sexual exploitation and abuse material, where in many cases it is obvious from the material that it is illegal and there is strict liability—there is never any excuse for distributing that material—and pretty much everyone everywhere in the world would agree that it should be criminalised and removed from the internet, through to things that we discussed in Committee, such as public order offences, where, under some interpretations of Section 5 of the Public Order Act, swearing at somebody or looking at them in a funny way in the street could be deemed alarming and harassing. There are people who interpret public order offences in this very broad sense, where there would be a lot less agreement about whether a specific action is or is not illegal and whether the law is correctly calibrated or being used oppressively. So we have this broad spectrum of illegality.
The question we need to consider is where we want providers to draw the line. They will be making judgments on a daily basis. I said previously that I had to make those judgments in my job. I would write to lawyers and they would send back an expensive piece of paper that said, “This is likely to be illegal”, or, “This is likely not to be illegal”. It never said that it was definitely illegal or definitely not illegal, apart from the content I have described, such as child sexual abuse. You would not need to send that, but you would send the bulk of the issues that we are dealing with to a lawyer. If you sent it to a second lawyer, you would get another “likely” or “not likely”, and you would have to come to some kind of consensus view as to the level of risk you wished to take on that particular form of speech or piece of content.
This is really challenging in areas such as hate speech, where exactly the same language has a completely different meaning in different contexts, and may or may not be illegal. Again, to give a concrete example, we would often deal with anti-Semitic content being shared by anti-anti-Semitic groups—people trying to raise awareness of anti-Semitic speech. Our reviewers would quite commonly remove the speech: they would see it and it would look like grossly violating anti-Semitic speech. Only later would they realise that the person was sharing it for awareness. The N-word is a gross term of racial abuse, but if you are an online platform you permit it a lot of the time, because if people use it self-referentially they expect to be able to use it. If you start removing it they would naturally get very upset. People expect to use it if it is in song lyrics and they are sharing music. I could give thousands of examples of speech that may or may not be illegal depending entirely on the context in which it is being used.
We will be asking platforms to make those judgments on our behalf. They will have to take it seriously, because if they let something through that is illegal they will be in serious trouble. If they misjudged it and thought the anti-Semitic hate speech was being circulated by Jewish groups to promote awareness but it turned out it was being circulated by a Nazi group to attack people and that fell foul of UK law, they would be in trouble. These judgments are critical.
We have the test in Clause 173, which says that platforms should decide whether they have “reasonable grounds to infer” that something is illegal. In Committee, we debated changing that to a higher bar, and said that we wanted a stronger evidential basis. That did not find favour with the Government. We hoped they might raise the bar themselves unilaterally, but they have not. However, we come back again in a different way to try to be helpful, because I do not think that the Government want excessive censorship. They have said throughout the Bill’s passage that they are not looking for platforms to be overly censorious. We looked at the wording again and thought about how we could ensure that the bar is not operated in a way that I do not think that the Government intend. We certainly would not want that to happen.
We look at the current wording in Clause 173 and see that the test there has two elements. One is: “Do you have reasonable grounds to infer?” and then a clause in brackets after that says, “If you do have reasonable grounds to infer, you must treat the content as illegal”. In this amendment we seek to remove the second part of that phrasing because it seems problematic. If we say to the platform, “Reasonable grounds to infer, not certainty”—and it is weird to put “inference”, which is by definition mushy, with “must”, which is very certain, into the same clause—we are saying, “If you have this mushy inference, you must treat it as illegal”, which seems quite problematic. Certainly, if I were working at a platform, the way I would interpret that is: “If in doubt, take it out”. That is the only way you can interpret that “must”, and that is really problematic. Again, I know that that is not the Government’s intention, and if it were child sexual exploitation material, of course you “must”. However, if it is the kind of abusive content that you have reasonable grounds to infer may be an offence under the Public Order Act, “must” you always treat that as illegal? As I read the rest of the Bill, if you are treating it as illegal, the sense is that you should remove it.
That is what we are trying to get at. There is a clear understanding from the Government that their intention is “must” when it comes to that hard end of very bad, very clearly bad content. However, we need something else—a different kind of behaviour where we are dealing with content where it is much more marginal. Otherwise, the price we will pay will be in freedom of expression.
People in the United Kingdom publish quite robust, sweary language. I sometimes think that some of the rules we apply penalise the vernacular. People who use sweary, robust language may be doing so entirely legally—the United Kingdom does not generally restrict people from using that kind of language. However, we risk heading towards a scenario where people post such content in future, and they will find that the platform takes it down. They will complain to the platform, saying, “Why the hell did you take my content down?”—in fact, they will probably use stronger words than that to register their complaint. When they do, the platform will say, “We had reasonable grounds to infer that that was in breach of the Public Order Act, for example, because somebody might feel alarmed, harassed or distressed by it. Oh, and look—in this clause, it says we ‘must’ treat it as illegal. Sorry—there is nothing else we can do. We would have loved to have been able to exercise the benefit of the doubt and to allow you to carry on using that kind of language, because we think there is some margin where you have not behaved in an illegal way. But unfortunately, because of the way that Clause 173 has been drafted, our lawyers tell us we cannot afford to take the risk”.
In the amendment we are trying to—I think—help the Government to get out of a situation which, as I say, I do not think they want. However, I fear that the totality of the wording of Clause 173, this low bar for the test and the “must treat as” language, will lead to that outcome where platforms will take the attitude: “Safety first; if in doubt, take it out”, and I do not think that that is the regime we want. I beg to move.
My Lords, I regret I was unable to be present in Committee to deliver my speech about the chilling effect that the present definition of illegality in the Bill will have on free speech on the internet.
I am still concerned about Clause 173, which directs platforms how to come to the judgment on what is illegal. My concern is that the criterion for illegality, “reasonable grounds to infer” that elements of the content are illegal, will encourage the tech companies to take down content which is not necessarily illegal but which they infer could be. Indeed, the noble Lord, Lord Allan, gave us a whole list of examples of where that might happen. Unfortunately, in Committee there was little support for a higher bar when asking the platforms to judge what illegal content is. However, I have added my name to Amendment 228, put forward by the noble Lord, Lord Allan, because, as he has just said, it is a much less radical way of enhancing free speech when platforms are not certain whether to take down content which they infer is illegal.
The deletion of part of Clause 173(5) is a moderate proposal. It still leaves intact the definition for the platforms of how they are to make the judgment on the illegality of content, but it takes out the compulsory element of that judgment. I believe it will have its biggest impact on moderation systems. Some of those systems are run by machines, but many moderation processes, such as those at Meta’s Facebook, involve thousands of human beings. Deleting the second part of Clause 173(5), which demands that they take down content they infer is illegal, will give them more leeway to err on the side of freedom of speech. I hope that this extra leeway in favour of free speech will also be reflected in the way that algorithms moderate our content.
My Lords, I remain concerned that people who use choice words of Anglo-Saxon origin will find their speech more restricted than those who use Latinate words such as “inference” and “reasonable”, but the Minister has given some important clarifications.
The first is that no single decision could get a platform into trouble; a platform will know that what matters is a pattern of bad decision-making rather than any single decision, which will help to take a little of the pressure off. The Minister also gave an important clarification around—I hate this language, but we have to use it—priority versus primary priority offences. If everything is a priority, nothing is a priority; but in this Bill, some things are more of a priority than others. The public order offences are priority offences, so platforms have a little more leeway over those offences than they do over primary priority offences, which include the really bad stuff that we all agree we want to get rid of.
As I say, I do not think we will get much further in our debates today, although those were important clarifications. The Minister is trying to give us reasonable grounds to infer that the guidance from Ofcom will result in a gentle landing rather than a cliff edge, which, as the noble Baroness, Lady Kidron, rightly suggested, is what we want. With that, I beg leave to withdraw the amendment.
My Lords, I pay tribute to the noble Baroness, Lady Harding, for her role in bringing this issue forward. I too welcome the government amendments. It is important to underline that adding the potential role of app stores to the Bill is neither an opportunity for other companies to fail to comply and wait for the gatekeepers to do the job nor a one-stop shop in itself. It is worth reminding ourselves that digital journeys rarely start and finish in one place. In spite of the incredible war for our attention, in which products and services attempt to keep us rapt on a single platform, it is quite important for everyone in the ecosystem to play their part.
I have two minor points. First, I was not entirely sure why the government amendment requires the Secretary of State, rather than Ofcom, to consult. Can the Minister reassure me that, whoever undertakes the consultation, it will include children and children’s organisations as well as tech companies? Secondly, like the noble Baroness, Lady Harding, I was a little surprised that the amendment does not define an app store but relies on “the ordinary meaning of” the term, which seems to leave room for change. If there is a good reason for that—I am sure there is—it must be made clear that app stores cannot suddenly rebrand as something else, and that the gatekeeper function will be kept absolutely front and centre.
Notwithstanding those comments, and associating myself with the idea that nothing should wait until 2025-26, I am very grateful to the Government for bringing this forward.
My Lords, I will make a brief contribution because I was the misery guts when this was proposed first time round. I congratulate the noble Baroness, Lady Harding, not just on working with colleagues to come up with a really good solution but on seeking me out. If I heard someone being as miserable as I was, I might try to avoid them. She did not; she came and asked me, “Why are you miserable? What is the problem here?”, and took steps to address it. Through her work with the Government, their amendments address my main concerns.
My first concern, as we discussed in Committee, was that we would be asking large companies to regulate their competitors, because the app stores are run by large tech companies. She certainly understood that concern. The second was that I felt we had not yet clearly defined the problem. There are lots of problems, and before you can come up with a solution you need a real consensus on which one you are trying to address. The government amendment will very much help by saying, “Let’s get really crunchy about the actual problem that we need app stores to address”.
Finally, I am a glass-half-full kind of guy as well as a misery guts—there is a contradiction there—and so I genuinely think that these large tech businesses will start to change their behaviour and address some of the concerns, such as getting age ratings correct, simply by virtue of our having this regulatory framework in place. Even if the app stores are technically outside the regime today, the fact that the sector is inside and that this amendment puts them on notice will, I think and hope, have a hugely positive effect, and we will get the benefits much more quickly than the timescale envisaged in the Bill. That feels like a true backstop. I sincerely hope that the people in those companies, who I am sure will be glued to our debate, will be thinking that they need to get their act together much more quickly. It is better for them to do it themselves than to wait for someone to do it to them.
My Lords, I add my congratulations to the noble Baroness, Lady Harding, on her tenacity, and to the Minister on his flexibility. I believe that where we have reached is pretty much the right balance. There are the questions that the noble Baroness, Lady Harding, and others have asked of the Minister, and I hope he will answer those, but this is a game-changer, quite frankly. The noble Baroness has rightly paid tribute to the companies which have put their heads above the parapet. That was not easy for them to do when you consider that those are the platforms on which they depend for their services to reach the public.
Unlike with the research report, there are reserved powers that the Secretary of State can use if the report is positive, which I hope it will be. I believe this could be a turning point. The digital markets and consumers Bill is coming down the track this autumn, and it will give greater powers to make sure that the app stores can be tackled—after all, there are only two of them and they are an oligopoly. They are the essence of big tech, and they need to function in a much more competitive way.
The noble Baroness talked about timing, and it needs to be digital timing, not analogue. Four years does seem a heck of a long time. I hope the Minister will address that.
Then there is the really important aspect of harmful content. In the last group, the Minister reassured us about systems and processes and the illegality threshold. Throughout, he has tried to reassure us that this is all about systems and processes and not so much about content. However, every time we look, we see that content is there almost by default unless the subject is raised. We do not yet have a Bill that is actually fit for purpose in that sense. I hope the Minister will use his summer break wisely, read through the Bill to make sure that it meets its purpose, and then come back at Third Reading with a whole bunch of amendments that bring functionalities within its scope. How about that for a suggestion? It is offered in the spirit of good will and summer friendship.
The noble Baroness raised a point about transparency when it comes to Ofcom publishing its review. I hope the Minister can give that assurance as well.
The noble Baroness, Lady Kidron, asked about the definition of an app store. It is the gatekeeper function that matters, and we need to be sure that that is what we are talking about.
I end by congratulating once again the noble Baroness and the Minister on where we have got to so far.