Lords Chamber
My Lords, I share noble Lords’ determination to deliver the strongest protections for children and to develop a robust and future-proofed regulatory regime. However, it will not be possible to solve every problem on the internet through this Bill, nor through any piece of legislation, flagship or otherwise. The Bill has been designed to confer duties on the services that pose the greatest risk of harm—user-to-user services and search services—and where there are proportionate measures that companies can take to protect their users.
As the noble Baroness, Lady Kidron, and others anticipated, I must say that these services act as a gateway for users to discover and access other online content through search results and links shared on social media. Conferring duties on these services will therefore significantly reduce the risk of users going on to access illegal or harmful content on non-regulated services, while keeping the scope of the Bill manageable and enforceable.
As noble Lords anticipated, there is also a practical consideration for Ofcom in all this. I know that many noble Lords are extremely keen to see this Bill implemented as swiftly as possible; so am I. However, as the noble Lord, Lord Allan, rightly pointed out, making major changes to the Bill’s scope at this stage would have significant implications for Ofcom’s implementation timelines. I say this at the outset because I want to make sure that noble Lords are aware of those implications as we look at these issues.
I turn first to Amendments 2, 3, 5, 92 and 193, tabled by the noble Baroness, Lady Kidron. These aim to expand the number of services covered by the Bill to incorporate a broader range of services accessed by children and a broader range of harms. I will cover the broader range of harms more fully in a separate debate when we come to Amendment 93, but I am very grateful to the noble Baroness for her constructive and detailed discussions on these issues over the past few weeks and months.
These amendments would bring new services into scope of the duties beyond user-to-user and search services. This could include services which enable or promote commercial harms, including consumer businesses such as online retailers. As I have just mentioned in relation to the previous amendments, bringing many more services into scope would delay the implementation of Ofcom’s priorities and risk detracting from its work overseeing existing regulated services where the greatest risk of harm exists—we are talking here about the services run by about 2.5 million businesses in the UK alone. I hope noble Lords will appreciate from the recent communications from Ofcom how challenging the implementation timelines already are, without adding further complication.
Amendment 92 seeks to change the child-user condition in the children’s access assessment to the test in the age-appropriate design code. The test in the Bill is already aligned with the test in that code, which determines whether a service is likely to be accessed by children, in order to ensure consistency for providers. The current child-user condition determines that a service is likely to be accessed by children where it has a significant number or proportion of child users, or where it is of a kind likely to attract a significant number or proportion of child users. This will already bring into scope services of the kind set out in this amendment, such as those which are designed or intended for use by children, or where children form a—
I am sorry to interrupt. Will the Minister take the opportunity to say what “significant” means, because that is not aligned with the ICO code, which has different criteria?
If I can finish my point, this will bring into scope services of the kind set out in the amendments, such as those designed or intended for use by children, or where children form a substantial and identifiable user group. The current condition also considers the nature and content of the service and whether it has a particular appeal for children. Ofcom will be required to consult the Information Commissioner’s Office on its guidance to providers on fulfilling this test, which will further support alignment between the Bill and the age-appropriate design code.
On the meaning of “significant”, a significant number of children means a significant number in itself or a significant proportion of the total number of UK-based users on the service. In the Bill, “significant” has its ordinary meaning, and there are many precedents for it in legislation. Ofcom will be required to produce and publish guidance for providers on how to make the children’s access assessment. Crucially, the test in the Bill provides more legal certainty and clarity for providers than the test outlined in the code. “Substantial” and “identifiable”, as suggested in this amendment, do not have such a clear legal meaning, so this amendment would give rise to the risk that the condition is more open to challenge from providers and more difficult to enforce. On the other hand, as I said, “significant” has an established precedent in legislation, making it easier for Ofcom, providers and the courts to interpret.
The noble Lord, Lord Knight, talked about the importance of future-proofing the Bill and emerging technologies. As he knows, the Bill has been designed to be technology neutral and future-proofed, to ensure that it keeps pace with emerging technologies. It will apply to companies which enable users to share content online or to interact with each other, as well as to search services. Search services using AI-powered features will be in scope of the search duties. The Bill is also clear that content generated by AI bots is in scope where it interacts with user-generated content, such as bots on Twitter. The metaverse is also in scope of the Bill. Any service which enables users to interact as the metaverse does will have to conduct a child access test and comply with the child safety duties if it is likely to be accessed by children.
I know it has been said that the large language models, such as that used by ChatGPT, will be in scope when they are embedded in search, but are they in scope generally?
They are when they apply to companies enabling users to share content online and interact with each other or in terms of search. They apply in the context of the other duties set out in the Bill.
Amendments 19, 22, 298 and 299, tabled by my noble friend Lady Harding of Winscombe, seek to impose child safety duties on application stores. I am grateful to my noble friend and others for the collaborative approach that they have shown and for the time that they have dedicated to discussing this issue since Second Reading. I appreciate that she has tabled these amendments in the spirit of facilitating a conversation, which I am willing to continue to have as the Bill progresses.
As my noble friend knows from our discussions, there are challenges with bringing application stores—or “app stores” as they are popularly called—into the scope of the Bill. Introducing new duties on such stores at this stage risks slowing the implementation of the existing child safety duties, in the way that I have just outlined. App stores operate differently from user-to-user and search services; they pose different levels of risk and play a different role in users’ experiences online. Ofcom would therefore need to recruit different people, or bring in new expertise, to supervise effectively a substantially different regime. That would take time and resources away from its existing priorities.
We do not think that that would be a worthwhile new route for Ofcom, given that placing child safety duties on app stores is unlikely to deliver any additional protections for children using services that are already in the scope of the Bill. Those services must already comply with their duties to keep children safe or will face enforcement action if they do not. If companies do not comply, Ofcom can rely on its existing enforcement powers to require app stores to remove applications that are harmful to children. I am happy to continue to discuss this matter with my noble friend and the noble Lord, Lord Knight, in the context of the differing implementation timelines, as he has asked.
The Minister just said something that was material to this debate. He said that Ofcom has existing powers to prevent app stores from providing material that would have caused problems for the services to which they allow access. Can he confirm that?
Perhaps the noble Lord could clarify his question; I was too busy finishing my answer to the noble Lord, Lord Knight.
It is a continuation of the point raised by the noble Baroness, Lady Harding, and it seems that it will go part of the way towards resolving the differences that remain between the Minister and the noble Baroness, which I hope can be bridged. Let me put it this way: is it the case that Ofcom either now has powers or will have powers, as a result of the Bill, to require app stores to stop supplying children with material that is deemed in breach of the law? That may be the basis for understanding how you can get through this. Is that right?
Services already have to comply with their duties to keep children safe. If they do not comply, Ofcom has powers of enforcement set out, which require app stores to remove applications that are harmful to children. We think this already addresses the point, but I am happy to continue discussing it offline with the noble Lord, my noble friend and others who want to explore how. As I say, we think this is already covered. A more general duty here would risk distracting from Ofcom’s existing priorities.
My Lords, on that point, my reading of Clauses 131 to 135, where the Bill sets out the business disruption measures, is that they could be used precisely in that way. It would be helpful for the Minister responding later to clarify that Ofcom would use those business disruption measures, as the Government explicitly anticipate, were an app store, in a rogue way, to continue to list a service that Ofcom has said should not be made available to people in the United Kingdom.
I will be very happy to set that out in more detail.
Amendments 33A and 217A in the name of the noble Lord, Lord Storey, would place a new duty on user-to-user services that predominantly enable online gaming. Specifically, they would require them to have a classification certificate stating the age group for which they are suitable. We do not think that is necessary, given that there is already widespread, voluntary uptake of approval classification systems in online gaming.
My Lords, it has certainly been an interesting debate, and I am grateful to noble Lords on all sides of the Committee for their contributions and considerations. I particularly thank the noble Lords who tabled the amendments which have shaped the debate today.
In general, on these Benches, we believe that the Bill offers a proportionate approach to tackling online harms. We feel that granting some of the exemptions proposed in this group would be unintentionally counterproductive and would raise some unforeseen difficulties. The key here—and it has been raised by a number of noble Lords, including the noble Baronesses, Lady Harding and Lady Kidron, and, just now, the noble Lord, Lord Clement-Jones, who talked about the wider considerations of the Joint Committee and factors that should be taken into account—is that we endorse a risk-based approach. In this debate, it is very important that we take ourselves back to that, because that is the key.
My view is that using other factors, such as funding sources or volunteer engagement in moderation, cuts right across this risk-based approach. To refer to Amendment 4, it is absolutely the case that platforms with fewer than 1 million UK monthly users have scope to create considerable harm. Indeed, noble Lords will have seen that later amendments call for certain small platforms to be categorised on the basis of the risk—and that is the important word—that they engender, rather than the size of the platform, which, unfortunately, is something of a crude measure. The point that I want to make to the noble Baroness, Lady Fox, is that it is not about the size of the businesses and how they are categorised but what they actually do. The noble Baroness, Lady Kidron, rightly said that small is not safe, for all the reasons that were explained, including by the noble Baroness, Lady Harding.
Amendment 9 would exempt small and medium-sized enterprises and certain other organisations from most of the Bill’s provisions. I am in no doubt about the well-meaning nature of this amendment, tabled by the noble Lord, Lord Moylan, and supported by the noble Lord, Lord Vaizey. Indeed, there may well be an issue about how start-ups and entrepreneur unicorns cope with the regulatory framework. We should attend to that, and I am sure that the Minister will have something to say about it. But I also expect that the Minister will outline why this would actually be unhelpful in combating many of the issues that this Bill is fundamentally designed to deal with if we were to go down the road of these exclusions.
In particular, granting exemptions simply on the basis of a service’s size could lead to a situation where user numbers are capped or perhaps even where platforms are deliberately broken up to avoid regulation. This would have an effect that none of us in this Chamber would want to see because it would embed harmful content and behaviour rather than helping to reduce them.
Referring back to the comments of the noble Lord, Lord Moylan, I agree with the noble Lord, Lord Vaizey, in his reflection. I, too, have not experienced the two sides of the Chamber that the noble Lord, Lord Moylan, described. I feel that the Chamber has always been united on the matter of child safety and in understanding the ramifications for business. It is the case that good legislation must always seek a balance, but, to go back to the point about excluding small and medium-sized enterprises, to call them a major part of the British economy is a bit of an understatement when they account for 99.9% of the business population. In respect of the exclusion of community-based services, including Wikipedia—and we will return to this in the next group—there is nothing for platforms to fear if they have appropriate systems in place. Indeed, there are many gains to be had for community-based services such as Wikipedia from being inside the system. I look forward to the further debate that we will have on that.
I turn to Amendment 9A in the name of my noble friend Lord Knight of Weymouth, who is unable to participate in this section of the debate. It probes how the Bill’s measures would apply to specialised search services. Metasearch engines such as Skyscanner have expressed concern that the legislation might impose unnecessary burdens on services that pose little risk of hosting the illegal content targeted by the Bill. Perhaps the Minister, in his response, could confirm whether or not such search engines are in scope. That would perhaps be helpful to our deliberations today.
While we on these Benches are not generally supportive of exemptions, the reality is that there are a number of online search services that return content that would not ordinarily be considered harmful. Sites such as Skyscanner and Expedia, as we all know, allow people to search for and book flights and other travel services such as car hire. Obviously, as long as appropriate due diligence is carried out on partners and travel agents, the scope for users to encounter illegal or harmful material appears to be minimal and returns us to the point of having a risk-based approach. We are not necessarily advocating for a carve-out from the Bill, but it would perhaps be helpful to our deliberations if the Minister could outline how such platforms will be expected to interact with the Ofcom-run online safety regime.
My Lords, I am sympathetic to arguments that we must avoid imposing disproportionate burdens on regulated services, but I cannot accept the amendments tabled by the noble Baroness, Lady Fox, and others. Doing so would greatly reduce the strong protections that the Bill offers to internet users, particularly to children. I agree with the noble Baroness, Lady Merron, that that has long been the shared focus across your Lordships’ House as we seek to strike the right balance through the Bill. I hope to reassure noble Lords about the justification for the existing balance and scope, and the safeguards built in to prevent undue burdens to business.
I will start with the amendments tabled by the noble Baroness, Lady Fox of Buckley—Amendments 4, 6 to 8, 12, 288 and 305—which would significantly narrow the definition of services in scope of regulation. The current scope of the Bill reflects evidence of where harm is manifested online. There is clear evidence that smaller services can pose a significant risk of harm from illegal content, as well as to children, as the noble Baroness, Lady Kidron, rightly echoed. Moreover, harmful content and activity often range across a number of services. While illegal content or activity may originate on larger platforms, offenders often seek to move to smaller platforms with less effective systems for tackling criminal activity in order to circumvent those protections. Exempting smaller services from regulation would likely accelerate that process, resulting in illegal content being displaced on to smaller services, putting users at risk.
These amendments would create significant new loopholes in regulation. Rather than relying on platforms and search services to identify and manage risk proactively, they would require Ofcom to monitor smaller harmful services, which would further annoy my noble friend Lord Moylan. Let me reassure the noble Baroness, however, that the Bill has been designed to avoid disproportionate or unnecessary burdens on smaller services. All duties on services are proportionate to the risk of harm and the capacity of companies. This means that small, low-risk services will have minimal duties imposed on them. Ofcom’s guidance and codes of practice will set out how they can comply with their duties, in a way that I hope is even clearer than the Explanatory Notes to the Bill, but certainly allowing for companies to have a conversation and ask for areas of clarification, if that is still needed. They will ensure that low-risk services do not have to undertake unnecessary measures if they do not pose a risk of harm to their users.
My Lords, while my noble friend is talking about the possibility of excessive and disproportionate burden on businesses, can I just ask him about the possibility of excessive and disproportionate burden on the regulator? He seems to be saying that Ofcom is going to have to maintain, and keep up to date regularly, 25,000 risk assessments—this is on the Government’s own assessment, produced 15 months ago, of the state of the market then—even if those assessments carried out by Ofcom result in very little consequence for the regulated entity.
We know from regulation in this country that regulators already cannot cope with the burdens placed on them. They become inefficient, sclerotic and unresponsive; they have difficulty in recruiting staff of the same level and skills as the entities that they regulate. We have a Financial Services and Markets Bill going through at the moment, and the FCA is a very good example of that. Do we really think that this is a sensible burden to place on a regulator that is actually able to discharge it?
The Bill creates a substantial new role for Ofcom, but it has already substantially recruited and prepared for the effective carrying out of that new duty. I do not know whether my noble friend was in some of the briefings with officials from Ofcom, but it is very happy to set out directly the ways in which it is already discharging, or preparing to discharge, those duties. The Government have provided it with further resource to enable it to do so. It may be helpful for my noble friend to have some of those discussions directly with the regulator, but we are confident that it is ready to discharge its duties, as set out in the Bill.
I was about to say that we have already had a bit of discussion on Wikipedia. I am conscious that we are going to touch on it again in the debate on the next group of amendments so, at the risk of being marked down for repetition, which is a black mark on that platform, I shall not pre-empt what I will say shortly. But I emphasise that the Bill does not impose prescriptive, one-size-fits-all duties on services. The codes of practice from Ofcom will set out a range of measures that are appropriate for different types of services in scope. Companies can follow their own routes to compliance, so long as they are confident that they are effectively managing risks associated with legal content and, where relevant, harm to children. That will ensure that services that already use community moderation effectively can continue to do so—such as Wikipedia, which successfully uses that to moderate content. As I say, we will touch on that more in the debate on the next group.
Amendment 9, in the name of my noble friend Lord Moylan, is designed to exempt small and medium sized-enterprises working to benefit the public from the scope of the Bill. Again, I am sympathetic to the objective of ensuring that the Bill does not impose undue burdens on small businesses, and particularly that it should not inhibit services from providing valuable content of public benefit, but I do not think it would be feasible to exempt service providers deemed to be
“working to benefit the public”.
I appreciate that this is a probing amendment, but the wording that my noble friend has alighted on highlights the difficulties of finding something suitably precise and not contestable. It would be challenging to identify which services should qualify for such an exemption.
Taking small services out of scope would significantly undermine the framework established by the Bill, as we know that many smaller services host illegal content and pose a threat to children. Again, let me reassure noble Lords that the Bill has been designed to avoid disproportionate or unnecessary regulatory burdens on small and low-risk services. It will not impose a disproportionate burden on services or impede users’ access to valuable content on smaller services.
Amendment 9A in the name of the noble Lord, Lord Knight of Weymouth, is designed to exempt “sector specific search services” from the scope of the Bill, as the noble Baroness, Lady Merron, explained. Again, I am sympathetic to the intention here of ensuring that the Bill does not impose a disproportionate burden on services, but this is another amendment that is not needed as it would exempt search services that may pose a significant risk of harm to children, or because of illegal content on them. The amendment aims to exempt specialised search services—that is, those that allow users to
“search for … products or services … in a particular sector”.
It would exempt specialised search services that could cause harm to children or host illegal content—for example, pornographic search services or commercial search services that could facilitate online fraud. I know the noble Lord would not want to see that.
The regulatory duties apply only where there is a significant risk of harm and the scope has been designed to exclude low-risk search services. The duties therefore do not apply to search engines that search a single database or website, for example those of many retailers or other commercial websites. Even where a search service is in scope, the duties on services are proportionate to the risk of harm that they pose to users, as well as to a company’s size and capacity. Low-risk services, for example, will have minimal duties. Ofcom will ensure that these services can quickly and easily comply by publishing risk profiles for low-risk services, enabling them easily to understand their risk levels and, if necessary, take steps to mitigate them.
The noble Lord, Lord McCrea, asked some questions about the 200 most popular pornographic websites. If I may, I will respond to the questions he posed, along with others that I am sure will come in the debate on the fifth group, when we debate the amendments in the names of the noble Lord, Lord Morrow, and the noble Baroness, Lady Ritchie of Downpatrick, because that will take us on to the same territory.
I hope that provides some assurance to my noble friend Lord Moylan, the noble Baroness, Lady Fox, and others, and that they will be willing not to press their amendments in this group.
My Lords, I thank people for such a wide-ranging and interesting set of contributions. I take comfort from the fact that so many people understood what the amendments were trying to do, even if they did not fully succeed in that. I thought it was quite interesting that in the first debate the noble Lord, Lord Allan of Hallam, said that he might be a bit isolated on the apps, but I actually agreed with him—which might not do his reputation any good. However, when he said that, I thought, “Welcome to my world”, so I am quite pleased that this has not all been shot down in flames before we started. My amendment really was a serious attempt to tackle something that is a real problem.
The Minister says that the Bill is designed to avoid disproportionate burdens on services. All I can say is, “Sack the designer”. It is absolutely going to have a disproportionate burden on a wide range of small services, which will not be able to cope, and that is why so many of them are worried about it. Some 80% of the companies that will be caught up in this red tape are small and micro-businesses. I will come to the small business point in a moment.
The noble Baroness, Lady Harding, warned us that small tech businesses become big tech businesses. As far as I am concerned, that is a success story—it is what I want; is it not what we all want? Personally, I think economic development and growth is a positive thing—I do not want them to fail. However, I do not think it will ever happen; I do not think that small tech businesses will ever grow into big tech businesses if they face a disproportionate burden in the regulatory sense, as I have tried to describe. That is what I am worried about, and it is not a positive thing to be celebrated.
I stress that it is not small tech and big tech. There are also community sites, based on collective moderation. Wikipedia has had a lot of discussion here. For a Bill that stresses that it wants to empower users, we should think about what it means when these user-moderated community sites are telling us that they will not be able to carry on and get through. That is what they are saying. It was interesting that the noble Lord, Lord Clement-Jones, said that he relies on Wikipedia—many of us do, although please do not believe what it says about me. There are all of these things, but then there was a feeling that, well, Reddit is a bit dodgy. The Bill is not meant to be deciding which ones to trust in quite that way, or people’s tastes.
I was struck that the noble Baroness, Lady Kidron, said that small is not safe, and used the incel example. I am not emphasising that small is safe; I am saying that the small entities will not survive this process. That is my fear. I do not mean that the big ones are nasty and dangerous and the small ones are cosy, lovely and Wikipedia-like. I am suggesting that smaller entities will not be able to survive the regulatory onslaught. That is the main reason I raised this.
The noble Baroness, Lady Merron, said that these entities can cause great harm. I am worried about a culture of fear, in which we demonise tens of thousands of innocent tech businesses and communities and end up destroying them when we do not intend to. I tried to put in the amendment an ability for Ofcom, if there are problematic sites that are risky, to deal with them. As the Minister kept saying, low-risk search engines have been exempted. I am suggesting that low-risk small and micro-businesses are exempted, which is the majority of them. That is what I am suggesting, rather than that we assume they are all guilty and then they have to get exempted.
Interestingly, the noble Lord, Lord McCrea, asked how many pornography sites are in scope and which pornographic websites have a million or fewer users. I am glad I do not know the answer to that, otherwise people might wonder why I did. The point is that there are always going to be sites that are threatening or a risk to children, as we are discussing. But we must always bear in mind—this was the important point that the noble Lord, Lord Moylan, made—that in our absolute determination to protect children via this Bill we do not unintentionally damage society as a whole. Adult access to free speech, for example, is one of my concerns, as are businesses and so on. We should not have that as an outcome.
Like others, I had prepared quite extensive notes to respond to what I thought the noble Lord was going to say about his amendments in this group, and I have not been able to find anything left that I can use, so I am going to have to extemporise slightly. I think it is very helpful to have a little non-focused discussion about what we are about to talk about in terms of age, because there is a snare and a delusion in quite a lot of it. I was put in mind of that in the discussions on the Digital Economy Act, which of course precedes the Minister but is certainly still alive in our thinking: in fact, we were talking about it earlier today.
The problem I see is that we have to find a way of squaring two quite different approaches. One is to prevent those who should not be able to see material from seeing it, because it is illegal for them to see it. The other is to find a way of ensuring that we do not end up with an age-gated internet, which I am grateful to find that we are all, I think, agreed about: that is very good to know.
Age is very tricky, as we have heard, and it is not the only consideration we have to bear in mind in wondering whether people should be able to gain access to areas of the internet which we know will be bad and difficult for them. That leads us, of course, to the question about legal but harmful, now resolved—or is it? We are going to have this debate about age assurance and what it is. What is age verification? How do they differ? How does it matter? Is 18 a fixed and final point at which we are going to say that childhood ends and adulthood begins, and therefore one is open for everything? It is exactly the point made earlier about how to care for those who should not be exposed to material which, although legal for them by a number called age, is not appropriate for them in any of the circumstances which, clinically, we might want to bring to bear.
I do not think we are going to resolve these issues today—I hope not. We are going to talk about them for ever, but at this stage I think we still need a bit of thinking outside a box which says that age is the answer to a lot of the problems we have. I do not think it is, but whether the Bill is going to carry that forward I have my doubts. How we get that to the next stage, I do not know, but I am looking forward to hearing the Minister’s comments on it.
My Lords, I agree that this has been a rather unfortunate grouping and has led to a slightly strange debate. I apologise if it is the result of advice given to my noble friend. I know there has been some degrouping as well, which has led to slightly odd combinations today. However, as promised, I shall say a bit more about Wikipedia in relation to my noble friend’s Amendments 10 and 11.
The effect of these amendments would be that moderation actions carried out by users—in other words, community moderation of user-to-user and search services —would not be in scope of the Bill. The Government support the use of effective user or community moderation by services where this is appropriate for the service in question. As I said on the previous group, as demonstrated by services such as Wikipedia, this can be a valuable and effective means of moderating content and sharing information. That is why the Bill does not impose a one-size-fits-all requirement on services, but instead allows services to adopt their own approaches to compliance, so long as these are effective. The noble Lord, Lord Allan of Hallam, dwelt on this. I should be clear that duties will not be imposed on individual community moderators; the duties are on platforms to tackle illegal content and protect children. Platforms can achieve this through, among other things, centralised or community moderation. Ultimately, however, it is they who are responsible for ensuring compliance and it is platforms, not community moderators, who will face enforcement action if they fail to do so.
My Lords, this group of government amendments relates to risk assessments; it may be helpful if I speak to them now as the final group before the dinner break.
Risk management is at the heart of the Bill’s regulatory framework. Ofcom and services’ risk assessments will form the foundation for protecting users from illegal content and content which is harmful to children. They will ensure that providers thoroughly identify the risks on their own websites, enabling them to manage and mitigate the potential harms arising from them. Ofcom will set out the risks across the sector and issue guidance to companies on how to conduct their assessments effectively. All providers will be required to carry out risk assessments, keep them up to date and update them before making a significant change to the design or operation of their service which could put their users at risk. Providers will then need to put in place measures to manage and mitigate the risks they identify in their risk assessments, including any emerging risks.
Given how crucial the risk assessments are to this framework, it is essential that we enable them to be properly scrutinised by the public. The government amendments in this group will place new duties on providers of the largest services—that is, category 1 and 2A services—to publish summaries of their illegal and child safety risk assessments. Through these amendments, providers of these services will also have a new duty to send full records of their risk assessments to Ofcom. This will increase transparency about the risk of harm on the largest platforms, clearly showing how risk is affected by factors such as the design, user base or functionality of their services. These amendments will further ensure that the risk assessments can be properly assessed by internet users, including by children and their parents and guardians, by ensuring that summaries of the assessments are publicly available. This will empower users to make informed decisions when choosing whether and how to use these services.
It is also important that Ofcom is fully appraised of the risks identified by service providers. That is why these amendments introduce duties for both category 1 and 2A services to send their records of these risk assessments, in full, to Ofcom. This will make it easier for Ofcom to supervise compliance with the risk assessment duties, as well as other duties linked to the findings of the risk assessments, rather than having to request the assessments from companies under its information-gathering powers.
These amendments also clarify that companies must keep a record of all aspects of their risk assessments, which strengthens the existing record-keeping duties on services. I hope that noble Lords will welcome these amendments. I beg to move.
My Lords, it is risky to stand between people and their dinner, but I rise very briefly to welcome these amendments. We should celebrate the good stuff that happens in Committee as well as the challenging stuff. The risk assessments are, I think, the single most positive part of this legislation. Online platforms already do a lot of work trying to understand what risks are taking place on their platforms, which never sees the light of day except when it is leaked by a whistleblower and we then have a very imperfect debate around it.
The fact that platforms will have to do a formal risk assessment and share it with a third-party regulator is huge progress; it will create a very positive dynamic. The fact that the public will be able to see those risk assessments and make their own judgments about which services to use—according to how well they have done them—is, again, a massive public benefit. We should welcome the fact that risk assessments are there and the improvements that this group of amendments makes to them. I hope that was short enough.
My Lords, I am grateful to the Minister for introducing this group, and we certainly welcome this tranche of government amendments. We know that there are more to come both in Committee and as we proceed to Report, and we look forward to seeing them.
The amendments in this group, as other noble Lords have said, amount to a very sensible series of changes to services’ risk-assessment duties. This perhaps begs the question of why they were not included in earlier drafts of the Bill, but we are glad to see them now.
There is, of course, the issue of precisely where some of the information will appear, as well as the wider status of terms of service. I am sure those issues will be discussed in later debates. It is certainly welcome that the department is introducing stronger requirements around the information that must be made available to users; it will all help to make this a stronger and more practical Bill.
We all know that users need to be able to make informed decisions, and it will not be possible if they are required to view multiple statements and various documents. It seems that the requirements for information to be provided to Ofcom go to the very heart of the Bill, and I suggest that the proposed system will work best if there is trust and transparency between the regulator and those who are regulated. I am sure that there will be further debate on the scope of risk assessments, particularly on issues that were dropped from previous iterations of the Bill, and certainly this is a reasonable starting point today.
I will try to be as swift as possible as I raise a few key issues. One is about avoiding warnings that are at such a high level of generality that they get put on to everything. Perhaps the Minister could indicate how Ofcom will ensure that the summaries are useful and accessible to the reader. The test, of course, should be that a summary is suitable and sufficient for a prospective user to form an assessment of the likely risk they would encounter when using the service, taking into account any special vulnerabilities that they might have. That needs to be the test; perhaps the Minister could confirm that.
Is the terms of service section the correct place to put a summary of the illegal content risk assessment? Research suggests, unsurprisingly, that only 3% of people read terms before signing up—although I recall that, in an earlier debate, the Minister confessed that he had read all the terms and conditions of his mobile phone contract, so he may be one of the 3%. It is without doubt that any individual should be supported in their ability to make choices, and the duty should perhaps instead be to display a summary of the risks with due prominence, to ensure that anyone who is considering signing up to a service is really able to read it.
I also ask the Minister to confirm that, despite the changes to Clause 19 in Amendment 16B, the duty to keep records of risk assessments will continue to apply to all companies, but with an enhanced responsibility for category 1 companies.
I am grateful to noble Lords for their questions on this, and particularly grateful to the noble Lord, Lord Allan, and the noble Baroness, Lady Kidron, for their chorus of welcome. Where we are able to make changes, we will of course bring them forward, and I am glad to be able to bring forward this tranche now.
As the noble Lord, Lord Allan, said, ensuring the transparency of services’ risk assessments will further ensure that the framework of the Bill delivers its core objectives relating to effective risk management and increased accountability regarding regulated services. As we have discussed, it is imperative that these providers take a thorough approach to identifying risks, including emerging risks. The Government believe that it is of the utmost importance that the public are able effectively to scrutinise the risk assessments of the largest in-scope services, so that users can be empowered to make informed decisions about whether and how to use their services.
On the questions from the noble Baroness, Lady Kidron, and the noble Lord, Lord Clement-Jones, about why it is just category 1 and category 2A services, we estimate that there will be around 25,000 UK service providers in scope of the Bill’s illegal and child safety duties. Requiring all these companies to publish full risk assessments and proactively to send them to Ofcom could undermine the Bill’s risk-based and proportionate approach, as we have discussed in previous groups on the burdens to business. A large number of these companies are likely to be low risk and it is unlikely that many people will seek out their risk assessments, so requiring all companies to publish them would be an excessive regulatory burden.
There would also be an expectation that Ofcom would proactively monitor a whole range of services, even ones that posed a minimal risk to users. That in turn could distract Ofcom from taking a risk-based approach in its regulation by overwhelming it with paperwork from thousands of low-risk services. If Ofcom wants to see records of the risk assessments of providers that are not category 1 or category 2A services, it has extensive information-gathering powers that it can use to require a provider to send it such records.
The noble Baroness, Lady Merron, was right to say that I read the terms of my broadband supply—I plead guilty to the nerdiness of doing that—but I have not read all the terms and conditions of every application and social medium I have downloaded, and I agree that many people do skim through them. It is said that the most commonly told lie on the planet at the moment is “I agree to the terms and conditions”, and the noble Baroness is right to point to the need for these to be intelligible, easily accessible and transparent—which of course we want to see.
In answer to her other question, the record-keeping duty will apply to all companies, but the requirement to publish is only for category 1 and category 2A companies.
The noble Baroness, Lady Kidron, asked me about Amendment 27A. If she will permit me, I will write to her with the best and fullest answer to that question.
I am grateful to noble Lords for their questions on this group of amendments.