(1 year, 4 months ago)
Lords Chamber
My Lords, let me add to this miscellany by speaking to the government amendments that stand in my name as part of this group. The first is Amendment 288A, which we mentioned on the first group of amendments on Report because it relates to the new introductory clause, Clause 1, and responds to the points raised by the noble Lord, Lord Stevenson of Balmacara. I am very happy to say again that the Government recognise that people with multiple and combined characteristics suffer disproportionately online and are often at greater risk of harm. This amendment therefore adds a provision in the new interpretation clause, Clause 1, to put beyond doubt that all the references to people with “a certain characteristic” throughout the Bill include people with a combination of characteristics. We had a good debate about the Interpretation Act 1978, which sets that out, but we are happy to set it out clearly here.
In his Amendment 186A, my noble friend Lord Moylan seeks to clarify a broader issue relating to consumer rights and online platforms. He got some general support—certainly gratitude—for raising this issue, although there was a bit of a Committee-style airing of it and a mixture of views on whether this is the right way or the right place. The amendment seeks to make it clear that certain protections for consumers in the Consumer Rights Act 2015 apply when people use online services and do not pay for them but rather give up their personal data in exchange. The Government are aware that the application of the law in that area is not always clear in relation to free digital services and, like many noble Lords, express our gratitude to my noble friend for highlighting the issue through his amendment.
We do not think that the Bill is the right vehicle for attempting to provide clarification on this point, however. We share some of the cautions that the noble Lord, Lord Allan of Hallam, raised and agree with my noble friend Lady Harding of Winscombe that this is part of a broader question about consumer rights online beyond the services with which the Bill is principally concerned. It could be preferable that the principle that my noble friend Lord Moylan seeks to establish through his amendment should apply more widely than merely to category 1 services regulated under the Bill. I assure him that the Bill will create a number of duties on providers which will benefit users and clarify that they have existing rights of action in the courts. We discussed these new protections in depth in Committee and earlier on Report. He drew attention to Clause 65(1), which puts a requirement on all services, not just category 1 services, to include clear and accessible provisions in their terms of service informing users about their right to bring a claim for breach of contract. Therefore, while we are grateful, we agree with noble Lords who suggested that this is a debate for another day and another Bill.
Amendment 191A from the noble Baroness, Lady Kidron, would require Ofcom to issue guidance for coroners and procurators fiscal to aid them in submitting requests to Ofcom to exercise its power to obtain information from providers about the use of a service by a deceased child. While I am sympathetic to her intention, I do not think that her amendment is the right answer. It would be inappropriate for an agency of the Executive to issue guidance to a branch of the judiciary. As I explained in Committee, it is for the Chief Coroner to provide detailed guidance to coroners. This is written to assist coroners with the law and their legal duties and to provide commentary and advice on policy and practice.
The amendment tabled by the noble Baroness cuts across the role of the Chief Coroner and risks compromising the judicial independence of the coroner, as set out in the Constitutional Reform Act 2005. As she is aware, the Chief Coroner has agreed to consider issuing guidance to coroners on social media and to consider the issues covered in the Bill. He has also agreed to explore whether coroners would benefit from additional training, with the offer of consultation with experts including Ofcom and the Information Commissioner’s Office. I suggest that the better approach would be for Ofcom and the Information Commissioner’s Office to support the Chief Coroner in his consideration of these issues where he would find that helpful.
I agree with the noble Lord, Lord Allan, that coroners must have access to online safety expertise given the technical and fast-moving nature of this sector. As we have discussed previously, Amendment 273 gives Ofcom a power to produce a report dealing with matters relevant to an investigation or inquest following a request from a coroner which will provide that expertise. I hope that this reassures the noble Baroness.
I understand the report on a specific death, which is very welcome and part of the regime as we all see it. The very long list of things that the coroner may not know that they do not know, as I set out in the amendment, is the issue which I and other noble Lords are concerned about. If the Government could find a way to make that possible, I would be very grateful.
We are keen to ensure that coroners have access to the information and expertise that they need, while respecting the independence of the judicial process to decide what they do not know and would like to know more about and the role of the Chief Coroner there. It is a point that I have discussed a lot with the noble Baroness and with my noble friend Lady Newlove in her former role as Victims’ Commissioner. I am very happy to continue doing so because it is important that there is access to that.
The noble Lord, Lord Stevenson, spoke to the amendments tabled by the noble Baroness, Lady Merron, about supposedly gendered language in relation to Clauses 141 and 157. As I made clear in Committee, I appreciate the intention—as does Lady Deben—of making clear that a person of either sex can perform the role of chairman, just as they can perform the role of ombudsman. We discussed that semantic point in Committee. The Government have used “chairman” here to be consistent with terminology in the Office of Communications Act 2002. I appreciate that this predates the Written Ministerial Statement which the noble Lord cited, but that itself made clear that the Government at the time recognised that in practice, parliamentary counsel would need to adopt a flexible approach to this change—for example, in at least some of the cases where existing legislation originally drafted in the former style is being amended.
The noble Lord may be aware of a further Written Ministerial Statement, made on 23 May last year, following our debates on gendered language on another Bill, when the then Lord President of the Council and Leader of the House of Commons said that the Office of the Parliamentary Counsel would update its drafting guidance in light of that. That guidance is still forthcoming. Importantly, however, the term here will have no bearing on Ofcom’s decision-making on who should chair the advisory committees; that could indeed be a person of either sex.
Amendment 253 seeks to enable co-operation, particularly via information-sharing, between Ofcom and other regulators within the UK. I reassure noble Lords that Section 393 of the Communications Act 2003 already includes provisions for sharing information between Ofcom and other regulators in the UK.
As has been noted, Ofcom already co-operates effectively with other domestic regulators. That has been strengthened by the establishment of the Digital Regulation Co-operation Forum. By promoting greater coherence, the forum helps to resolve potential tensions, offering clarity for people and the industry. It ensures collaborative work across areas of common interest to address complex problems. Its outputs have already delivered real and wide-ranging impacts, including landmark policy statements clarifying the interactions between digital regulatory regimes, research into cross-cutting issues, and horizon-scanning activities on new regulatory challenges. We will continue to assess how best to support collaboration between digital regulators and to ensure that their approaches are joined up. We therefore do not think that Amendment 253 is necessary.
Sit down or stand up—I cannot remember.
I wonder whether the department has looked at the DSA and other situations where this is being worked out. I recognise that it takes a period of time, but it is not without some precedent that a pathway should be described.
We do not think that six weeks is enough time for the evidence base to develop sufficiently; our assessment is that endowing the Secretary of State with that power at this point would be premature.
Amendment 262AA would require Ofcom to consider whether it is appropriate to require providers to take steps to comply with Ofcom’s researcher access guidance when including a requirement to take steps in a confirmation decision. This would be inappropriate because the researcher access provisions are not enforceable requirements; as such, compliance with them should not be subject to enforcement by the regulator. Furthermore, enforcement action may relate to a wide variety of very important issues, and the steps needed should be sufficient to address a failure to comply with an enforceable requirement. Singling out compliance with researcher access guidance alone risks implying that this will be adequate to address core failures.
Amendment 272AB would require Ofcom to give consideration to whether greater access to data could be achieved through legal requirements or incentives for regulated services. I reassure noble Lords that the scope of Ofcom’s report will already cover how greater access to data could be achieved, including through enforceable requirements on providers.
Amendment 272E would require Ofcom to take a provider’s compliance with Ofcom’s guidance on researcher access to data into account when assessing risks from regulated services and determining whether to take enforcement action and what enforcement action to take. However, we do not believe that this is a relevant factor for consideration of these issues. I hope noble Lords will agree that whether or not a company has enabled researcher access to its data should not be a mitigating factor against Ofcom requiring companies to deal with terrorism or child sexual exploitation or abuse content, for example.
On my noble friend Lord Bethell’s remaining Amendments 272BA, 273A and 273B, the first of these would require Ofcom to publish its report on researchers’ access to information within six months. While six months would not be deliverable given other priorities and the complexity of this issue, the government amendment to which I have spoken would reduce the timelines from two years to 18 months. That recognises the importance of the issue while ensuring that Ofcom can deliver the key priorities in establishing the core parts of the regulatory framework; for example, the illegal content and child safety duties.
We talked about bots controlled by service providers before the noble Lord, Lord Knight, asked questions on this. The Bill is designed to make online service providers responsible for the safety of their users in light of harmful activities that their platforms might facilitate. Providers of a user-to-user service will need to adhere to their duties of care, which apply to all user-generated content present on their service. The Bill does not, however, regulate content published by user-to-user providers themselves. That is because the providers are liable for the content they publish on the service themselves. The one exception to this—as the noble Baroness, Lady Kidron, alluded to in her contribution—is pornography, which poses a particular risk to children and is regulated by Part 5 of the Bill.
I am pleased to reassure the noble Lord, Lord Clement-Jones, that the Bill—
I thank the noble Lord for giving way. The Minister just said that private providers will be responsible for their content. I would love to understand what mechanism makes a provider responsible for their content?
I will write to noble Lords with further information and will make sure that I have picked up correctly the questions that they have asked.
On Amendment 152A, which the noble Lord, Lord Clement-Jones, has tabled, I am pleased to assure him that the Bill already achieves the intention of the amendment, which seeks to add characters and objects that might interact with users in the virtual world to the Bill’s definition of user-generated content. Let me be clear again: the Bill already captures any service that facilitates online user-to-user interaction, including in the metaverse or other augmented reality or immersive online worlds.
The Bill broadly defines “content” as
“anything communicated by means of an internet service”,
so it already captures the various ways in which users may encounter content. Clause 211 makes clear that “encounter” in relation to content for the purposes of the Bill means to,
“read, view, hear or otherwise experience”
content. That definition extends to the virtual worlds which noble Lords have envisaged in their contributions. It is broad enough to encompass any way of encountering content, whether that be audio-visually or through online avatars or objects.
In addition, under the Bill’s definition of “functionality”,
“any feature that enables interactions of any description between users of the service”
will be captured. That could include interaction between avatars or interaction by means of an object in a virtual world. All in-scope services must therefore consider a range of functionalities as part of their risk assessment and must put in place any necessary measures to mitigate and manage any risks that they identify.
I hope that that provides some assurance to the noble Lord that the concerns that he has raised are covered, but I shall happily write on his further questions before we reach the amendment that the noble Baroness, Lady Finlay, rightly flagged in her contribution.
My Lords, as I set out in Committee, the Government are bringing forward a package of amendments to address the challenges that bereaved parents and coroners have faced when seeking to access data after the death of a child.
These amendments have been developed after consultation with those who, so sadly, have first-hand experience of these challenges. I thank in particular the families of Breck Bednar, Sophie Parkinson, Molly Russell, Olly Stephens and Frankie Thomas for raising awareness of the challenges they have faced when seeking access to information following the heartbreaking cases involving their children. I am also grateful to the noble Baroness, Lady Kidron, for championing this issue in Parliament and more widely. I am very happy to say that she is supporting the government amendments in this group.
The loss of any life is heartbreaking, but especially so when it involves a child. These amendments will create a more straightforward and humane process for accessing data and will help to ensure that parents and coroners receive the answers they need in cases where a child’s death may be related to online harms. We know that coroners have faced challenges in accessing relevant data from online service providers, including information about a specific child’s online activity, where that might be relevant to an investigation or inquest. It is important that coroners can access such information.
As such, I turn first to Amendments 246, 247, 249, 250, 282, 283 and 287, which give Ofcom an express power to require information from regulated services about a deceased child’s online activity following a request from a coroner. This includes the content the child had viewed or with which he or she had engaged, how the content came to be encountered by the child, the role that algorithms and other functionalities played, and the method of interaction. It also covers any content that the child generated, uploaded or shared on the service.
Crucially, this power is backed up by Ofcom’s existing enforcement powers, so that, where a company refuses to provide information requested by Ofcom, companies may be subject to enforcement action, including senior management liability. To ensure that there are no barriers to Ofcom sharing information with coroners, first, Amendment 254 enables Ofcom to share information with a coroner without the prior consent of a business to disclose such information. This will ensure that Ofcom is free to provide information it collects under its existing online safety functions to coroners, as well as information requested specifically on behalf of a coroner, where that might be useful in determining whether social media played a part in a child’s death.
Secondly, coroners must have access to online safety expertise, given the technical and fast-moving nature of the industry. As such, Amendment 273 gives Ofcom a power to produce a report dealing with matters relevant to an investigation or inquest, following a request from a coroner. This may include, for example, information about a company’s systems and processes, including how algorithms have promoted specific content to a child. To this end, the Chief Coroner’s office will consider issuing non-statutory guidance and training for coroners about social media as appropriate, subject to the prioritisation of resources. We are confident that this well-established framework provides an effective means to provide coroners with training on online safety issues.
It is also important that we address the lack of transparency from large social media services about their approach to data disclosure. Currently, there is no common approach to this issue, with some services offering memorialisation or contact-nomination processes, while others seemingly lack any formal policy. To tackle this, a number of amendments in this group will require the largest services—category 1, 2A and 2B services—to set out policies relating to the disclosure of data regarding the online activities of a deceased child in a clear, accessible and sufficiently detailed format in their terms of service. These companies will also be required to provide a written response to data requests in a timely manner and must provide a dedicated helpline, or similar means, for parents to communicate with the company, in order to streamline the process. This will address the painful radio silence experienced by many bereaved parents. The companies must also offer options so that parents can complain when they consider that a platform is not meeting its obligations. These must be easy to access, easy to use and transparent.
The package of amendments will apply not only to coroners in England and Wales but also to those in Northern Ireland and to equivalent investigations in Scotland, where similar sad events have occurred.
The Government will also address other barriers which are beyond the scope of this Bill. For example, we will explore measures to introduce data rights for bereaved parents who wish to request information about their deceased children through the Data Protection and Digital Information Bill. We are also working, as I said in Committee, with our American counterparts to clarify and, where necessary, address unintended barriers to information sharing created by the United States Stored Communications Act. I beg to move.
My Lords, I thank the Minister and indeed the Secretary of State for bringing forward these amendments in the fulsome manner that they have. I appreciate it, but I know that Bereaved Families for Online Safety also appreciates it. The Government committed to bringing forward these amendments on the last day in Committee, so they have been pre-emptively welcomed and discussed at some length. One need only read through Hansard of 22 June to understand the strength of feeling about the pain that has been caused to families and the urgent need to prevent others experiencing the horror faced by families already dealing with the loss of their child.
I will speak briefly on three matters only. First, I must once again thank bereaved families and colleagues in this House and in the other place for their tireless work in pressing this issue. This is one of those issues that does not allow for celebration. As I walked from the Chamber on 22 June, I asked one of the parents how they felt. They said: “It is too late for me”. It was not said in bitterness but in acknowledgement of their profound hurt and the failure of companies voluntarily to do what is obvious, moral and humane. I ask the Government to see the sense in the other amendments that noble Lords brought forward on Report to make children safer, and to bring the same pragmatic, thoughtful approach to those as they have brought to this group of amendments. It makes a huge difference.
Secondly, I need to highlight just one gap; I have written to the Secretary of State and the Minister on this. I find it disappointing that the Government did not find a way to require senior management to attend an inquest to give evidence. Given that the Government have agreed that senior managers should be subject to criminal liability under some circumstances, I do not understand their objections to summoning them to co-operate with legal proceedings. If a company submits information in response to Ofcom and at the coroner’s request the company’s senior management is invited to attend the inquest, it makes sense that someone should be required to appear to answer and follow up those questions. Again, on behalf of the bereaved families and specifically their legal representatives, who are very clear on the importance of this part of the regime, I ask the Government to reconsider this point and ask the Minister to undertake to speak to the department and the MoJ, if necessary, to make sure that, if senior managers are asked to attend court, they are mandated to do so.
Thirdly, I will touch on the additional commitments the Minister made beyond the Bill, the first of which is the upcoming Data Protection and Digital Information Bill. I am glad to report that some of the officials working on the Bill have already reached out, so I am grateful to the Minister that this is in train, but I expect it to include guidance for companies that will, at a minimum, cover data preservation orders and guidance about the privacy of other users in cases where a child has died. I think that privacy for other users is central to this being a good outcome for everybody, and I hope we are able to include that.
I am pleased to hear about the undertaking with the US regarding potential barriers, and I believe—and I would love to hear from the Minister—that the objective is to make a bilateral agreement that would allow data to be shared between the two countries in the case of a child’s death. It is a very specific requirement, not a wide-ranging one. I believe that, if we can do it on a bilateral basis, it would be easier than a broad attempt to change the Stored Communications Act.
I turn finally to training for coroners. I was delighted that the Chief Coroner made a commitment to consider issuing non-legislative guidance and training on social media for coroners and the offer of consultation with experts, including Ofcom, the ICO and bereaved families and their representatives, but this commitment was made subject to funding. I ask the Minister to agree to discuss routes to funding from the levy via Ofcom’s digital literacy duty. I have proposed an amendment to the government amendment that would make that happen, but I would welcome the opportunity to discuss it with the Minister. Coroners must feel confident in their understanding of the digital world, and I am concerned that giving this new route to regulated companies via Ofcom without giving them training on how to use it may create a spectre of failure or further frustration and distress for bereaved families. I know there is not a person in the House who would want that to be the outcome of these welcome government amendments.
My Lords, Amendments 100 and 101 seek further to define the meaning of “significant” in the children’s access assessment, with the intention of aligning this with the meaning of “significant” in the Information Commissioner’s draft guidance on the age-appropriate design code.
I am grateful to the noble Baroness, Lady Kidron, for the way in which she has set out the amendments and the swiftness with which we have considered them. The test in the access assessment in the Bill is already aligned with the test in the code, which determines whether a service is likely to be accessed by children, in order to ensure consistency for all providers. The Information Commissioner’s Office has liaised with Ofcom on its new guidance on the likely to access test for the code, with the intention of aligning the two regulatory regimes while reflecting that they seek to do different things. In turn, the Bill will require Ofcom to consult the ICO on its guidance to providers, which will further support alignment between the tests. So while we agree about the importance of alignment, we think that it is already catered for.
With regard to Amendment 100, Clause 30(4)(a) already states that
“the reference to a ‘significant’ number includes a reference to a number which is significant in proportion to the total number of United Kingdom users of a service”.
There is, therefore, already provision in the Bill for this being a significant number in and of itself.
On Amendment 101, the meaning of “significant” must already be more than insignificant by its very definition. The amendment also seeks to define “significant” with reference to the number of children using a service rather than seeking to define what is a significant number.
I hope that that provides some reassurance to the noble Baroness, Lady Kidron, and that she will be content to withdraw the amendment.
I am not sure that, at this late hour, I completely understood what the Minister said. On the basis that we are seeking to align, I will withdraw my amendment, but can we check that we are aligned as my speech came directly from a note from officials that showed a difference? On that basis, I am happy to withdraw.
My Lords, this has been a useful debate. As the noble Baroness, Lady Kidron, says, because I spoke first to move the government amendments, in effect I got my response in first to her Amendment 174, the only non-government amendment in the group. That is useful because it allows us to have a deeper debate on it.
The noble Baroness asked about the way that organisations such as the British Board of Film Classification already make assessments of sexualised content. However, the Bill’s requirement on service providers and the process that the BBFC takes to classify content are not really comparable. Services will have far less time and much more content to consider than the BBFC does, so will not be able to take the same approach. The BBFC is able to take an extended time to consider maybe just one scene, one image or one conversation, and therefore can apply nuance to its assessments. That is not possible to do at the scale at which services will have to apply the child safety duties in the Bill. We therefore think there is a real risk that they would excessively apply those duties and adversely affect children’s rights online.
I know the noble Baroness and other noble Lords are rightly concerned with protecting rights to free expression and access to information online for children and for adults. It is important that we strike the right balance, which is what we have tried to do with the government amendments in this group.
To be clear, the point that I made about the BBFC was not to suggest a similar arrangement but to challenge the idea that we cannot categorise material of a sexualised nature. Building on the point made by the noble Lord, Lord Allan, perhaps we could think about it in terms of the amber light rather than the red light—in other words, something to think about.
I certainly will think about it, but the difficulty is the scale of the material and the speed with which we want these assessments to be made and that light to be lit, in order to make sure that people are properly protected.
My noble friend Lord Moylan asked about differing international terminology. In order for companies to operate in the United Kingdom they must have an understanding of the United Kingdom, including the English-language terms used in our legislation. He made a point about the Equality Act 2010. While the Bill uses the same language, it does not extend that Act to this part of the Bill. In particular, it does not create a new offence.
The noble Baroness, Lady Fox, also mentioned the Equality Act when she asked about the phraseology relating to gender reassignment. We included this wording to ensure that the language used in the Bill matches Section 7(1) of the Equality Act 2010 and that gender reassignment has the same meaning in the Bill as it does in that legislation. As has been said by other noble Lords—
I believe I misspoke when I asked my question. I referred to under-18s. Of course, if they are under 18 then it is child sexual abuse. I meant someone under the age of 18 with an adult image. I put that there for the record.
If the noble Baroness misspoke, I understood what she intended. I knew what she was getting at.
With that, I hope noble Lords will be content not to press their amendments and that they will support the government amendments.
It is guidance rather than direction, but it will be done openly and transparently. Users will be able to see the guidance which Ofcom has issued, to see whether companies have responded to it as they see fit and, through the rest of the framework of the Bill, be empowered to make their decisions about their experiences online. This being done openly and transparently, and informed by Ofcom’s research, will mean that everyone is better informed.
We are sympathetic to the amendment. It is complex, and this has been a useful debate—
I wonder whether the Minister has an answer to the academic community, who now see their European colleagues getting ahead through being able to access data through other legislation in other parts of the world. Also, we have a lot of faith in Ofcom, but it seems a mistake to let it be the only arbiter of what needs to be seen.
We are very aware that we are not the only jurisdiction looking at the important issues the Bill addresses. The Government and, I am sure, academic researchers will observe the implementation of the European Union’s Digital Services Act with interest, including the provisions about researchers’ access. We will carefully consider any implications of our own online safety regime. As noble Lords know, the Secretary of State will be required to undertake a review of the framework between two and five years after the Bill comes into force. We expect that to include an assessment of how the Bill’s existing transparency provisions facilitate researcher access.
I am very grateful to the noble Baroness for her amendment, which is a useful opportunity for us to state publicly and share with the Committee the progress we have been making in our helpful discussions on these issues in relation to these amendments. I am very grateful to her and to my noble friends Lord Bethell and Lady Harding for speaking as one on this, as this short debate this evening has well illustrated.
As the noble Baroness knows, discussions continue on the precise wording of these definitions. I share her optimism that we will be able to reach agreement on a suitable way forward, and I look forward to working with her, my noble friends and others as we do so.
The Bill already includes a definition of age assurance in Clause 207, which is
“measures designed to estimate or verify the age or age-range of users of a service”.
As we look at these issues, we want to avoid using words such as “checking”, which suggests that providers need to take a proactive approach to checking age, as that may inadvertently preclude the use of technologies which determine age through other means, such as profiling. It is also important that any definition of age assurance does not restrict the current and future use of innovative and accurate technologies. I agree that it is important that there should be robust definitions for terms which are not currently defined in the Bill, such as age verification, and recommit to the discussions we continue to have on what terms need to be defined and the best way to define them.
This has been a very helpful short debate with which to end our deliberations in Committee. I am very grateful to noble Lords for all the points that have been raised over the past 10 days, and I am very glad to be ending in this collaborative spirit. There is much for us still to do, and even more for the Office of the Parliamentary Counsel to do, before we return on Report, and I am grateful to it and to the officials working on the Bill. I urge the noble Baroness to withdraw her amendment.
(1 year, 6 months ago)
Lords Chamber
Yes, that would be a sensible way to view it. We will work on that and allow noble Lords to see it before they come to talk to us about it.
I put on record that the withdrawal of Part 3 of the Digital Economy Act 2017 will be greeted with happiness only should the full schedule of AV and harms be put into the Bill. I must say that because the noble Baroness, Lady Benjamin, is not in her place. She worked very hard for that piece of legislation.
My initial response is, yes, I think so, but it is the role of Ofcom to look at whether those terms of service are enforced and to act on behalf of internet users. The noble Lord is right to point to the complexity of some marginal cases with which companies have to deal, but the whole framework of the Bill is to make sure that terms of service are being enforced. If they are not, people can turn to Ofcom.
I am sorry to enter the fray again on complaints, but how will anyone know that they have failed in this way if there is no complaints system?
I refer to the meeting my noble friend Lord Camrose offered; we will be able to go through and unpick the issues raised in that group of amendments, rather than looping back to that debate now.
I thank the noble Lord for the advance notice to think about that; it is helpful. It is difficult to talk in general terms about this issue, so, if I can, I will give examples that do, and do not, meet the threshold.
The Bill goes even further for children than it does for adults. In addition to the protections from illegal material, the Government have indicated, as I said, that we plan to designate content promoting suicide, self-harm or eating disorders as categories of primary priority content. That means that providers will need to put in place systems designed to prevent children of any age encountering this type of content. Providers will also need specifically to assess the risk of children encountering it. Platforms will no longer be able to recommend such material to children through harmful algorithms. If they do, Ofcom will hold them accountable and will take enforcement action if they break their promises.
It is right that the Bill takes a different approach for children than for adults, but it does not mean that the Bill does not recognise that young adults are at risk or that it does not have protections for them. My noble friend Lady Morgan was right to raise the issue of young adults once they turn 18. The triple shield of protection in the Bill will significantly improve the status quo by protecting adults, including young adults, from illegal suicide content and legal suicide or self-harm content that is prohibited in major platforms’ terms and conditions. Platforms also have strong commercial incentives, as we discussed in previous groups, to address harmful content that the majority of their users do not want to see, such as legal suicide, eating disorder or self-harm content. That is why they currently claim to prohibit it in their terms and conditions, and why we want to make sure that those terms and conditions are transparently and accountably enforced. So, while I sympathise with the intention from the noble Baroness, Lady Finlay, her amendments raise some wider concerns about mandating how providers should deal with legal material, which would interfere with the careful balance the Bill seeks to strike in ensuring that users are safer online without compromising their right to free expression.
The noble Baroness’s Amendment 240, alongside Amendment 225 in the name of the noble Lord, Lord Stevenson, would place new duties on Ofcom in relation to suicide and self-harm content. The Bill already has provisions to provide Ofcom with broad and effective information-gathering powers to understand how this content affects users and how providers are dealing with it. For example, under Clause 147, Ofcom can already publish reports about suicide and self-harm content, and Clauses 68 and 69 empower Ofcom to require the largest providers to publish annual transparency reports.
Ofcom may require those reports to include information on the systems and processes that providers use to deal with illegal suicide or self-harm content, with content that is harmful to children, or with content which providers’ own terms of service prohibit. Those measures sit alongside Ofcom’s extensive information-gathering powers. It will have the ability to access the information it needs to understand how companies are fulfilling their duties, particularly in taking action against this type of content. Furthermore, the Bill is designed to provide Ofcom with the flexibility it needs to respond to harms—including in the areas of suicide, self-harm and eating disorders—as they develop over time, in the way that the noble Baroness envisaged in her remarks about the metaverse and new emerging threats. So we are confident that these provisions will enable Ofcom to assess this type of content and ensure that platforms deal with it appropriately. I hope that this has provided sufficient reassurance for the noble Baroness not to move her amendment.
I asked a number of questions on specific scenarios. If the Minister cannot answer them straight away, perhaps he could write to me. They all rather called for “yes/no” answers.
The noble Baroness threw me off with her subsequent question. She was broadly right, but I will write to her after I refresh my memory about what she said when I look at the Official Report.
(1 year, 6 months ago)
Lords Chamber
My Lords, this debate has demonstrated the diversity of opinion regarding misinformation and disinformation—as the noble Lord said, the Joint Committee gave a lot of thought to this issue—as well as the difficulty of finding the truth of very complex issues while not shutting down legitimate debate. It is therefore important that we legislate in a way that takes a balanced approach to tackling this, keeping people safe online while protecting freedom of expression.
The Government take misinformation and disinformation very seriously. From Covid-19 to Russia’s use of disinformation as a tool in its illegal invasion of Ukraine, it is a pervasive threat, and I pay tribute to the work of my noble friend Lord Bethell and his colleagues in the Department of Health and Social Care during the pandemic to counter the cynical and exploitative forces that sought to undermine the heroic effort to get people vaccinated and to escape from the clutches of Covid-19.
We recognise that misinformation and disinformation come in many forms, and the Bill reflects this. Its focus is rightly on tackling the most egregious, illegal forms of misinformation and disinformation, such as content which amounts to the foreign interference offence or which is harmful to children—for instance, that which intersects with named categories of primary priority or priority content.
That is not the only way in which the Bill seeks to tackle it, however. The new terms of service duties for category 1 services will hold companies to account over how they say they treat misinformation and disinformation on their services. However, the Government are not in the business of telling companies what legal content they can and cannot allow online, and the Bill should not and will not prevent adults accessing legal content. In addition, the Bill will establish an advisory committee on misinformation and disinformation to provide advice to Ofcom on how they should be tackled online. Ofcom will be given the tools to understand how effectively misinformation and disinformation are being addressed by platforms through transparency reports and information-gathering powers.
Amendment 52 from the noble Baroness, Lady Merron, seeks to introduce a new duty on platforms in relation to health misinformation and disinformation for adult users, while Amendments 59 and 107 from my noble friend Lord Moylan aim to introduce new proportionality duties for platforms tackling misinformation and disinformation. The Bill already addresses the most egregious types of misinformation and disinformation in a proportionate way that respects freedom of expression by focusing on misinformation and disinformation that are illegal or harmful to children.
I am curious as to what the Bill says about misinformation and disinformation in relation to children. My understanding of primary priority and priority harms is that they concern issues such as self-harm and pornography, but do they say anything specific about misinformation of the kind we have been discussing and whether children will be protected from it?
I am sorry—I am not sure I follow the noble Baroness’s question.
Twice so far in his reply, the Minister has said that this measure will protect children from misinformation and disinformation. I was just curious because I have not seen any sight of that, either in discussions or in the Bill. I was making a distinction regarding harmful content that we know the shape of—for example, pornography and self-harm, which are not, in themselves, misinformation or disinformation of the kind we are discussing now. It is news to me that children are going to be protected from this, and I am delighted, but I was just checking.
Yes, that is what the measure does—for instance, where it intersects with the named categories of primary priority or priority content in the Bill, although that is not the only way the Bill does it. This will be covered by non-designated content that is harmful to children. As we have said, we will bring forward amendments on Report—which is perhaps why the noble Baroness has not seen them in the material in front of us—regarding material harms to children, and they will provide further detail and clarity.
Returning to the advisory committee that the Bill sets up and the amendments from the noble Baroness, Lady Merron, and my noble friend Lord Moylan, all regulated service providers will be forced to take action against illegal misinformation and disinformation in scope of the Bill. That includes the new false communication offences in the Bill that will capture communications where the sender knows the information to be false but sends it intending to cause harm—for example, hoax cures for a virus such as Covid-19. The noble Baroness is right to say that that is a slightly different approach from the one taken in her amendment, but we think it an appropriate and proportionate response to tackling damaging and illegal misinformation and disinformation. If a platform is likely to be accessed by children, it will have to protect them from encountering misinformation and disinformation content that meets the Bill’s threshold for content that is harmful to children. Again, that is an appropriate and proportionate response.
Turning to the points made by my noble friend Lord Moylan and the noble Baroness, Lady Fox, services will also need to have particular regard to freedom of expression when complying with their safety duties. Ofcom will be required to set out steps that providers can take when complying with their safety duties in the codes of practice, including what is proportionate for different providers and how freedom of expression can be protected.
My Lords, this has been a good debate. I am glad that a number of noble Lords mentioned Lord Puttnam and the committee that he chaired for your Lordships’ House on democracy and digital technologies. I responded to the debate that we had on that; sadly, it was after he had already retired from your Lordships’ House, but he participated from the steps of the Throne. I am mindful of that report and the lessons learned in it in the context of the debate that we have had today.
We recognise the intent behind the amendments in this group to strengthen the UK’s approach to media literacy in so far as it relates to services that will be regulated by the Bill. Ofcom has a broad duty to promote media literacy under the Communications Act 2003. That is an important responsibility for Ofcom, and it is right that the regulator is able to adapt its approach to support people in meeting the evolving challenges of the digital age.
Amendments 52A and 91 from the noble Lord, Lord Knight, and Amendment 91A from the noble Lord, Lord Holmes of Richmond, seek to introduce duties on in-scope services, requiring them to put in place measures that promote users’ media literacy, while Amendment 98 tabled by the noble Lord, Lord Knight, would require Ofcom to issue a code of practice in relation to the new duty proposed in his Amendment 91. While we agree that the industry has a role to play in promoting media literacy, the Government believe that these amendments could lead to unintended, negative consequences.
I shall address the role of the industry and media literacy, which the noble Baroness, Lady Kidron, dwelt on in her remarks. We welcome the programmes that it runs in partnership with online safety experts such as Parent Zone and Internet Matters and hope they continue to thrive, with the added benefit of Ofcom’s recently published evaluation toolkit. However, we believe that platforms can go further to empower and educate their users. That is why media literacy has been included in the Bill’s risk assessment duties, meaning that regulated services will have to consider measures to promote media literacy to their users as part of the risk assessment process. Additionally, through work delivered under its existing media literacy duty, Ofcom is developing a set of best-practice design principles for platform-based media literacy measures. That work will build an evidence base of the most effective measures that platforms can take to build their users’ media literacy.
In response to the noble Baroness’s question, I say: no, platforms will not be able to avoid putting in place protections for children by using media literacy campaigns. Ofcom would be able to use its enforcement powers if a platform was not achieving appropriate safety outcomes. There are a range of ways in which platforms can mitigate risks, of which media literacy is but one, and Ofcom would expect platforms to consider them all in their risk assessments.
Let me say a bit about the unintended consequences we fear might arise from these amendments. First, the resource demands to create a code of practice and then to regulate firms’ compliance with this type of broad duty will place an undue burden on the regulator. It is also unclear how the proposed duties in Amendments 52A, 91 and 91A would interact with Ofcom’s existing media literacy duty. There is a risk, we fear, that these parallel duties could be discharged in conflicting ways. Amendment 91A is exposed to broad interpretation by platforms and could enable them to fulfil the duty in a way that lacked real impact on users’ media literacy.
The amendment in the name of my noble friend Lord Holmes proposes a duty to promote awareness of financial deception and fraud. The Government are already taking significant action to protect people from online fraud, including through their new fraud strategy and other provisions in this Bill. I know that my noble friends Lord Camrose, Lord Sharpe of Epsom and Lady Penn met noble Lords to talk about that earlier this week. We believe that measures such as prompts for users before they complete financial transactions sit more logically with financial service providers than with services in scope of this Bill.
Amendment 52A proposes a duty on carriers of journalistic content to promote media literacy to their users. We do not want to risk requiring platforms to act as de facto press regulators, assessing the quality of news publishers’ content. That would not be compatible with our commitment to press freedom. Under its existing media literacy duty, Ofcom is delivering positive work to support people to discern high-quality information online. It is also collaborating with the biggest platforms to design best practice principles for platform-based media literacy measures. It intends to publish these principles this year and will encourage platforms to adopt them.
It is right that Ofcom is given time to understand the benefits of these approaches. The Secretary of State’s post-implementation review will allow the Government and Parliament to establish the effectiveness of Ofcom’s current approach and to reconsider the role of platforms in enhancing users’ media literacy, if appropriate. In the meantime, the Bill introduces new transparency-reporting and information-gathering powers to enhance Ofcom’s visibility of platforms’ delivery and evaluation of media literacy activities. We would not want to see amendments that would inadvertently dissuade platforms from delivering these activities in favour of less costly and less effective measures.
My noble friend Lord Holmes asked about the Online Media Literacy Strategy, published in July 2021, which set out the Government’s vision for improving media literacy in the country. Alongside the strategy, we have committed to publishing annual action plans each financial year until 2024-25, setting out how we meet the ambition of the strategy. In April 2022 we published the Year 2 Action Plan, which included extending the reach of media literacy education to those who are currently disengaged, in consultation with the media literacy task force—a body of 17 cross-sector experts—expanding our grant funding programme to provide nearly £2.5 million across two years for organisations delivering innovative media literacy activities, and commissioning research to improve our understanding of the challenges faced by the sector. We intend to publish the research later this year, for the benefit of civil society organisations, technology platforms and policymakers.
The noble Lord, Lord Knight, in his Amendment 186, would stipulate that Ofcom must levy fees on regulated firms sufficient to fund the work of third parties involved in supporting it to meet its existing media literacy duties. The Bill already allows Ofcom to levy fees sufficient to fund the annual costs of exercising its online safety functions. This includes its existing media literacy duty as far as it relates to services regulated by this Bill. As such, the Bill already ensures that these media literacy activities, including those that Ofcom chooses to deliver through third parties, can be funded through fees levied on industry.
I turn to Amendments 188, 235, 236, 237 and 238. The Government recognise the intent behind these amendments, which is to help improve the media literacy of the general public. Ofcom already has a statutory duty to promote media literacy with regard to the publication of anything by means of electronic media, including services in scope of the Bill. These amendments propose rather prescriptive objectives, either as part of a new duty for Ofcom or through updating its existing duty. They reflect current challenges in the sector but run the risk of becoming obsolete over time, preventing Ofcom from adapting its work in response to emerging issues.
Ofcom has demonstrated flexibility in its existing duty through its renewed Approach to Online Media Literacy, launched in 2021. This presented an expanded media literacy programme, enabling it to achieve almost all the objectives specified in this group. The Government note the progress that Ofcom has already achieved under its renewed approach in the annual plan it produced last month. The Online Safety Bill strengthens Ofcom’s functions relating to media literacy, which is included in Ofcom’s new transparency-reporting and information-gathering powers, which will give it enhanced oversight of industry activity by enabling it to require regulated services to share or publish information about the work that they are doing on media literacy.
The noble Baroness, Lady Prashar, asked about the view expressed by the Joint Committee on minimum standards for media literacy training. We agree with the intention behind that, but, because of the broad and varied nature of media literacy, we do not believe that introducing minimum standards is the most effective way of achieving that outcome. Instead, we are focusing efforts on improving the evaluation practices of media literacy initiatives to identify which ones are most effective and to encourage their delivery. Ofcom has undertaken extensive work to produce a comprehensive toolkit to support practitioners to deliver robust evaluations of their programmes. This was published in February this year and has been met with praise from practitioners, including those who received grant funding from the Government’s non-legislative media literacy work programme. The post-implementation review of Ofcom’s online safety regime, which covers its existing media literacy duty in so far as it relates to regulated services, will provide a reasonable point at which to establish the effectiveness of Ofcom’s new work programme, after giving it time to take effect.
Noble Lords talked about the national curriculum and media literacy in schools. Media literacy is indeed a crucial skill for everyone in the digital age. Key media literacy skills are already taught through a number of compulsory subjects in the national curriculum. Digital literacy is included in the computing national curriculum in England, which equips pupils with the knowledge, understanding and skills to use information and communication technology creatively and purposefully. I can reassure noble Lords that people such as Monica are being taught not about historic things like floppy disks but about emerging and present challenges; the computing curriculum ensures that pupils are taught how to design program systems and accomplish goals such as collecting, analysing, evaluating and presenting data.
Does the Minister know how many children are on computing courses?
(1 year, 6 months ago)
Lords Chamber
Does the Minister therefore think that the Government condone the current system, where we are inundated algorithmically with material that we do not want? Are the Government condoning that behaviour, in the way that he is saying they would condone a safety measure?
We will come to talk about algorithms and their risks later on. There is an important balance to strike here that we have debated, rightly, in this group. I remind noble Lords that there are a range of measures that providers can put in place—
But as I think the noble Baroness understands from that reference, this is a definition already in statute, and with which Parliament and the courts are already engaged.
The Bill’s overarching freedom of expression duties also apply to Clause 12. Subsections (4) to (7) of Clause 18 stipulate that category 1 service providers are required to assess the impact on free expression from their safety policies, including the user empowerment features. This is in addition to the duties in Clause 18(2), which requires all user-to-user services to have particular regard to the importance of protecting freedom of expression when complying with their duties. The noble Baroness’s Amendment 283ZA would require category 1 providers to make judgments on user empowerment content to a similar standard required for illegal content. That would be disproportionate. Clause 170 already specifies how providers must make judgments about whether content is of a particular kind, and therefore in scope of the user empowerment duties. This includes making their judgment based on “all relevant information”. As such, the Bill already ensures that the user empowerment content features will be applied in a proportionate way that will not undermine free speech or hinder legitimate debate online.
Amendment 45, tabled by the noble Lord, Lord Stevenson of Balmacara, would require the Secretary of State to lay a Statement before Parliament outlining whether any of the user empowerment duties should be applied to children. I recognise the significant interest that noble Lords have in applying the Clause 12 duties to children. The Bill already places comprehensive requirements on Part 3 services which children are likely to access. This includes undertaking regular risk assessments of such services, protecting children from harmful content and activity, and putting in place age-appropriate protections. If there is a risk that children will encounter harm, such as self-harm content or through unknown or unverified users contacting them, service providers will need to put in place age-appropriate safety measures. Applying the user empowerment duties for child users runs counter to the Bill’s child safety objectives and may weaken the protections for children—for instance, by giving children an option to see content which is harmful to them or to engage with unknown, unverified users. While we recognise the concerns in this area, for the reasons I have set out, the Government do not agree with the need for this amendment.
I will resist the challenge of the noble Lord, Lord Knight, to talk about bots because I look forward to returning to that in discussing the amendments on future-proofing. With that, I invite noble Lords—
I noted the points made about the way information is pushed and, in particular, the speech of the right reverend Prelate. Nothing in the Government’s response has really dealt with that concern. Can the Minister say a few words about not the content but the way in which users are enveloped? On the idea that companies always act because they have a commercial imperative not to expose users to harmful material, actually, they have a commercial imperative to spread material and engage users. It is well recorded that a lot of that is in fact harmful material. Can the Minister speak a little more about the features rather than the content?
We will discuss this when it comes to the definition of content in the Bill, which covers features. I was struck by the speech by the right reverend Prelate about the difference between what people encounter online, and the analogy used by the noble Baroness, Lady Fox, about a bookshop. Social media is of a different scale and has different features which make that analogy not a clean or easy one. We will debate in other groups the accumulated threat of features such as algorithms, if the noble Baroness, Lady Kidron, will allow me to go into greater detail then, but I certainly take the points made by both the right reverend Prelate and the noble Baroness, Lady Fox, in their contributions.
(1 year, 6 months ago)
Lords Chamber
I am sorry to interrupt the noble Lord. Is the answer to my question that the blue tick and the current Meta system will not be considered as verification under the terms of the Bill? Is that the implication of what he said?
Yes. The blue tick is certainly not identity verification. I will write to confirm on Meta, but they are separate and, as the example of blue ticks and Twitter shows, a changing feast. That is why I am talking in general terms about the approach, so as not to rely too much on examples that are changing even in the course of this Committee.
Government Amendment 43A stands in my name. This clarifies that “non-verified user” refers to users whether they are based in the UK or elsewhere. This ensures that, if a UK user decides he or she no longer wishes to interact with non-verified users, this will apply regardless of where they are based.
Finally, Amendment 106 in the name of my noble friend Lady Buscombe would make an addition to the online safety objectives for regulated user-to-user services. It would amend them to make it clear that one of the Bill’s objectives is to protect people from communications offences committed by anonymous users.
The Bill already imposes duties on services to tackle illegal content. Those duties apply across all areas of a service, including the way it is designed and operated. Platforms will be required to take measures—for instance, changing the design of functionalities, algorithms, and other features such as anonymity—to tackle illegal content.
Ofcom is also required to ensure that user-to-user services are designed and operated to protect people from harm, including with regard to functionalities and other features relating to the operation of their service. This will likely include the use of anonymous accounts to commit offences in the scope of the Bill. My noble friend’s amendment is therefore not needed. I hope she will be satisfied not to press it, along with the other noble Lords who have amendments in this group.
(1 year, 6 months ago)
Lords Chamber
I will come on to talk a bit about dissemination as well. If the noble Lord will allow me, he can intervene later on if I have not done that to his satisfaction.
I was about to talk about the child safety duties in Clause 11(5), which also specifies that they apply to the way that a service is designed, how it operates and how it is used, as well as to the content facilitated by it. The definition of content makes it clear that providers are responsible for mitigating harm in relation to all communications and activity on their service. Removing the reference to content would make service providers responsible for all risk of harm to children arising from the general operation of their service. That could, for instance, bring into scope external advertising campaigns, carried out by the service to promote its website, which could cause harm. This and other elements of a service’s operations are already regulated by other legislation.
I apologise for interrupting. Is that the case, and could that not be dealt with by defining harm in the way that it is intended, rather than as harm from any source whatever? It feels like a big leap that, if you take out “content”, instead of it meaning the scope of the service in its functionality and content and all the things that we have talked about for the last hour and a half, the suggestion is that it is unworkable because harm suddenly means everything. I am not sure that that is the case. Even if it is, one could find a definition of harm that would make it not the case.
Taking it out in the way that the amendment suggests throws up that risk. I am sure that it is not the intention of the noble Lord or the noble Baroness in tabling it, but that is a risk of the drafting, which requires some further thought.
Clause 11(2), which is the focus of Amendments 32, 85 and 295, already means that platforms have to take robust action against content which is harmful because of the manner of its dissemination. However, it would not be feasible for providers to fulfil their duties in relation to content which is harmful only by the manner of its dissemination. This covers content which may not meet the definition of content which is harmful to children in isolation but may be harmful when targeted at children in a particular way. One example could be content discussing a mental health condition such as depression, where recommendations are made repeatedly or in an amplified manner through the use of algorithms. The nature of that content per se may not be inherently harmful to every child who encounters it, but, when aggregated, it may become harmful to a child who is sent it many times over. That, of course, must be addressed, and is covered by the Bill.
(1 year, 7 months ago)
Lords Chamber
I was about to list the four Cs briefly in order, which will bring me on to commercial or contract risk. Perhaps I may do that and return to those points.
I know that there have been concerns about whether the specific risks highlighted in the new schedule will be addressed by the Bill. In terms of the four Cs category of content risks, there are specific duties for providers to protect children from illegal content, such as content that intentionally assists suicide, as well as content that is harmful to children, such as pornography. Regarding conduct risks, the child safety duties cover harmful conduct or activity such as online bullying or abuse and, under the illegal content safety duties, offences relating to harassment, stalking and inciting violence.
With regard to commercial or contract risks, providers specifically have to assess the risks to children from the design and operation of their service, including their business model and governance under the illegal content and child safety duties. In relation to contact risks, as part of the child safety risk assessment, providers will need specifically to assess contact risks of functionalities that enable adults to search for and contact other users, including children, in a way that was set out by my noble friend Lord Bethell. This will protect children from harms such as harassment and abuse, and, under the illegal content safety duties, all forms of child sexual exploitation and abuse, including grooming.
I agree that content, although unfathomable to the outside world, is defined as the Minister says. However, does that mean that when we see that
“primary priority content harmful to children”
will be put in regulations by the Secretary of State under Clause 54(2)—ditto Clause 54(3) and (4)—we will see those contact risks, conduct risks and commercial risks listed as primary priority, priority and non-designated harms?
I have tried to outline the Bill’s definition of content, which I think will give some reassurance that other concerns that noble Lords have raised are covered. I will turn in a moment to address priority and primary priority content, if the noble Baroness will allow me to do that, and then perhaps intervene again if I have not done so to her satisfaction. I want to set that out and try to keep track of all the questions which have been posed as I do so.
For now, I know there have been concerns from some noble Lords that, if functionalities are not labelled as harms in the legislation, they would not be addressed by providers, and I reassure your Lordships’ House that this is not the case. There is an important distinction between content and other risk factors such as, for instance, an algorithm, which, without content, cannot cause harm to a child. That is why functionalities are not covered by the categories of primary priority and priority content which is harmful to children. The Bill sets out a comprehensive risk assessment process which will cover content or activity that poses a risk of harm to children and other factors, such as functionality, which may increase the risk of harm. As such, the existing children’s risk assessment criteria already cover many of the changes proposed in this amendment. For example, the duties already require service providers to assess the risk of harm to children from their business model and governance. They also require providers to consider how a comprehensive range of functionalities affect risk, how the service is used and how the use of algorithms could increase the risks to children.
Turning to the examples of harmful content set out in the proposed new schedule, I am happy to reassure the noble Baroness and other noble Lords that the Government’s proposed list of primary priority and priority content covers a significant amount of this content. In her opening speech she asked about cumulative harm—that is, content sent many times or content which is harmful due to the manner of its dissemination. We will look at that in detail on the next group as well, but I will respond now to the points she made earlier. The definition of harm in the Bill under Clause 205 makes it clear that physical or psychological harm may arise from the fact or manner of dissemination of the content, not just the nature of the content. Content which is not harmful per se, but which is sent to a child many times, for example by an algorithm, would therefore meet the Bill’s threshold for content that is harmful to children. Companies will have to consider this as a fundamental part of their risk assessment, including, for example, how the dissemination of content via algorithmic recommendations may increase the risk of harm, and they will need to put in place proportionate and age-appropriate measures to manage and mitigate the risks they identify. I followed the exchanges between the noble Baronesses, Lady Kidron and Lady Fox, and I make it clear that the approach set out by the Bill will mean that companies cannot avoid tackling the kind of awful content which Molly Russell saw and the harmful algorithms which pushed that content relentlessly at her.
This point on cumulative harm was picked up by my noble friend Lord Bethell. The Bill will address cumulative risk where it is the result of a combination of high-risk functionality, such as live streaming, or rewards in service by way of payment or non-financial reward. This will initially be identified through Ofcom’s sector risk assessments, and Ofcom’s risk profiles and risk assessment guidance will reflect where a combination of risks in functionalities such as these can drive up the risk of harm to children. Service providers will have to take Ofcom’s risk profiles into account in their own risk assessments for content which is illegal or harmful to children. The actions that companies will be required to take under their risk assessment duties in the Bill, and the safety measures they will be required to put in place to manage their service’s risks, will take account of this bigger-picture risk profile.
The amendments of the noble Baroness, Lady Kidron, would remove references to primary priority and priority harmful content to children from the child risk assessment duties, which we fear would undermine the effectiveness of the child safety duties as currently drafted. That includes the duty for user-to-user providers to prevent children encountering primary priority harms, such as pornography and content that promotes self-harm or suicide, as well as the duty to put in place age-appropriate measures to protect children from other harmful content and activity. As a result, we fear these amendments could remove the requirement for an age-appropriate approach to protecting children online and make the requirement to prevent children accessing primary priority content less clear.
The noble Baroness, Lady Kidron, asked in her opening remarks about emerging harms, which she was right to do. As noble Lords know, the Bill has been designed to respond as rapidly as possible to new and emerging harms. First, the primary priority and priority lists of content can be updated by the Secretary of State. Secondly, it is important to remember the function of non-designated content that is harmful to children in the Bill—that is, content that meets the threshold of content that is harmful to children but is not on the lists designated by the Government. Companies are required to understand and identify this kind of content and, crucially, report it to Ofcom. Thirdly, this will inform the actions of Ofcom itself in its review and report duties under Clause 56, where it is required to review the incidence of harmful content and the severity of harm experienced by children as a result of it. This is not limited to content that the Government have listed as being harmful, as it is intended to capture new and emerging harms. Ofcom will be required to report back to the Government with recommendations on changes to the primary priority and priority content lists.
I turn to the points that the noble Lord, Lord Knight of Weymouth, helpfully raised earlier about things that are in the amendments but not explicitly mentioned in the Bill. As he knows, the Bill has been designed to be tech-neutral, so that it is future-proof. That is why there is no explicit reference to the metaverse or virtual or augmented reality. However, the Bill will apply to service providers that enable users to share content online or interact with each other, as well as search services. That includes a broad range of services such as websites, applications, social media sites, video games and virtual reality spaces such as the metaverse; those are all captured. Any service that allows users to interact, as the metaverse does, will need to conduct a children’s access assessment and comply with the child safety duties if it is likely to be accessed by children.
Amendment 123 from the noble Baroness, Lady Kidron, seeks to amend Clause 48 to require Ofcom to create guidance for Part 3 service providers on this new schedule. For the reasons I have just set out, we do not think it would be workable to require Ofcom to produce guidance on this proposed schedule. For example, the duty requires Ofcom to provide guidance on the content, whereas the proposed schedule includes examples of risky functionality, such as the frequency and volume of recommendations.
I stress again that we are sympathetic to the aim of all these amendments. As I have set out, though, our analysis leads us to believe that the four Cs framework is simply not compatible with the existing architecture of the Bill. Fundamental concepts such as risk, harm and content would need to be reconsidered in the light of it, and that would inevitably have a knock-on effect for a large number of clauses, as well as for the timing of the Bill. The Bill has benefited from considerable scrutiny—pre-legislative and in many discussions over many years. The noble Baroness, Lady Kidron, has been a key part of that and of improving the Bill. The task is simply unfeasible at this stage in the progress of the Bill through Parliament and risks delaying it, as well as significantly slowing down Ofcom’s implementation of the child safety duties. We do not think that this slowing down is a risk worth taking, because we believe the Bill already achieves what is sought by these amendments.
Even so, I say to the Committee that we have listened to the noble Baroness, Lady Kidron, and others and have worked to identify changes which would further address these concerns. My noble friend Lady Harding posed a clear question: if not this, what would the Government do instead? I am pleased to say that, as a result of the discussions we have had, the Government have decided to make a significant change to the Bill. We will now place the categories of primary priority and priority content which is harmful to children on the face of the Bill, rather than leaving them to be designated in secondary legislation, so Parliament will have its say on them.
We hope that this change will reassure your Lordships that protecting children from the most harmful content is indeed the priority for the Bill. That change will be made on Report. We will continue to work closely with the noble Baroness, Lady Kidron, my noble friends and others, but I am not able to accept the amendments in the group before us today. With that, I hope that she will be willing to withdraw.
I thank all the speakers. There were some magnificent speeches and I do not really want to pick out any particular ones, but I cannot help but say that the right reverend Prelate described the world without the four Cs. For me, that is what everybody in the Box and on the Front Bench should go and listen to.
I am grateful and pleased that the Minister has said that the Government are moving in this direction. I am very grateful for that but there are a couple of things that I have to come back on. First, I have swiftly read Clause 205’s definition of harm and I do not think it says that you do not have to reach a barrier of harm; dissemination is quite enough. There is always the problem of what the end result of the harm is. The thing that the Government are not listening to is the relationship between the risk assessment and the harm. It is about making sure that we are clear that it is the functionality that can cause harm. I think we will come back to this at another point, but that is what I beg them to listen to. Secondly, I am not entirely sure that it is correct to say that the four Cs mean that you cannot have primary priority, priority and so on. That could be within the schedule of content, so those two things are not actually mutually exclusive. I would be very happy to have a think about that.
What was not addressed in the Minister’s answer was the point made by the noble Lord, Lord Allan of Hallam, in supporting the proposal that we should have in the schedule: “This is what you’ve got to do; this is what you’ve got to look at; this is what we’re expecting of you; and this is what Parliament has delivered”. That is immensely important, and I was so grateful to the noble Lord, Lord Stevenson, for putting his marker down on this set of amendments. I am absolutely committed to working alongside him and to finding ways around this, but we need to find a way of stating it.
Ironically, that is my answer to both the noble Baronesses, Lady Ritchie and Lady Fox: we should have our arguments here and now, in this Chamber. I do not wish to leave it to the Secretary of State, whom I have great regard for, as it happens, but who knows: I have seen a lot of Secretaries of State. I do not even want to leave it to the Minister, because I have seen a lot of Ministers too—ditto Ofcom, and definitely not the tech sector. So here is the place, and we are the people, to work out the edges of this thing.
Not for the first time, my friend, the noble Baroness, Lady Harding, read out what would have been my answer to the noble Baroness, Lady Ritchie. I have gone round and round, and it is like the Marx brothers’ movie: in the end, harm is defined by subsection (4)(c), but that says that harm will be defined by the Secretary of State. It goes around like that through the Bill.
(1 year, 7 months ago)
Lords ChamberI am hesitant to give too tight a definition, because we want to remain technology neutral and make sure that we are keeping an open mind to developing changes. I will think about that and write to the noble Lord. The best endeavours will inevitably change over time as new technological solutions present themselves. I point to the resourcefulness of the sector in identifying those, but I will see whether there is anything more I can add.
While the Minister is reflecting, I note that the words “best endeavours” are always a bit of a worry. The noble Lord, Lord Allan, made the good point that once it is on your phone, you are in trouble and you must report it, but the frustration of many people outside this Chamber, if it has been on a phone and you cannot deal with it, is what comes next: how to trace the journey of that piece of material without breaking encryption. I speak to the tech companies very often—indeed, I used to speak to the noble Lord, Lord Allan, when he was in position at the then Facebook—but that is the question that we would like answered in this Committee, because the frustration that “It is nothing to do with us” is where we stop with our sympathy.
(1 year, 7 months ago)
Lords Chamber
My Lords, I share noble Lords’ determination to deliver the strongest protections for children and to develop a robust and future-proofed regulatory regime. However, it will not be possible to solve every problem on the internet through this Bill, nor through any piece of legislation, flagship or otherwise. The Bill has been designed to confer duties on the services that pose the greatest risk of harm—user-to-user services and search services—and where there are proportionate measures that companies can take to protect their users.
As the noble Baroness, Lady Kidron, and others anticipated, I must say that these services act as a gateway for users to discover and access other online content through search results and links shared on social media. Conferring duties on these services will therefore significantly reduce the risk of users going on to access illegal or harmful content on non-regulated services, while keeping the scope of the Bill manageable and enforceable.
As noble Lords anticipated, there is also a practical consideration for Ofcom in all this. I know that many noble Lords are extremely keen to see this Bill implemented as swiftly as possible; so am I. However, as the noble Lord, Lord Allan, rightly pointed out, making major changes to the Bill’s scope at this stage would have significant implications for Ofcom’s implementation timelines. I say this at the outset because I want to make sure that noble Lords are aware of those implications as we look at these issues.
I turn first to Amendments 2, 3, 5, 92 and 193, tabled by the noble Baroness, Lady Kidron. These aim to expand the number of services covered by the Bill to incorporate a broader range of services accessed by children and a broader range of harms. I will cover the broader range of harms more fully in a separate debate when we come to Amendment 93, but I am very grateful to the noble Baroness for her constructive and detailed discussions on these issues over the past few weeks and months.
These amendments would bring new services into scope of the duties beyond user-to-user and search services. This could include services which enable or promote commercial harms, including consumer businesses such as online retailers. As I have just mentioned in relation to the previous amendments, bringing many more services into scope would delay the implementation of Ofcom’s priorities and risk detracting from its work overseeing existing regulated services where the greatest risk of harm exists—we are talking here about the services run by about 2.5 million businesses in the UK alone. I hope noble Lords will appreciate from the recent communications from Ofcom how challenging the implementation timelines already are, without adding further complication.
Amendment 92 seeks to change the child-user condition in the children’s access assessment to the test in the age-appropriate design code. The test in the Bill is already aligned with the test in that code, which determines whether a service is likely to be accessed by children, in order to ensure consistency for providers. The current child-user condition determines that a service is likely to be accessed by children where it has a significant number or proportion of child users, or where it is of a kind likely to attract a significant number or proportion of child users. This will already bring into scope services of the kind set out in this amendment, such as those which are designed or intended for use by children, or where children form a—
I am sorry to interrupt. Will the Minister take the opportunity to say what “significant” means, because that is not aligned with the ICO code, which has different criteria?
If I can finish my point, this will bring into scope services of the kind set out in the amendments, such as those designed or intended for use by children, or where children form a substantial and identifiable user group. The current condition also considers the nature and content of the service and whether it has a particular appeal for children. Ofcom will be required to consult the Information Commissioner’s Office on its guidance to providers on fulfilling this test, which will further support alignment between the Bill and the age-appropriate design code.
On the meaning of “significant”, a significant number of children means a significant number in itself or a significant proportion of the total number of UK-based users on the service. In the Bill, “significant” has its ordinary meaning, and there are many precedents for it in legislation. Ofcom will be required to produce and publish guidance for providers on how to make the children’s access assessment. Crucially, the test in the Bill provides more legal certainty and clarity for providers than the test outlined in the code. “Substantial” and “identifiable”, as suggested in this amendment, do not have such a clear legal meaning, so this amendment would give rise to the risk that the condition is more open to challenge from providers and more difficult to enforce. On the other hand, as I said, “significant” has an established precedent in legislation, making it easier for Ofcom, providers and the courts to interpret.
The noble Lord, Lord Knight, talked about the importance of future-proofing the Bill and emerging technologies. As he knows, the Bill has been designed to be technology neutral and future-proofed, to ensure that it keeps pace with emerging technologies. It will apply to companies which enable users to share content online or to interact with each other, as well as to search services. Search services using AI-powered features will be in scope of the search duties. The Bill is also clear that content generated by AI bots is in scope where it interacts with user-generated content, such as bots on Twitter. The metaverse is also in scope of the Bill. Any service which enables users to interact as the metaverse does will have to conduct a child access test and comply with the child safety duties if it is likely to be accessed by children.
(2 years ago)
Lords Chamber
My Lords, in begging leave to ask the Question of which I have given private notice, I declare my interests, particularly as founder and chair of 5Rights Foundation.
My Lords, the arrangement of parliamentary business is, as the noble Baroness will appreciate, a matter for business managers through the usual channels. However, the Bill remains a priority. The Secretary of State committed on 20 October to bringing it back to Parliament shortly. We will continue to work with noble Lords, Members in another place and others on the passage of this important legislation.
I thank the Minister for that reply and am happy to see him back in his place. However, after four years of waiting, I am afraid his Answer was not quite good enough.
Coroner Walker’s landmark judgment that Molly Russell died after suffering negative effects of online content, and his Prevention of Future Deaths Report, deserve to be met with action. That action should be finally bringing forward the Online Safety Bill. Molly Russell died five years ago, the same five years in which we have been working on the Online Safety Bill, in the absence of which children suffer an aggressive bombardment of material that valorises self-harm, body dysmorphia, violent porn and, of course, suicide—real harms to real children. Does the Minister agree that it is time to stop this suffering and commit to bringing the Bill to this House before the end of this month, which is the date by which we have been told we need it to ensure correct scrutiny and its passage in this Session?
My Lords, this important legislation has indeed been a long time coming. I was a special adviser in the Home Office when it was first proposed and was in Downing Street when it was first put in the Conservative manifesto in 2017. Like the noble Baroness, I am very keen to see it in your Lordships’ House so that it can be properly scrutinised, so that we can deliver the protections that we all want to see for children and vulnerable people. The noble Baroness is tireless in her defence of these people. She served excellently on the Joint Committee, which has already looked at the Bill. Like her, I am very keen to get it before your Lordships’ House so that we can continue.
(2 years, 5 months ago)
Lords Chamber
Through the Online Safety Bill, we are giving Ofcom strengthened media literacy functions on transparency reporting, information gathering and the other areas I set out. However, through its strategy announced in December last year, Ofcom has set out its own expanded work programme to discharge its existing duty, which includes pilots, campaigns to promote media literacy, establishing best practice and creating guidance on evaluation, so we are pleased to see that it is using and extending the powers that it has.
My Lords, many digital literacy programmes are provided free of charge to schools by private companies, with an emphasis on teaching children about user behaviour rather than about the risks created by those very same companies. Given the lack of provision in the Bill, perhaps the Minister could say what plans Her Majesty’s Government have to ensure that schools are not simply marketing tech products but offering children a holistic digital literacy education that is independent of those tech companies?
Digital literacy is a key priority in the computing national curriculum in England, which equips people with knowledge, understanding and skills to use the internet creatively and purposefully. Through citizenship education and other subjects, as I mentioned, we are making sure that schoolchildren are equipped with the skills that they need, and of course the companies themselves have a role to play in delivering and funding media literacy education. We welcome the steps that platforms have already taken, but we believe that they can go further to empower and educate their users.
(2 years, 9 months ago)
Lords Chamber
I am grateful for the noble Baroness’s support for the newer measures the Government announced this week. Of course, we will be responding in full to the work of the Joint Committee and the DCMS Select Committee in the other place. We have looked at the draft online safety Bill to respond to the further recommendations and suggestions they have made. However, we have not been inactive in the meantime. In June last year, for example, we published safety by design guidance and a one-stop shop on child online safety, which provided guidance on steps platforms can take to design safer services and protect children. Last July, we published our Online Media Literacy Strategy, which supports the empowerment of users. So we are taking steps, as well as introducing the Bill, which will be coming soon.
My Lords, I also welcome the new commissioner, John Edwards, to his role, and congratulate the Government on this week’s announcement that the online safety Bill will include statutory guidance for privacy-preserving age assurance. Given that, to date, many of the eye-catching changes brought about by the age-appropriate design code, such as safe search and dismantling direct messaging by unknown adults to children, have been rolled out globally, are the Government working with the new commissioner to encourage the UK’s allies and trading partners to adopt the code in other jurisdictions to better enforce its provisions? Does he agree that regulatory alignment between the online safety Bill and the code is essential if we are to keep children safe?
I am very grateful for the noble Baroness’s welcome for the new measures. There is agreement at an international level and within the UK that much more needs to be done to create a safer online environment for children, and the noble Baroness has played a significant part in fostering that agreement. The Information Commissioner has an international team responsible for engaging with data protection and information regulators all over the world. He is himself a former privacy commissioner in New Zealand, while his predecessor worked in this area in Canada, and I think that is to the great benefit of international dialogue. The international team works to ensure that the ICO’s regulatory and other priorities are appropriately reflected in international discussions. Through its work in organisations such as the OECD, the Council of Europe and the Global Privacy Assembly, the ICO also influences work on the interoperability of global data protection regimes.