Baroness Keeley
Public Bill Committees

Clause 58, which was touched on in our last debate, simply sets out Ofcom’s duty to publish guidance for category 1 services to assist them in complying with the user identification duty set out in clause 57. We have probably covered the main points, so I will say nothing further.
Question put and agreed to.
Clause 58 accordingly ordered to stand part of the Bill.
Clause 59
Requirement to report CSEA content to the NCA
Question proposed, That the clause stand part of the Bill.
You are really moving us at pace, Sir Roger. It is a pleasure to serve in Committee with you in the Chair.
It is welcome that regulated services will have to report all child sexual exploitation and abuse material that they detect on their platform. The Government’s decision to move away from the approach of a regulatory code of practice to a mandatory reporting requirement is an important improvement to the draft Bill.
For companies to report child sexual exploitation and abuse material correctly to the mandatory reporting body, they will need access to accurate datasets that will determine whether something that they are intending to report is child sexual exploitation and abuse content. What guidance will be made available to companies so that they can proactively detect CSEA, and what plans are in place to assist companies to identify potential CSEA that has not previously been identified? The impact assessment mentions that, for example, BT is planning to use the Internet Watch Foundation’s hash list, which is compliant with UK law enforcement standards, to identify CSEA proactively. Hashing is a technology used to prevent access to known CSEA; a hash is a unique string of letters and numbers which is applied to an image and which can then be matched every time a user attempts to upload a known illegal image to a platform. It relies, however, on CSEA already having been detected. What plans are in place to assist companies to identify potential CSEA?
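To illustrate the hash-matching technique described above, here is a minimal Python sketch of an upload check against a known-image hash list. The hash list, function names and example digest are hypothetical placeholders; real deployments, such as those using the IWF list, also rely on perceptual hashing (for example PhotoDNA) so that resized or re-encoded copies of a known image still match, rather than only the exact cryptographic matching shown here.

```python
import hashlib

# Hypothetical hash list of known illegal images (in practice supplied by a
# body such as the IWF; the digest below is a placeholder, not a real entry).
KNOWN_ILLEGAL_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_hash(image_bytes: bytes) -> str:
    """Compute a fixed-length digest for an uploaded image."""
    return hashlib.sha256(image_bytes).hexdigest()

def block_upload(image_bytes: bytes) -> bool:
    """Return True if the upload matches a known illegal image and should
    therefore be blocked and reported."""
    return image_hash(image_bytes) in KNOWN_ILLEGAL_HASHES
```

As the speech notes, this approach only works for content that has already been detected and added to the list.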
Finally, it is important that the introduction of mandatory reporting does not impact on existing international reporting structures. Many of the largest platforms in the scope of the Bill are US-based and required under US law to report CSEA material detected on their platform to the National Center for Missing and Exploited Children, which ensures that information relevant to UK law enforcement is passed on for investigation.
To answer the shadow Minister’s question about the duty to detect CSEA proactively—because, as she says, we have to detect it before we can report it—I confirm that there are already duties in the Bill to prevent and detect CSEA proactively, because CSEA is a priority offence in the schedule 6 list of child sexual exploitation and abuse offences, and there is a duty for companies to prevent those offences proactively. In preventing them proactively, they will by definition identify them. That part of her question is well covered.
The hon. Lady also asked about the technologies available to those companies, including hash matching—comparing images against a known database of child sexual exploitation images. A lot of technology is being developed that can proactively spot child sexual exploitation in new images that are not on the hash matching database. For example, some technology combines age identification with nude image identification; by putting them together, we can identify sexual exploitation of children in images that are new and are not yet in the database.
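As a rough sketch of the classifier-combination approach the Minister describes, the following Python fragment combines two hypothetical model scores; the function name, score inputs and threshold are illustrative placeholders rather than any real system’s API.

```python
def flag_new_image(child_likelihood: float, nudity_likelihood: float,
                   threshold: float = 0.8) -> bool:
    """Flag a previously unseen image for human review when both a
    (hypothetical) age-estimation model and a (hypothetical) nude-image
    model are confident.

    Unlike hash matching, this inspects the content itself, so it can
    catch new images that appear in no hash database.
    """
    return child_likelihood >= threshold and nudity_likelihood >= threshold

# Example: an image scoring high on both signals is flagged for review.
assert flag_new_image(0.93, 0.88)
assert not flag_new_image(0.93, 0.10)
```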
To ensure that such new technology can be used, we have the duties under clause 103, which gives Ofcom the power to mandate—to require—the use of certain accredited technologies in fighting not just CSEA, but terrorism. I am sure that we will discuss that more when we come to that clause. Combined, the requirement to proactively prevent CSEA and the ability to specify technology under clause 103 will mean that companies will know about the content that they now, under clause 59, have to report to the National Crime Agency. Interestingly, the hon. Member for Worsley and Eccles South mentioned that that duty already exists in the USA, so it is good that we are matching that requirement in our law via clause 59, which I hope that the Committee will agree should stand part of the Bill.
Question put and agreed to.
Clause 59 accordingly ordered to stand part of the Bill.
Clause 60
Regulations about reports to the NCA
Question proposed, That the clause stand part of the Bill.
The additional regulations created by the Secretary of State in connection with the reports will have a lot resting on them. It is vital that they receive the appropriate scrutiny when the time comes. For example, the regulations must ensure that referrals to the National Crime Agency made by companies are of a high quality, and that requirements are easy to comply with. Prioritising the highest risk cases will be important, particularly where there is an immediate threat to the safety and welfare of a child.
Clause 60 sets out that the Secretary of State’s regulations must include
“provision about cases of particular urgency”.
Does the Minister have an idea what that will look like? What plans are in place to ensure that law enforcement can prioritise the highest risk and harm cases?
Under the new arrangements, the National Crime Agency as the designated body, the Internet Watch Foundation as the appropriate authority for notice and takedown in the UK, and Ofcom as the regulator for online harms will all hold a vast amount of information on the scale of the threat posed by child sexual exploitation and illegal content. How will the introduction of mandatory reporting assist those three organisations in improving their understanding of how harm manifests online? How does the Minister envisage the organisations working together to share information to better protect children online?
I am glad that clause 60 will be in the Bill and that there will be a duty to report to the NCA. On subsection (3), though, I would like the Minister to clarify that if the Secretary of State believes that the Scottish Ministers would be appropriate people to consult, they would consult them, and the same for the Northern Ireland Executive.
I would appreciate the Minister explaining how clause 61 will work in a Scottish context, because that clause refers to the Crime and Courts Act 2013. Does a discussion need to be had with Scottish Ministers, and perhaps Northern Ireland Ministers as well, to ensure that information sharing with devolved areas that have their own legal systems takes place seamlessly, to the same level as within England and Wales? If the Minister does not have an answer today, which I understand that he may not in detail, I am happy to hear from him later; I understand that it is quite a technical question.
The hon. Member for Worsley and Eccles South asks about the prioritisation of reports made to the NCA under the new statutory provisions. The prioritisation of investigations is an operational matter for the NCA, acting as a law enforcement body. I do not think it would be right either for me as a Minister or for Parliament as a legislative body to specify how the NCA should conduct its operational activities. I imagine that it would pursue the most serious cases as a matter of priority, and if there is evidence of any systemic abuse it would also prioritise that, but it really is a matter for the NCA, as an operationally independent police force, to decide for itself. I think it is fairly clear that the scope of matters to be contained in these regulations is comprehensive, as one would expect.
On the questions raised by the hon. Member for Aberdeen North, the Secretary of State might consult Scottish Ministers under clause 63(6)(c), particularly those with responsibility for law enforcement in Scotland, and the same would apply to other jurisdictions. On whether an amendment is required to cover any matters to do with the procedures in Scotland equivalent to the matter covered in clause 61, we do not believe that any equivalent change is required to devolved Administration law. However, in order to be absolutely sure, we will get the hon. Lady written confirmation on that point.
I am not sure that the Minister has answered my question on clause 60. I think we all agree that law enforcement agencies can decide their own priorities, quite rightly, but clause 60(2)(d) sets out that the Secretary of State’s regulations must include
“provision about cases of particular urgency”.
I asked the Minister what that would look like.
Also, we think it is pretty important that the National Crime Agency, the Internet Watch Foundation and Ofcom work together on mandatory reporting. I asked him how he envisaged them working together to share information, because the better they do that, the more children are protected.
I apologise for missing those two points. On working together, the hon. Lady is right that agencies such as the Internet Watch Foundation and others should co-operate closely. There is already very good joint working between the Internet Watch Foundation, law enforcement and others—they seem to be well networked together and co-operating closely. It is appropriate to put on the record that Parliament, through this Committee, thinks that co-operation should continue. That communication and the sharing of information on particular images are obviously critical.
As the clause states, the regulations can set out expedited timeframes in cases of particular urgency. I understand that to mean cases where there might be an immediate risk to a child’s safety, or where somebody might be at risk in real time, as opposed to something historic—for example, an image that might have been made some time ago. In cases where it is believed abuse is happening at the present time, there is an expectation that the matter will be dealt with immediately or very close to immediately. I hope that answers the shadow Minister’s questions.
Question put and agreed to.
Clause 60 accordingly ordered to stand part of the Bill.
Clause 61 ordered to stand part of the Bill.
Clause 62
Offence in relation to CSEA reporting
Clause 63 sets out that the CSEA content required to be reported must have been published, generated, uploaded or shared either in the UK or by a UK citizen, or must include a child in the UK. Subsection (6) requires services to provide evidence of such a link to the UK, which might be quite difficult in some circumstances. I would appreciate the Minister outlining what guidance and support will be made available to regulated services to ensure that they can fulfil their obligations to evidence such a link.
Takeovers, mergers and acquisitions are commonplace in the technology industry, and many companies are bought out by others based overseas, particularly in the United States. Once a regulated service has been bought out by a company based abroad, what plans are in place to ensure either that the company continues to report to the National Crime Agency or that it is enabled to transition to another mandatory reporting structure, as may be required in another country in the future? That is particularly relevant as we know that the European Union is seeking to introduce mandatory reporting requirements in the coming years.
If the child had been in the UK when the offence was committed, that would ordinarily be subject to UK criminal law, because the crime would have been committed in the UK. The test is: where was the child or victim at the time the offence was committed? As I said a moment ago, however, the definition of “UK-linked” is particularly wide and includes
“the place where the content was published, generated, uploaded or shared.”
The word “generated”—I am reading from clause 63(6)(a), at the top of page 56—is clearly in the past tense and would include the circumstance that the hon. Lady described.
What the Minister has said is helpful, but the question I asked was about what guidance and support will be made available to regulated services. We all want this to work, because it is one of the most important aspects of the Bill—many aspects are important. He made it clear to us that the definition is quite wide, for both the general definitions and the “UK-linked” content. The point of the question was, given the possible difficulties in some circumstances, what guidance and support will be made available?
I anticipate that the National Crime Agency will issue best practice guidance. A fair amount of information about the requirements will also be set out in the regulations that the Secretary of State will issue under clause 60, which we have already debated. So it is a combination of those regulations and National Crime Agency best practice guidance. I hope that answers the question.
Finally, on companies being taken over, if a company ceases to be UK-linked, we would expect it to continue to discharge its reporting duties, which might include reporting not just in the UK but to its domestic reporting agency—we have already heard the US agency described and referenced.
I hope that my answers demonstrate that the clause is intended to be comprehensive and effective. It should ensure that the National Crime Agency gets all the information it needs to investigate and prosecute CSEA in order to keep our children safe.
Question put and agreed to.
Clause 62, as amended, accordingly ordered to stand part of the Bill.
Clause 63 ordered to stand part of the Bill.
Clause 64
Transparency reports about certain Part 3 services
I beg to move amendment 54, in clause 64, page 56, line 29, leave out “Once” and insert “Twice”.
This amendment would change the requirement for transparency report notices from once a year to twice a year.
With this it will be convenient to discuss the following:
Clause stand part.
Amendment 55, in schedule 8, page 188, line 42, at end insert—
“31A The notice under section 64(1) must require the provider to provide the following information about the service—
(a) the languages in which the service has safety systems or classifiers;
(b) details of how human moderators employed or engaged by the provider are trained and supported;
(c) the process by which the provider takes decisions about the design of the service;
(d) any other information that OFCOM considers relevant to ensuring the safe operation of the service.”
This amendment sets out details of information Ofcom must request be provided in a transparency report.
That schedule 8 be the Eighth schedule to the Bill.
Clause 65 stand part.
The duties on regulated services set out in the clause are welcome. Transparency reports will be a vital tool in holding platforms to account and in understanding the true drivers of online harm. However, asking platforms to submit transparency reports only once a year does not reflect how rapidly the online world changes. As we have seen time and again, the online environment can shift significantly in a matter of months, if not weeks. We have seen that in the rise of disinformation about covid, which we have talked about, and in the accelerated growth of platforms such as TikTok.
Increasing the frequency of transparency reports from once a year to twice a year will ensure that platforms keep pace with emergent risks, allowing Ofcom to do the same in turn. The amendment would also mean that companies focus on safety, rather than just profit. As has been touched on repeatedly, that is the culture change that we want to bring about. It would go some way towards preventing complacency about reporting harms, perhaps forcing companies to revisit the nature of harm analysis, management and reduction. In order for this regime to be world-leading and ambitious—I keep hearing the Minister using those words about the Bill—we must demand the most that we can from the highest-risk services, including on the important duty of transparency reporting.
Moving to clauses 64 and 65 stand part, transparency reporting by companies and Ofcom is important for analysing emerging harms, as we have discussed. However, charities have pointed out that platforms have a track record of burying documents and research that point to risk of harm in their systems and processes. As with other risk assessments and reports, such documents should be made public, so that platforms cannot continue to hide behind a veil of secrecy. As I will come to when I speak to amendment 55, the Bill must be ambitious and bold in what information platforms are to provide as part of the clause 64 duty.
Clause 64(3) states that, once issued with a notice by Ofcom, companies will have to produce a transparency report, which must
“be published in the manner and by the date specified in the notice.”
Can the Minister confirm that that means regulated services will have to publish transparency reports publicly, not just provide them to Ofcom? Can he clarify that that will be done in a way that is accessible to users, in line with the requirements on services to make their terms of service and other statements clear and accessible? Those reports will include information that is critical for researchers and civil society when analysing trends and harms. It is important that the data points outlined in schedule 8 capture the information that those organisations need to make an accurate analysis.
The evidence we heard from Frances Haugen set out how important transparency is. If internet and service providers have nothing to hide, transparency is surely in their interests as well. From my perspective, there is little incentive for the Government not to support the amendment, if they want to help civil society, researchers, academics and so on in improving a more regulated approach to transparency generally on the internet, which I am sure we all agree is a good thing.
I very much agree. We cannot emphasise that enough, and it is useful that my hon. Friend has set that out, adding to what I was saying.
Amendment 55 sets out, in new paragraph 31A, the details of the information that Ofcom must request be provided in a transparency report. First, transparency disclosures required by the Bill should include how large companies allocate resources to tackling harm in different languages—an issue that was rightly raised by the hon. Member for Ochil and South Perthshire. As we heard from Frances Haugen, many safety systems at Meta have only a subset of detection systems for languages other than English. Languages such as Welsh have almost no safety systems live on Facebook. That is neither fair nor safe.
When we consider that more than 250 languages are spoken in London alone, the inconsistency of safety systems becomes very concerning. Charities have warned that people accessing Facebook in different languages are being exposed to very different levels of risk, with some versions of Facebook having few or none of the safety systems that protect other versions of the site in different languages.
When giving evidence to the Committee last month, Richard Earley disclosed that Meta moderates content in only 70 languages. Given that around 3 billion people use Facebook on a monthly basis across the world, that is clearly inadequate.
One of the things we found on the Joint Committee last year was the consistent message that we should not need to put this Bill in place. I want to put on the record my continued frustration that Meta and the other social media platforms are requiring us to put this Bill in place because they are not doing the monitoring, engaging in that way or putting users first. I hope that the process of going through the Bill has helped them to see the need for more monitoring. It is disappointing that we have had to get to this point. The UK Government are having to lead the world by putting this Bill in place—it should not be necessary. I hope that the companies do not simply follow what we are putting forward, but go much further and see that it is imperative to change the way they work and support their users around the world.
I thank the hon. Gentleman and I agree. It is a constant frustration that we need this Bill. We do need it, though. In fact, amendment 55 would really assist with that, by requiring those services to go further in transparency reporting and to disclose
“the languages in which the service has safety systems or classifiers”.
We need to see what they are doing on this issue. It is an easily reported piece of information that will have an outsized impact on safety, even for English speakers. It will help linguistic groups in the multilingual UK and around the world.
Reporting on language would not be a big burden on companies. In her oral evidence, Frances Haugen told the Committee that large platforms can trivially produce this additional data merely by changing a single line of code when they do their transparency reports. We must not become wrapped up in the comfort of the language we all speak and ignore the gaping loophole left for other languages, which allows harms to slip through.
To start with, it is worth saying that clause 64 is extremely important. In the course of debating earlier clauses, Opposition Members rightly and repeatedly emphasised how important it is that social media platforms are compelled to publish information. The testimony that Frances Haugen gave to the Joint Committee and to this Committee a few weeks ago demonstrates how important that is. Social media platforms are secretive and are not open. They seek to disguise what is going on, even though what they are doing has a global effect. So the transparency power in clause 64 is a critical part of the Bill; it will dramatically transform the insight available to parliamentarians, the wider public, civil society campaigners and academics, and open up what is going on inside these companies. It is extremely important indeed.
Amendment 54 seeks to increase the frequency of transparency reporting from once a year to twice a year. To be honest, we do not want to do this unreasonably frequently, and our sense is that once a year, rather than twice a year, is the right regularity. We therefore do not support the amendment. However, Members will notice that there is an ability in clause 64(12) for the Secretary of State, by regulation, to
“amend subsection (1) so as to change the frequency of the transparency reporting process.”
If it turns out in due course that once a year is not enough and we would like to do it more frequently—for example, twice a year—there is the power for those regulations to be used so that the reporting occurs more frequently. The frequency is not set in stone.
I turn to amendment 55, which sets out a number of topics that would be included in reporting. It is important to say that, as a quick glance at schedule 8 shows, the remit of the reports is already extremely wide in scope. Hon. Members will see that paragraph 5 specifies that reports can cover
“systems and processes for users to report content which they consider to be illegal”
or “harmful”, and so on. Paragraph 6 mentions:
“The systems and processes that a provider operates to deal with illegal content, content that is harmful to children”,
and so on. Therefore, the topics that amendment 55 speaks to are already covered by the schedule, and I would expect such things to be reported on. We have given Ofcom the explicit powers to do that and, rather than prescribe such details in the Bill, we should let Ofcom do its job. It certainly has the powers to do such things—that is clearly set out in the schedule—and I would expect, and obviously the Opposition would expect, that it will do so. On that basis, I will gently resist amendments 54 and 55.
On amendment 55, I want to come back to the Minister on two points made by the hon. Member for Aberdeen North. I think most people would be shocked to discover that safety systems do not operate in every language, so people speaking a language other than English may not be protected. I also think that people will be shocked by what I outlined about the employment of moderators and how badly they are paid and trained. There are factories full of people doing that important task.

I recommend that the Minister thinks again about requiring Ofcom to obtain details of how human moderators employed or engaged by providers are trained and supported. It is a bit like when we find out about factories producing various items under appalling conditions in other parts of the world—we need transparency on these issues to make people do something about it. These platforms will not do anything about it themselves. Under questioning from my hon. Friend the Member for Pontypridd, Richard Earley admitted that he had no idea how many human moderators were working for Facebook. That is appalling, and we must do something about it.
I obviously have sympathy with the objectives, but the topics covered in schedule 8, which include the systems and processes for responding to illegal and harmful content and so on, give Ofcom the power to do what the hon. Member requires. On the language point, the risk assessments that companies are required to do are hard-edged duties in the Bill, and they will have to include an assessment of languages used in the UK, which is a large number of languages—obviously, it does not include languages spoken outside the UK. So the duty to risk-assess languages already exists. I hope that gives the hon. Member reassurance. She is making a reasonable point, and I would expect that, in setting out transparency requirements, Ofcom will address it. I am sure that it will look at our proceedings to hear Parliament’s expectations, and we are giving it those powers, which are clearly set out in schedule 8.
I will just make a final point. The Bill gives Ofcom powers when it already has so much to do; we keep returning to the point of how much will ride on Ofcom’s decisions. Our amendments would make clear the requirement for transparency reporting on the language issue, as well as on the employment of human moderators and how they are trained and supported. Ofcom really has enough other things to be doing, so unless we point these matters out, they may not be addressed. As with so many of our amendments, we are simply asking for these points to be drawn out specifically so that they happen.
Question put, That the amendment be made.