Online Safety Bill (Eighth sitting) Debate

Committee stage
Thursday 9th June 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 9 June 2022
Alex Davies-Jones

On clause 37, it is welcome that Ofcom will have to prepare and issue a code of practice for service providers with duties relating to illegal content in the form of terrorism or child sexual exploitation and abuse content. The introduction of compliance measures relating to fraudulent advertising is also very welcome. We do, however, have some important areas to amend, including the role of different expert groups in assisting Ofcom during its consultation process, which I have already outlined in relation to animal cruelty.

On clause 38, Labour supports the notion that Ofcom must have specific principles to adhere to when preparing the codes of practice, and of course, the Secretary of State must have oversight of those. However, as I will touch on as we proceed, Labour feels that far too much power is given to the Secretary of State of the day in establishing those codes.

Labour believes that schedule 4 is overwhelmingly loose in its language, and we have concerns about the ability of Ofcom—try as it might—to ensure that its codes of practice are both meaningful to service providers and in compliance with the Bill’s legislative requirements. Let me highlight the schedule’s broadness by quoting from it. Paragraph 4 states:

“The online safety objectives for regulated user-to-user services are as follows”.

I will move straight to paragraph 4(a)(iv), which says:

“there are adequate systems and processes to support United Kingdom users”.

Forgive me if I am missing something here, but surely an assessment of adequacy is too subjective for these important codes of practice. Moreover, the Bill seems to have failed to consider the wide-ranging differences that exist among so-called United Kingdom users. Once again, there is no reference to future-proofing against emerging technologies. I hope that the Minister will therefore elaborate on how he sees the codes of practice and their principles, objectives and content as fit for purpose. More broadly, it is remarkable that schedule 4 is both too broad in its definitions and too limiting in some areas—we might call it a Goldilocks schedule.

I turn to new clause 20. As we have discussed, a significant majority of online child abuse takes place in private messages. Research from the NSPCC shows that 12 million of the 18.4 million child sexual abuse reports made by Facebook in 2019 related to content shared on private channels. Recent data from the Office for National Statistics shows that private messaging plays a central role in contact between children and people whom they have not met offline before. When children are contacted by someone they do not know, in nearly three quarters of cases that takes place by private message.

Schedule 4 introduces new restrictions on Ofcom’s ability to require a company to use proactive technology to identify or disrupt abuse in private messaging. That will likely restrict Ofcom’s ability to include in codes of practice widely used industry-standard tools such as PhotoDNA and CSAI Match, which detect known child abuse images, and artificial intelligence classifiers to detect self-generated images and grooming behaviour. That raises significant questions about whether the regulator can realistically produce codes of practice that respond to the nature and extent of the child abuse threat.

As it stands, the Bill will leave Ofcom unable to require companies to proactively use technology that can detect child abuse. Instead, Ofcom will be wholly reliant on the use of CSEA warning notices under clause 103, which will enable it to require the use of proactive technologies only where there is evidence that child abuse is already prevalent—in other words, where significant online harm has already occurred. That will necessitate the use of a laborious and resource-intensive process, with Ofcom having to build the evidence to issue CSEA warning notices company by company.

Those restrictions will mean that the Bill will be far less demanding than comparable international legislation in respect of the requirement on companies to proactively detect and remove online child abuse. So much for the Bill being world leading. For example, the EU child abuse legislative proposal published in May sets out clear and unambiguous requirements on companies to proactively scan for child abuse images and grooming behaviour on private messages.

If the regulator is unable to tackle online grooming sufficiently proactively, the impact will be disproportionately felt by girls. NSPCC data shows that an overwhelming majority of criminal offences target girls, with those aged 12 to 15 the most likely to be victims of online grooming. Girls were victims in 83% of offences where data was recorded. Labour recognises that once again there are difficulties between our fundamental right to privacy and the Bill’s intentions in keeping children safe. This probing new clause is designed to give the Government an opportunity to report on the effectiveness of their proposed approach.

Ultimately, the levels of grooming taking place on private messaging platforms are incredibly serious. I have two important testimonies that are worth placing on the record, both of which have been made anonymous to protect the victims but share the same sentiment. The first is from a girl aged 15. She said:

“I’m in a serious situation that I want to get out of. I’ve been chatting with this guy online who’s like twice my age. This all started on Instagram but lately all our chats have been on WhatsApp. He seemed really nice to begin with, but then he started making me do these things to prove my trust to him, like doing video chats with my chest exposed.”

The second is from a boy aged 17. He said:

“I’ve got a fitness page on Instagram to document my progress but I get a lot of direct messages from weird people. One guy said he’d pay me a lot of money to do a private show for him. He now messages me almost every day asking for more explicit videos and I’m scared that if I don’t do what he says, then he will leak the footage and my life would be ruined”.

Those testimonies go to show how fundamentally important it is for an early assessment to be made of the effectiveness of the Government’s approach following the Bill gaining Royal Assent.

We all have concerns about the use of proactive technology in private messaging and its potential impact on personal privacy. End-to-end encryption offers both risks and benefits to the online environment, but the main concern is based on risk profiles. End-to-end encryption is particularly problematic on social networks because it is embedded in the broader functionality of the service, so all text, DMs, images and live chats could be encrypted. Consequently, its impact on detecting child abuse becomes even greater. There is an even greater risk with Meta threatening to bring in end-to-end encryption for all its services. If platforms cannot demonstrate that they can mitigate those risks to ensure a satisfactory risk profile, they should not be able to proceed with end-to-end encryption until satisfactory measures and mitigations are in place.

Tech companies have made significant efforts to frame this issue in the false binary that any legislation that impacts private messaging will damage end-to-end encryption and will mean that encryption will not work or is broken. That argument is completely false. A variety of novel technologies are emerging that could allow for continued CSAM scanning in encrypted environments while retaining the privacy benefits afforded by end-to-end encryption.

Apple, for example, has developed its NeuralHash technology, which allows for on-device scans for CSAM before a message is sent and encrypted. That client-side implementation, rather than server-side scanning, means that Apple does not learn anything about images that do not match the known CSAM database. Apple’s servers flag accounts that exceed a threshold number of images that match a known database of CSAM image hashes, so that Apple can provide relevant information to the National Centre for Missing and Exploited Children. That process is secure and expressly designed to preserve user privacy.
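The threshold mechanism described above can be sketched as follows. This is a deliberately simplified, hypothetical illustration, not Apple's actual implementation: the hash values, the database and the threshold are all invented, and a real system matches perceptual hashes cryptographically rather than by set lookup.

```python
# Hypothetical sketch of client-side hash matching with a reporting
# threshold, loosely modelled on the scheme described above.
# The hashes, database and threshold are invented for illustration.

KNOWN_CSAM_HASHES = {"a3f1", "9b2c", "77e0"}  # stand-in for a real hash database
REPORT_THRESHOLD = 2  # an account is flagged only beyond this many matches

def scan_outgoing_images(image_hashes: list[str]) -> bool:
    """Return True if the account exceeds the match threshold.

    In the scheme described above, only the fact that the match count
    has crossed the threshold becomes visible to the server; images
    that do not match the database reveal nothing.
    """
    matches = sum(1 for h in image_hashes if h in KNOWN_CSAM_HASHES)
    return matches > REPORT_THRESHOLD

# Two matches do not exceed a threshold of 2; three matches do.
print(scan_outgoing_images(["a3f1", "9b2c", "0000"]))          # False
print(scan_outgoing_images(["a3f1", "9b2c", "77e0", "0000"]))  # True
```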

Homomorphic encryption technology can perform image hashing on encrypted data without the need to decrypt the data. No identifying information can be extracted and no details about the encrypted image are revealed, but calculations can be performed on the encrypted data. Experts in hash scanning—including Professor Hany Farid of the University of California, Berkeley, who developed PhotoDNA—insist that scanning in end-to-end encrypted environments without damaging privacy will be possible if companies commit to providing the engineering resources to work on it.
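The matching operation that such schemes evaluate can be sketched in plaintext; under homomorphic encryption the same comparison would be computed on encrypted hashes without decrypting them. This is a hypothetical illustration only: the hash values and distance threshold are invented, and real perceptual hashes such as PhotoDNA's are far larger than the toy 8-bit values here.

```python
# Hypothetical illustration of perceptual-hash matching: the plaintext
# comparison that a homomorphic scheme would evaluate without decryption.

def hamming_distance(h1: int, h2: int) -> int:
    """Number of differing bits between two perceptual hashes."""
    return bin(h1 ^ h2).count("1")

def is_match(h1: int, h2: int, max_distance: int = 5) -> bool:
    # Perceptual hashes of near-identical images differ in only a few
    # bits, so matching tolerates a small Hamming distance rather than
    # requiring exact equality.
    return hamming_distance(h1, h2) <= max_distance

print(is_match(0b10110010, 0b10110011))  # True: distance 1
print(is_match(0b10110010, 0b01001101))  # False: distance 8
```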

To move beyond the argument that requiring proactive scanning for CSAM means breaking or damaging end-to-end encryption, amendments to the Bill could provide a powerful incentive for companies to invest in technology and engineering resources that will allow them to continue scanning while pressing ahead with end-to-end encryption, so that privacy is preserved but appropriate resources for and responses to online child sexual abuse can continue. It is highly unlikely that some companies will do that unless they have the explicit incentive to do so. Regulation can provide such an incentive, and I urge the Minister to make it possible.

Mrs Maria Miller (Basingstoke) (Con)

It is a pleasure to follow the shadow Minister, who made some important points. I will focus on clause 37 stand part. I pay tribute to the Minister for his incredible work on the Bill, with which he clearly wants to stop harm occurring in the first place. We had a great debate on the matter of victim support. The Bill requires Ofcom to produce a number of codes of practice to help to achieve that important aim.

Clause 37 is clear: it requires codes of practice on illegal content and fraudulent advertising, as well as compliance with “the relevant duties”, and it is on that point that I hope the Minister can help me. Those codes will help Ofcom to take action when platforms do things that they should not, and will, I hope, provide a way for platforms to comply in the first place rather than falling foul of the rules.

How will the codes help platforms that are harbouring material or configuring their services in a way that might be explicitly or inadvertently promoting violence against women and girls? The Minister knows that women are disproportionately the targets of online abuse on social media or other platforms. The impact, which worries me as much as I am sure it worries him, is that women and girls are told to remove themselves from social media as a way to protect themselves against extremely abusive or harassing material. My concern is that the lack of a specific code to tackle those important issues might inadvertently mean that Ofcom and the platforms overlook them.

Would a violence against women and girls code of practice help to ensure that social media platforms were monitored by Ofcom for their work to prevent tech-facilitated violence against women and girls? A number of organisations think that it would, as does the Domestic Abuse Commissioner herself. Those organisations have drafted a violence against women and girls code of practice, which has been developed by an eminent group of specialists—the End Violence Against Women Coalition, Glitch, Carnegie UK Trust, the NSPCC, 5Rights, and Professors Clare McGlynn and Lorna Woods, both of whom gave evidence to us. They believe it should be mandatory for Ofcom to adopt a violence against women and girls code to ensure that this issue is taken seriously and that action is taken to prevent the risks in the first place. Clause 37 talks about codes, but it is not specific on that point, so can the Minister help us? Like the rest of the Committee, he wants to prevent women from experiencing these appalling acts online, and a code of practice could help us deal with that better.

--- Later in debate ---
I hope that clarifies how the Bill operates. As I said, we are giving careful thought to finding ways—which I hope we can—to strengthen those powers in clause 103.
Dame Maria Miller

I think my hon. Friend’s list goes on to page 37, which means there would be a number of different relevant duties that would presumably then be subject to the ability to issue codes of practice. However, the point I was making in my earlier contribution is that this list does not include the issue of violence against women and girls. In looking at this exhaustive list that my hon. Friend has included in the Bill, I must ask whether he might inadvertently be excluding the opportunity for Ofcom to produce a code of practice on the issue of violence against women and girls. Having heard his earlier comments, I felt that he was slightly sympathetic to that idea.

Chris Philp

Clearly, and as Members have pointed out, women and girls suffer disproportionately from abuse online; unfortunately, tragically and disgracefully, they are disproportionately victims of such abuse. The duties in the Bill apply to everybody, men and women, but women will disproportionately benefit, because they are disproportionately victims.

Obviously, where there are things that are particular to women, such as particular kinds of abuse that women suffer that men do not, or particular kinds of abuse that girls suffer that boys do not, then we would expect the codes of practice to address those kinds of abuse, because the Bill states that they must keep children safe, in clause 37(10)(b), and adults safe, in clause 37(10)(c). Obviously, women are adults and we would expect those particular issues that my right hon. Friend mentioned to get picked up by those measures.

Dame Maria Miller

My hon. Friend is giving me a chink of light there, in that subsection (10)(c) could actively mean that a code of practice that specifically dealt with violence against women and girls would be admissible as a result of that particular point. I had not really thought of it in that way—am I thinking about it correctly?

Chris Philp

My right hon. Friend makes an interesting point. To avoid answering a complicated question off the cuff, perhaps I should write to her. However, I certainly see no prohibition in these words in the clause that would prevent Ofcom from writing a particular code of practice. I would interpret these words in that way, but I should probably come back to her in writing, just in case I am making a mistake.

As I say, I interpret those words as giving Ofcom the latitude, if it chose to do so, to have codes of practice that were specific. I would not see this clause as prescriptive, in the sense that if Ofcom wanted to produce a number of codes of practice under the heading of “adults”, it could do so. In fact, if we track back to clause 37(3), that says:

“OFCOM must prepare and issue one or more codes of practice”.

That would appear to admit the possibility that multiple codes of practice could be produced under each of the sub-headings, including in this case for adults and in the previous case for children. [Interruption.] I have also received some indication from officials that I was right in my assessment, so hopefully that is the confirmation that my right hon. Friend was looking for.

Question put and agreed to.

Clause 37 accordingly ordered to stand part of the Bill.

Clause 38 ordered to stand part of the Bill.

Schedule 4

Codes of practice under section 37: principles, objectives, content

Amendment proposed: 63, in schedule 4, page 176, line 29, at end insert “and

(x) there are adequate safeguards to monitor cruelty towards humans and animals;”.—(Alex Davies-Jones.)

This amendment would ensure that ensuring adequate safeguards to monitor cruelty towards humans and animals is one of the online safety objectives for user-to-user services.

Question put, That the amendment be made.

--- Later in debate ---
Kirsty Blackman

I have a quick question about timelines because I am slightly confused about the order in which everything will happen. It is unlikely that the Bill will have been through the full parliamentary process before the summer, yet Ofcom intends to publish information and guidance by the summer, even though some things, such as the codes of practice, will not come in until after the Bill has received Royal Assent. Will the Minister give a commitment that, whether or not the Bill has gone through the whole parliamentary process, Ofcom will be able to publish before the summer?

Will Ofcom be encouraged to publish everything, whether that is guidance, information on its website or the codes of practice, at the earliest point at which they are ready? That will mean that anyone who has to apply those codes of practice or those regulations—people who will have to work within those codes, for example, or charities or other organisations that might be able to make super-complaints—will have as much information as possible, as early as possible, and will be able to prepare to fully implement their work at the earliest possible time. They will need that information in order to be able to gear up to do that.

Dame Maria Miller

I have three short questions for the Minister about clause 40 and the Secretary of State’s powers of direction. Am I in order to cover that?

The Chair

We are not debating clause 40, Dame Maria, but we will come to it eventually.