(2 years, 5 months ago)
Public Bill Committees
I agree with the hon. Member wholeheartedly. It should be Parliament that is assessing the effectiveness of the Bill. The Committee has discussed many times how groundbreaking the Bill could be, how difficult it has been to regulate the internet for the first time, the many challenges encountered, the relationship between platforms and regulator and how other countries will be looking at the legislation as a guide for their own regulations. Once this legislation is in place, the only way we can judge how well it is tackling harm in the UK is with clear public reports detailing information on what harms have been prevented, who has intervened to remove that harm, and what role the regulator—in this case Ofcom—has had in protecting us online.
New clause 25 will place a number of important obligations on Ofcom to provide us with that crucial information. First, Ofcom will report annually to Parliament on the overall effectiveness of the Act. That report will allow Ofcom to explore fully where the Act is working, where it could be tightened and where we have left gaps. Throughout the Bill we are heaping considerable responsibility on to Ofcom, and it is only right that Ofcom is able to feed back publicly and state clearly where its powers allow it to act, and where it is constrained and in need of assistance.
Secondly, new clause 25 will compel Ofcom to monitor, collate and publish figures relating to the number of harms removed by category 1 services, which is an important indicator for us to know the scale of the issue and that the Act is working.
Thirdly, we need to know how often Ofcom is intervening, compared with how often the platforms themselves are acting. That crucial figure will allow us to assess the balance of regulation, which assists not only us in the UK but countries looking at the legislation as a guide for their own regulation.
Finally, Ofcom will detail the harms removed by type to identify any areas where the Act may be falling short, and where further attention may be needed.
I hope the Committee understands why this information is absolutely invaluable, when we have previously discussed our concerns that this groundbreaking legislation will need constant monitoring. I hope it will also understand why the information needs to be transparent in order to instil trust in the online space, to show the zero-tolerance approach to online harms, and to show countries across the globe that the online space can be effectively regulated to protect citizens online. Only Parliament, as the legislature, can be an effective monitor of that information. I hope I can count on the Government’s support for new clause 25.
I speak in support of new clause 25. As my hon. Friend has argued, transparency is critical to the Bill. It is too risky to leave information and data about online harms unpublished. That is why we have tabled several amendments to the Bill to increase reporting, both to the regulator and publicly.
New clause 25 is an important addition that would offer an overview of the effectiveness of the Bill and act as a warning bell for any unaddressed historical or emerging harms. Not only would such a report benefit legislators, but the indicators included in the report would be helpful for both Ofcom and user advocacy groups. We cannot continue to attempt to regulate the internet blind. We must have the necessary data and analysis to be sure that the provisions in the Bill are as effective as they can be. I hope the Minister can support this new clause.
The idea that a report on Ofcom’s activities be delivered to Parliament so that it can be considered is an excellent one. In fact, it is such an excellent idea that it has been set out in statute since 2002: the Office of Communications Act 2002 already requires Ofcom to provide a report to the Secretary of State on the carrying out of all of its functions, which will include the new duties we are giving Ofcom under the Bill. The Secretary of State must then lay that report before each House of Parliament. That is a well-established procedure for Ofcom and for other regulatory bodies. It ensures the accountability of Ofcom to the Department and to Parliament.
I was being slightly facetious there, because the hon. Member for Batley and Spen is quite right to raise the issue. However, the duty she is seeking to create via new clause 25 is already covered by the duties in the Office of Communications Act. The reports that Ofcom publishes under that duty will cover its new duties under the Bill. Having made that clear, I trust that new clause 25 can be withdrawn.
(2 years, 6 months ago)
I beg to move, That the clause be read a Second time.
New clause 3 would make provision for a statutory user advocacy body representing the interests of children. It would also allow the Secretary of State to appoint a new or existing body as the statutory user advocate. A strong, authoritative and well-resourced voice that can speak for children in regulatory debates would ensure that complex safeguarding issues are well understood, and would also actively inform the regulator’s decisions.
Charities have highlighted that the complaints and reporting mechanisms in the Bill may not always be appropriate for children. Ofcom’s own evidence shows that only 14% of 12 to 15-year-old children have ever reported content. Children who are most at risk of online harms may find it incredibly challenging to complete a multi-stage reporting and complaints process. Dame Rachel de Souza told the Committee:
“I worry that the Bill does not do enough to respond to individual cases of abuse and that it needs to do more to understand issues and concerns directly from children. Children should not have to exhaust the platforms’ ineffective complaints routes, which can take days, weeks or even months. I have just conducted a survey of 2,000 children and asked them about their experiences in the past month. Of those 2,000 children, 50% had seen harmful content and 40% had tried to get content about themselves removed and had not succeeded. For me, there is something really important about listening to children and taking their complaints into account.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 16, Q22.]
A children’s advocacy body would be able to support children in navigating redress mechanisms that are fundamentally targeted at adults. Given how many children now use the internet, that is an essential element that is missing from the Bill. That is why the super-complaints mechanism needs to be strengthened with specific arrangements for children, as advocated by the National Society for the Prevention of Cruelty to Children and other children’s organisations. A statutory user advocacy body could support the regulator, as well as supporting child users. It would actively promote the interests of children in regulatory decision making and offer support by ensuring that an understanding of children’s behaviour and safeguarding is front and centre in its approach.
My hon. Friend is making a really valid point. As I look around the room—I mean this with no disrespect to anybody—I see that we are all of an age at which we do not understand the internet in the same way that children and young people do. Surely, one of the key purposes of the Bill is to make sure that children and young people are protected from harms online, and as the Children’s Commissioner said in her evidence, their voices have to be heard. I am sure that, like me, many Members present attend schools as part of their weekly constituency visits, and the conversations we have with young people are some of the most empowering and important parts of this job. We have to make sure that the voices of the young people who we all represent are heard in this important piece of legislation, and it is really important that we have an advocacy body to ensure that.
I very much agree with my hon. Friend. She is quite right: we have to remember that we do not see these things as children and young people do.
The user advocacy body that my hon. Friend has just spoken in support of could also shine a light on the practices that are most harmful to children by using data, evidence and specialist expertise to point to new and emerging areas of harm. That would enable the regulator to ensure its risk profiles and regulatory approach remain valid and up to date. In his evidence, Andy Burrows of the NSPCC highlighted the importance of an advocacy body acting as an early warning system:
“Given the very welcome systemic approach of the regime, that early warning function is particularly important, because there is the potential that if harms cannot be identified quickly, we will see a lag where whole regulatory cycles are missed. User advocacy can help to plug that gap, meaning that harms are identified at an earlier stage, and then the positive design of the process, with the risk profiles and company risk assessments, means that those harms can be built into that particular cycle.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 16, Q22.]
The provision in the new clause is comparable to those that already exist in many other sectors. For example, Citizens Advice is the statutory user advocate for consumers of energy and the postal services, and there are similar arrangements representing users of public transport. Establishing a children’s user advocacy body would ensure that the most vulnerable online users of all—children at risk of online sexual abuse—receive equivalent protections to customers of post offices or passengers on a bus.
(2 years, 6 months ago)
The duties on regulated services set out in the clause are welcome. Transparency reports will be a vital tool to hold platforms to account for understanding the true drivers of online harm. However, asking platforms to submit transparency reports once a year does not reflect how rapidly we know the online world changes. As we have seen time and again, the online environment can shift significantly in a matter of months, if not weeks. We have seen that in the rise of disinformation about covid, which we have talked about, and in the accelerated growth of platforms such as TikTok.
Increasing the frequency of transparency reports from once a year to twice a year will ensure that platforms stay on the pulse of emergent risks, allowing Ofcom to do the same in turn. The amendment would also mean that companies focus on safety, rather than just profit. As has been touched on repeatedly, that is the culture change that we want to bring about. It would go some way towards preventing complacency about reporting harms, perhaps forcing companies to revisit the nature of harm analysis, management and reduction. In order for this regime to be world-leading and ambitious—I keep hearing the Minister using those words about the Bill—we must demand the most that we can from the highest-risk services, including on the important duty of transparency reporting.
Moving to clauses 64 and 65 stand part, transparency reporting by companies and Ofcom is important for analysing emerging harms, as we have discussed. However, charities have pointed out that platforms have a track record of burying documents and research that point to risk of harm in their systems and processes. As with other risk assessments and reports, such documents should be made public, so that platforms cannot continue to hide behind a veil of secrecy. As I will come to when I speak to amendment 55, the Bill must be ambitious and bold in what information platforms are to provide as part of the clause 64 duty.
Clause 64(3) states that, once issued with a notice by Ofcom, companies will have to produce a transparency report, which must
“be published in the manner and by the date specified in the notice.”
Can the Minister confirm that that means regulated services will have to publish transparency reports publicly, not just to Ofcom? Can he clarify that that will be done in a way that is accessible to users, similarly to the requirements on services to make their terms of service and other statements clear and accessible? Some very important information will be included in those reports that will be critical for researchers and civil society when analysing trends and harms. It is important that the data points outlined in schedule 8 capture the information needed for those organisations to make an accurate analysis.
The evidence we heard from Frances Haugen set out how important transparency is. If internet and service providers have nothing to hide, transparency is surely in their interests as well. From my perspective, there is little incentive for the Government not to support the amendment, if they want to help civil society, researchers, academics and so on in improving a more regulated approach to transparency generally on the internet, which I am sure we all agree is a good thing.
I very much agree. We cannot emphasise that enough, and it is useful that my hon. Friend has set that out, adding to what I was saying.
Amendment 55 sets out the details of the information that Ofcom must request to be provided in a transparency report in new paragraph 31A. First, transparency disclosures required by the Bill should include how large companies allocate resources to tackling harm in different languages—an issue that was rightly raised by the hon. Member for Ochil and South Perthshire. As we heard from Frances Haugen, many safety systems at Meta have only a subset of detection systems for languages other than English. Languages such as Welsh have almost no safety systems live on Facebook. That is neither fair nor safe.
When we consider that more than 250 languages are spoken in London alone, the inconsistency of safety systems becomes very concerning. Charities have warned that people accessing Facebook in different languages are being exposed to very different levels of risk, with some versions of Facebook having few or none of the safety systems that protect other versions of the site in different languages.
When giving evidence to the Committee last month, Richard Earley disclosed that Meta moderated content in only 70 languages. Given that around 3 billion people use Facebook on a monthly basis across the world, that is clearly inadequate.
(2 years, 6 months ago)
That is fine.
Professor Clare McGlynn: I know that there was a discussion this morning about age assurance, which obviously targets children’s access to pornography. I would emphasise that age assurance is not a panacea for the problems with pornography. We are so worried about age assurance only because of the content that is available online. The pornography industry is quite happy with age verification measures. It is a win-win for them: they get public credibility by saying they will adopt it; they can monetise it, because they are going to get more data—especially if they are encouraged to develop age verification measures, which of course they have been; that really is putting the fox in charge of the henhouse—and they know that it will be easily evaded.
One of the most recent surveys of young people in the UK was of 16 and 17-year-olds: 50% of them had used a VPN, which avoids age verification controls, and 25% more knew about that, so 75% of those older children knew how to evade age assurance. This is why the companies are quite happy—they are going to make money. It will stop some people stumbling across it, but it will not stop most older children accessing pornography. We need to focus on the content, and when we do that, we have to go beyond age assurance.
You have just heard Google talking about how it takes safety very seriously. Rape porn and incest porn are one click away on Google. They are freely and easily accessible. There are swathes of that material on Google. Twitter is hiding in plain sight, too. I know that you had a discussion about Twitter this morning. I, like many, thought, “Yes, I know there is porn on Twitter,” but I must confess that until doing some prep over the last few weeks, I did not know the nature of that porn. For example, “Kidnapped in the wood”; “Daddy’s little girl comes home from school; let’s now cheer her up”; “Raped behind the bin”—this is the material that is on Twitter. We know there is a problem with Pornhub, but this is what is on Twitter as well.
As the Minister mentioned this morning, Twitter says you have to be 13, and you have to be 18 to try to access much of this content, but you just put in whatever date of birth is necessary—it is that easy—and you can get all this material. It is freely and easily accessible. Those companies are hiding in plain sight in that sense. The age verification and age assurance provisions, and the safety duties, need to be toughened up.
To an extent, I think this will come down to the regulator. Is the regulator going to accept Google’s SafeSearch as satisfying the safety duties? I am not convinced, because of the easy accessibility of the rape and incest porn I have just talked about. I emphasise that incest porn is not classed as extreme pornography, so it is not a priority offence, but there are swathes of that material on Pornhub as well. In one of the studies that I did, we found that one in eight titles on the mainstream pornography sites described sexually violent material, and the incest material was the highest category in that. There is a lot of that around.
Q
Professor Clare McGlynn: In many ways, it is going to be up to the regulator. Is the regulator going to deem that things such as SafeSearch, or Twitter’s current rules about sensitive information—which rely on the host to identify their material as sensitive—satisfy their obligations to minimise and mitigate the risk? That is, in essence, what it will all come down to.
Are they going to take the terms and conditions of Twitter, for example, at face value? Twitter’s terms and conditions do say that they do not want sexually violent material on there, and they even say that it is because they know it glorifies violence against women and girls, but this material is there and does not appear to get swiftly and easily taken down. Even when you try to block it—I tried to block some cartoon child sexual abuse images, which are easily available on there; you do not have to search for them very hard, it literally comes up when you search for porn—it brings you up five or six other options in case you want to report them as well, so you are viewing them as well. Just on the cartoon child sexual abuse images, before anyone asks, they are very clever, because they are just under the radar of what is actually a prohibited offence.
It is not necessarily that there is more that the Bill itself could do, although the code of practice would ensure that they have to think about these things more. They have to report on their transparency and their risk assessments: for example, what type of content are they taking down? Who is making the reports, and how many are they upholding? But it is then on the regulator as to what they are going to accept as acceptable, frankly.