The Chair

Maria Miller has indicated that she would like to ask a question, so if I may, I will bring her in.

Mrs Maria Miller (Basingstoke) (Con)

Not immediately—go on, please.

Alex Davies-Jones

Q Thank you, Chair, and thank you, Maria.

I am just trying to get to the intricacies of this, and of what would happen during the time that it would take for you to recategorise. This platform, which is disseminating harm to both children and adults, would be allowed to carry on while the recategorisation process is under way. There is no mechanism in the Bill to stop that from happening.

Richard Wronka: A really important point here is that we will be regulating that platform from the outset for illegal content and, potentially, for how it protects children on its platform, irrespective of the categorisation approach. That is really important. We will be able to take action, and take action quickly, irrespective of how the platform is categorised. Categorisation really determines whether the adult “legal but harmful” provisions apply. That is the bit that really matters in this context.

It is worth reminding ourselves what those provisions mean: they are more a transparency and accountability measure. Platforms categorised as category 1 will need to have clear terms and conditions applied to adult “legal but harmful” content, and they will need to implement those consistently. We would expect the really serious and egregious concerns to be picked up by the “illegal” part of the regime and the protection-of-children part of the regime. The categorisation process may take a little time, but we will have tools to act in those situations.

Alex Davies-Jones

Q May I bring you on to the powers of the Secretary of State and the question of the regulator’s independence? The Bill will see the Secretary of State, whoever that may be, have a huge amount of personal direction over Ofcom. Do you have any other experience of being directed by a Secretary of State in this way, and what are the consequences of such an approach?

Kevin Bakhurst: We do have some experience across the various sectors that we regulate, but being directed by the Secretary of State does not happen very often. Specifically on the Bill, our strong feeling is that it is entirely appropriate for the Secretary of State to be able to direct us on matters of national security and terrorist content. However, we have some concerns about the Secretary of State’s wider direction powers, particularly the grounds on which the Secretary of State can direct us on public policy, and we have expressed those concerns previously.

We feel it is important that the independence of the regulator is there in practice and can be seen to be there. Legally, we feel it is important that there is accountability. We have some experience of being taken to judicial review, and there must be accountability for the codes of practice that we put in place. We must be able to show why and how we created those codes of practice, so that we can be accountable and there is absolute clarity between regulator and Government.

Mrs Miller

Q Thank you very much to the witnesses who have taken the time to be with us today. We are really grateful. You have already alluded to the fact that you have quite extensive experience in regulation, even in social media spaces. I think the Committee would be really interested in your view, based on your experience, about what is not in the Bill that should be.

Kevin Bakhurst: Richard has been leading this process, so he can give more detail on it, but suffice it to say, we have been engaging closely with DCMS over the last year or so, and we appreciate the fact that it has taken on board a number of our concerns. What we felt we needed from the Bill was as much clarity as possible, balanced against flexibility, because this is a very fast-moving field. We feel, by and large, that the Bill has achieved that.

We still have concerns about one or two areas, to pick up on your question. We feel it is really important—hopefully this is something the Committee can contribute to—that the definition of “illegal content” is really clear for platforms, particularly around the question of intent, which at the moment might be quite tricky for the platforms to pick up on.

Richard Wronka: I completely agree with Kevin that the Bill as it stands gives us a good framework. I think the pre-legislative scrutiny process has been really helpful in getting us there, and I point out that it is already quite a broad and complex regime. We welcome the introduction of issues such as fraudulent advertising and the regulation of commercial pornographic providers, but I think there is a point about ensuring that the Bill does not expand too much further, because that might raise some practical and operational issues for us.

I completely agree with Kevin that clarity in the Bill regarding illegal content and what constitutes that is really important. An additional area that requires clarity is around some of the complex definitions in the Bill, such as journalistic content and democratically important content. Those are inherently tricky issues, but any extra clarity that Parliament can provide in those areas would be welcome.

Mrs Miller

Q You talk about illegal content and say that Ofcom would not have a view on particular laws, but do you think there are harmful areas of content that are not currently covered by the law? I am thinking particularly about the issue of intimate image abuse, which is currently under Law Commission review, with recommendations expected very soon. Given that part of your regulatory process is to determine whether companies are operating within the law, have you had any thoughts, particularly on policy, about how you deal with issues that should be against the law but currently are not?

Richard Wronka: I would start by saying that this is a fluid area. We have had a number of conversations with the Law Commission in particular and with other stakeholders, which has been really helpful. We recognise that the Bill includes four new offences, so there is already some fluidity in this space. We are aware that there are other Law Commission proposals that the Government are considering. Incitement to self-harm and flashing imagery that might trigger epilepsy are a couple of issues that come to mind there. Ultimately, where the criminal law sits is a matter for Parliament. We are a regulator: our role here is to make sure that the criminal law is reflected in the regulatory regime properly, rather than to determine or offer a view on where the criminal law should sit. Linking back to our point just a minute ago, we think it is really important that there is as much clarity as possible about how platforms can take some of those potentially quite tricky decisions about whether content meets the criminal threshold.

Mrs Miller

Q May I press a little further? The four new offences that you talked about, and others, and indeed the whole approach of regulation, will lead more individuals to seek redress and support. You are not responsible for individuals; you are responsible for regulation. But you must have some thoughts on whether the current system of victim support will cope with the changes in the law and the new regulatory process. What might you want to see put in place to ensure that those victims are not all landing at your door, erroneously thinking that Ofcom will provide them with individual redress?

Kevin Bakhurst: One area that is very important, is in the Bill, and is one of our responsibilities is making sure that platforms have a sufficiently robust and reactive complaints process—one that people feel they can complain to and be heard by—and an appeals process. We feel that that is in the Bill. Under the video-sharing platform regime, we already receive complaints at Ofcom from people who have issues with platforms, have gone to those platforms, and do not feel their complaints have been properly dealt with or recognised. Although we will not be looking at individual pieces of material per se, those individual complaints are very useful in alerting us to particular types of offence or harm that the platforms are not dealing with properly. It will be a really important part of the regime to make sure that platforms provide a complaints process that is easy to navigate and that people can use quickly and accessibly.

Richard Wronka: An additional point I would make, building on that, is that this is a really complex ecosystem. We understand that and have spent a lot of the last two or three years trying to get to grips with that complex ecosystem and building relationships with other participants in the ecosystem. It brings in law enforcement, other regulators, and organisations that support victims of crime or online abuse. We will need to find effective ways to work with those organisations. Ultimately, we are a regulator, so there is a limit to what we can do. It is important that those other organisations are able to operate effectively, but that is perhaps slightly outside our role.

Barbara Keeley (Worsley and Eccles South) (Lab)

Q To what extent do you think services should publicly publish the transparency reports and risk assessments that they will be providing to Ofcom?

Richard Wronka: I think our starting point here is that we think transparency is a really important principle within the regime—a fundamental principle. There are specific provisions in the Bill that speak to that, but more generally we are looking for this regime to usher in a new era of transparency across the tech sector, so that users and other participants in this process can be clearer about what platforms are doing at the moment, how effective that is and what more might be done in the future. That is something that will be a guiding principle for us as we pick up regulation.

Specifically, the Bill provides for transparency reports. Not all services in scope will need to provide transparency reports, but category 1 and 2 services will be required to produce annual transparency reports. We think that is really important. At the moment, risk assessments are not intended to be published—that is not provided for in the Bill—but the transparency reports will show the effectiveness of the systems and processes that those platforms have put in place.

--- Later in debate ---
The Chair

I think Lynn Perry is back. Are you with us, Lynn? [Interruption.] No—okay. We will move on to Maria Miller.

Mrs Miller

Q I have a question for the Children’s Commissioner. You talked just now about doing more on the advocacy of individual cases. I asked a question of Ofcom in the first panel about the issue of support for victims. Its response was that complaints processes will be part of what it will regulate. Do you think that will be enough to answer your concerns, or are you expecting more than simply ensuring that platforms do what they should do?

Dame Rachel de Souza: I absolutely think that we need to look at independent advocacy and go further. I do not think the Bill does enough to respond to individual cases of abuse and to understand issues and concerns directly from children. Children should not have to exhaust platforms’ ineffective complaints routes, which can take days, weeks or months. Even a few minutes or hours of a nude image being shared online can be enormously traumatising for children.

That should inform Ofcom’s policies and regulation. As we know, the risks and harms of the online world are changing constantly, and hearing directly from children serves a useful purpose as an early warning mechanism within online safety regulation. I would like to see independent advocacy that provides a proper representation service for children. We need to hear from children directly, and I would like to see the Bill go further on this.

Mrs Miller

Q Is there capacity in the sector to deliver what you are talking about?

Dame Rachel de Souza: I think we need to make capacity. There is some—the NSPCC has its Childline and, as Children’s Commissioner, I have my own advocacy service for children in care. I think this should function in that way, with direct access. So I think that we can create it.

Andy Burrows: May I come in briefly? Our proposals for user advocacy reflect the clear “polluter pays” principle that we think should apply here: the levy that covers the direct costs of regulation should also fund really effective user advocacy, helping to build and scale up that capacity. That is really important, not only in giving victims what they need from frontline services, but in ensuring a strong counterbalance to some of the largest companies in the world on behalf of a sector that has clear ambition but self-evident constraints.

Dame Rachel de Souza: One of the concerns that has come to me from children—I am talking about hundreds of thousands of children—over the past year is that there is not strong enough advocacy for them and that their complaints are not being met. Girls in particular, following the Everyone’s Invited concerns, have tried so hard to get images down. There is this almost medieval bait-out practice of girls’ images being shared right across platforms. It is horrendous, and the tech firms are not acting quickly enough to get those down. We need proper advocacy and support for children, and I think that they would expect that of us in this groundbreaking Bill.

Kirsty Blackman

Q There has not been a huge amount of discussion of online gaming in the context of the Bill, despite the fact that for many young people that is the way in which they interact with other people online. Do you think the Bill covers online gaming adequately? A lot of interaction in online gaming is through oral communication—voice chat messages. Do you think that it is possible to properly regulate oral communications in gaming?

Dame Rachel de Souza: Good question. I applaud the Bill for what it does cover. We are looking at a Bill that, for the first time, is going to start protecting children’s rights online, so I am really pleased to see that. We have looked a bit at gaming in the past. In terms of harms, obviously the Bill does not cover gaming in full, but it does cover the safety aspects of children’s experience.

It is always good for us to be looking further. We know that gaming has some extremely harmful issues of its own, particularly around money and the potential for grooming and other safety risks. On communications, one of the reasons I am so concerned about encryption and communications online is that so much of it happens through gaming. We need to make sure that those elements are really firm.

Andy Burrows: It is vitally important that the gaming sector is in scope. We know that there are high-risk gaming sites—for example, Twitch—and gaming-adjacent services such as Discord. To go back to my earlier point about the need for cross-platform provisions to apply here, in gaming we can see grooming pathways that take on a different character from those on social networks—for example, grooming that takes place simultaneously across a game-streaming service and a gaming-adjacent platform such as Discord, rather than sequentially. I think it is very important that the regulator is equipped to understand the dynamics of those harms and how they may apply differently on gaming services. That is a very strong and important argument for user advocacy.

I would say a couple of things on oral communications. One-to-one oral communications are excluded from the Bill’s scope—legitimately—but we should recognise that there is a grooming risk there, particularly when that communication is embedded in a platform with wider functionality. There is an argument for a platform to consider all aspects of its functionality within the risk assessment process. Proactive scanning is a different issue.

There is a broader challenge for the Bill here, and it takes us back to the fundamental objectives and the very welcome design based around systemic risk identification and mitigation. We know that right now, for oral and livestream communications, the industry response to detecting and disrupting harm is not as developed as it is for, say, text-based chat. In keeping with the risk assessment process, it should be clear that if platforms want to offer that functionality, they have to demonstrate that they have high-quality, effective arrangements in place to detect and disrupt harm; that should be the price of admission. If companies cannot demonstrate that, they should not be offering their services, because there is a high risk to children.

--- Later in debate ---
The Chair

I am sorry, I have to interrupt because of time. Maria Miller.

Mrs Miller

Q Two hopefully quick questions. I have been listening carefully. Could you summarise the main changes you will make to your products that your users—whether children or adults—will notice make them safer? I have heard a lot about problems, but what are the changes you will actually make? Within that, could you talk about how you will improve your complaints system, which earlier witnesses said is inadequate?

Katy Minshall: We would certainly look to engage with Ofcom positively on the requirements it sets out. I am sorry to sound repetitive, but the challenge is that the Bill depends on so many things that do not exist yet, including the definitions of what we mean by content harmful to adults or to children. In practice, that makes it challenging to say exactly today what approaches we would take. To be clear, we would of course continue working with the Government, and now Ofcom, with the shared objective of making the online space safer for everyone.

Mrs Miller

Q I want to probe you a little on that. You are saying that because harmful content is not yet defined, you will not make any changes beyond that. It is quite a large Bill; surely there are other things you will do differently, no?

Katy Minshall: The lesson of the past three or four years is that we cannot wait for the Bill. We at Twitter are continuing to make changes to our product and our policies to improve safety for everyone, including children.

Mrs Miller

Q So the Bill is irrelevant to you.

Katy Minshall: The Bill is a really important piece of regulation, which is why I was so pleased to come today and share our perspectives. We are continuing to engage positively with Ofcom. What I am trying to say is that until we see the full extent of the requirements and definitions, it is hard to set out exactly what steps we would take with regards to the Bill.

Ben Bradley: To add to that point, it is hard to be specific about the changes we would make, because a lot of the detail of the Bill is deferred to Ofcom guidance and the codes of practice. Obviously we all have the duties around child safety and adult safety, but the Ofcom guidance will suggest specific measures we can take to meet them, some of which we may take already and some of which may go further than what we already do. Once we see the detail of the codes, we will be able to give a clearer answer.

Broadly, from a TikTok perspective, through the design of the product and the way we approach safety, we are in a good place for when the new regime comes in, because we are already regulated by Ofcom under the VSP regime; but we will have to wait for the full detail. Beyond the companies you will hear from today, this regime will touch 20,000 companies and will raise the floor for all of them.

Mrs Miller

Q But you cannot give any further detail about specific changes you will make as a result of this legislation because you have not seen the guidance and the codes.

Ben Bradley: Yes. The codes of practice will recommend specific steps we should take to meet our duties, and until we see the detail of those codes it is hard to be specific about the changes we would make.

The Chair

Barbara, you have just a couple of minutes.