Online Safety Bill (First sitting)
Dean Russell (Conservative, Watford): I refer Members to my entry in the Register of Members’ Financial Interests regarding work I did six months ago for a business called DMA.
The Chair: We will now hear oral evidence from Kevin Bakhurst, group director of broadcasting and online content at Ofcom, and Richard Wronka, director of Ofcom’s online harms policy. Before calling the first Member to ask a question, I remind all Members that questions should be limited to matters within the scope of the Bill, and we must stick to the timings in the programme motion that the Committee has agreed. For this witness panel, we have until 10.05 am. Could the witnesses please introduce themselves for the record?
Kevin Bakhurst: Good morning. I am Kevin Bakhurst, group director at Ofcom for broadcasting and online content.
Richard Wronka: I am Richard Wronka, a director in Ofcom’s online safety policy team.
Q
Richard Wronka: I will cover the codes first. You are absolutely right that the Bill requires Ofcom to publish codes of practice, particularly on CSEA and on terror, as well as on fraudulent advertising and other areas. We are doing the work right now so that we are ready to progress with that process as soon as we get powers and duties, because it is really important that we are ready to move as quickly as possible. We will set out further detail on exactly how we plan to do that in a roadmap document that we are looking to publish before the summer break, so that will provide some of the detail.
A really important point here is that the Bill quite rightly covers a wide set of harms. We are mindful that, however tempting it might be, a code that tries to cover every single harm could be counterproductive and confusing for platforms, even for those that want to comply and do the right thing. One of the balancing acts for us as we produce that code framework will be to get the right coverage of all the issues that everyone is rightly concerned about, but to do that in a way that is streamlined and efficient, so that services can apply the provisions of those codes.
Kevin Bakhurst: Shall I pick up on the second bit very quickly? I think you are right; this is one of our central concerns about the definitions. As far as possible, this should be a matter for Parliament. It is really important to know that Parliament has a view on this. Ultimately, the regulator will take a view based on what Parliament says. We have some experience in this area but, as Richard said, we recognise the challenge—it is extremely complex. We can see the policy intent, quite rightly, and the importance of enshrining freedom of expression as far as possible, but Parliament can help to add clarity and, as you rightly say, be aware of some of the potential loopholes. At the moment, someone could describe themselves as a citizen journalist; where does that leave us? I am not quite sure. Parliament could help to clarify that, and we would be grateful.
Q
Richard Wronka: This picks up the point we discussed earlier, which is that I understand that the Government are considering proposals from the Law Commission to criminalise the sending of those kinds of images. It would not be covered by the illegal content duties as things stand, but if the Government conclude that it is right to criminalise those issues, it would automatically be picked up by the Bill.
Even so, the regime is not, on the whole, going to be able to pick up every instance of harm. It is about making sure that platforms have the right systems and processes. Where there is clear harm to individuals, we would expect those processes to be robust. We know there is work going on in the industry on that particular issue to try and drive forward those processes.
Q
Kevin Bakhurst: This is a really important point, which Richard just tried to make. The Bill gives us a great range of tools to try to prevent harm as far as possible; I just think we need to get expectations right here. Unfortunately, the Bill will not mean that no harm of any type ever occurs, simply because of the nature of the internet and the scale of the task we face. We are ambitious about driving constant improvement and stopping and addressing the main harms, but the regime is not going to stop every harm. We will absolutely focus on the ones that have a significant impact, but unfortunately that is the nature of the web.
Q
“psychological harm amounting to serious distress”?
Therefore, sending somebody a flashing image with the intention of inducing an epileptic fit would likely be caught by the new harmful communications offence in clause 150, even before any separate future offence is introduced.
Richard Wronka: I think we can certainly understand the argument. I think it is important that the Bill is as clear as possible. Ultimately, it is for the courts to decide whether that offence would pick up these kinds of issues that we are talking about around flashing imagery.
Q
Ben Bradley: Speaking for TikTok, we view ourselves as a second-generation platform. We launched in 2018, and at that time, when you launched a product, you had to make sure that safety was at its heart. I think the Secretary of State herself has said that the Bill process actually predates the launch of TikTok in the UK.
We view ourselves as an entertainment platform, and to express yourself, enjoy yourself and be entertained, you have to feel safe, so I do not think we would be seen as coming to this regime kicking and screaming. It is something that we have supported for a long time, and we are already regulated by Ofcom under the video-sharing platform, or VSP, regime. What the Bill will achieve is to raise the floor of industry standards, a bit like GDPR did for data, so that for all the companies of the future—to Alex’s point, this is about the next five and 10 years—there will be a baseline of standards that everyone must comply with and an expectation that you will be regulated. It also takes a lot of these difficult decisions about the balance between safety and expression, and privacy and security, out of the hands of tech companies and puts them into the hands of a regulator that, of course, will have democratic oversight.
Katy Minshall: I do not have very much more to add. We already engage positively with Ofcom. I remember appearing before a Select Committee back in 2018 or 2019 and saying at that point that we were absolutely supportive of Ofcom taking on this role and of regulation potentially being a game changer. We are supportive of the systems and processes approach and look forward to engaging constructively with the regulation.
Q
Katy Minshall: I am glad you asked that question. The problem with the Bill is that it depends on so many things that do not exist yet. We are looking at the Bill and thinking about how we can prepare and what is necessary, but in practice, what counts as content that is harmful to adults or harmful to children has not been set out yet. So much of the Bill depends on secondary legislation and codes of practice, and as I described earlier in response to the question from Alex Davies-Jones, there are such real workability questions around exemptions and ID verification that I worry there would be a risk of substantial delays at the other end, which I do not think anyone wants to see.
Ben Bradley: It is the same from our perspective. We have our community guidelines, and we are committed to enforcing those at the moment. A lot of the detail of the Bill will be produced in Ofcom’s codes of practice, but I think it is important that we think about operationalising the process: what it looks like in practice and whether it is workable.
Something Katy mentioned, how prescriptive the user empowerment duties would be and how they would work, not just for the platforms of today but for those of the future, is really important. To use a similar example on the user empowerment duties: on TikTok, the intent is that you discover content from all over the world. When you open the app, you are recommended content from all sorts of users, and there is no expectation that those users will be verified. If you have opted into this proposed user empowerment duty, there is a concern that it could exacerbate the risk of filter bubbles, because you would only be receiving content from users within the UK who have verified themselves, whereas we work very hard to make sure there is a diverse range of recommendations. I think it is a fairly easy fix. Much like elsewhere in the Bill, where Ofcom has flexibility about whether to require specific recommendations, it could have that flexibility in this case as well, considering whether this type of power works for these types of platforms.
To use the example of the metaverse, how would this duty work once the metaverse is up and running? The whole purpose of the metaverse is a shared environment in which users interact. Because the Bill is currently so prescriptive about how the user empowerment duty must be achieved, it is not clear, if you were verified and I were unverified and you had opted not to see my content, whether something I moved in the shared environment, like this glass, would move for everyone. It is a small point, but it goes to the prescriptiveness of the current drafting and the importance of giving Ofcom in this section the flexibility that it has elsewhere in the Bill.
Q
Katy Minshall: At present, we follow the industry standard of age self-declaration. How you manage and verify identity—whether using a real-name system or emerging technologies like blockchain or documentation—is at the heart of a range of industries, not just ours.
Technology will change and new products that we cannot even envisage today will come on to the market. In terms of what we would do in relation to the Bill, as I said, until we see the full extent of the definitions and requirements, we cannot really say what exact approach we would take.