Online Safety Bill (First sitting) Debate
Public Bill Committees
Q
Kevin Bakhurst: Overall, we feel that it is. By and large, the balance between certainty and flexibility in the Bill is probably about right and will allow some flexibility in future, but it is very hard to predict what other harms may emerge. We will remain as flexible as possible.
Richard Wronka: There are some really important updating tools in the Bill. The ability for the Secretary of State to introduce new priority harms or offences—with the approval of Parliament, of course—is really important.
Q
Richard Wronka: I will cover the codes first. You are absolutely right that the Bill requires Ofcom to publish codes of practice, particularly on CSEA and on terror, as well as on fraudulent advertising and other areas. We are doing the work right now so that we are ready to progress with that process as soon as we get powers and duties, because it is really important that we are ready to move as quickly as possible. We will set out further detail on exactly how we plan to do that in a roadmap document that we are looking to publish before the summer break, so that will provide some of the detail.
A really important point here is that the Bill quite rightly covers a wide set of harms. We are mindful that a code which covers every single harm could be counterproductive and confusing for platforms, even for those that want to comply and do the right thing. One of the balancing acts for us as we produce that code framework will be to get the right coverage for all the issues that everyone is rightly concerned about, but to do that in a way that is streamlined and efficient, so that services can apply the provisions of those codes.
Richard Wronka: Shall I pick up on the second bit very quickly? I think you are right; this is one of our central concerns about the definitions. As far as possible, this should be a matter for Parliament. It is really important that Parliament has a view on this. Ultimately, the regulator will take a view based on what Parliament says. We have some experience in this area, but as Richard said, we recognise the challenge—it is extremely complex. We can see the policy intent of doing it, quite rightly, and the importance of enshrining freedom of expression as far as possible, but Parliament can help to add clarity and, as you rightly say, be aware of some of the potential loopholes. At the moment, someone could describe themselves as a citizen journalist; where does that leave us? I am not quite sure. Parliament could help to clarify that, and we would be grateful.
Q
Richard Wronka: This picks up the point we discussed earlier, which is that I understand that the Government are considering proposals from the Law Commission to criminalise the sending of those kinds of images. It would not be covered by the illegal content duties as things stand, but if the Government conclude that it is right to criminalise those issues, it would automatically be picked up by the Bill.
Even so, the regime is not, on the whole, going to be able to pick up every instance of harm. It is about making sure that platforms have the right systems and processes. Where there is clear harm to individuals, we would expect those processes to be robust. We know there is work going on in the industry on that particular issue to try and drive forward those processes.
I will bring in Kim Leadbeater and then Maria Miller and Kirsty Blackman, but I will definitely bring in the Minister at 10.45 am.
Q
Dame Rachel de Souza: I have argued hard to get pornographic sites brought into the Bill. That is something very positive about the Bill, and I was really pleased to see that. Why? I have surveyed more than half a million children in my Big Ask survey and spoken recently to 2,000 children specifically about this issue. They are seeing pornography, mainly on social media sites—Twitter and other sites. We know the negative effects of that, and it is a major concern.
I am pleased to see that age assurance is in the Bill. We need to challenge the social media companies—I pull them together and meet them every six months—on getting this stuff off their sites and making sure that under-age children are not on their sites seeing some of these things. You cannot go hard enough in challenging the social media companies to get pornography off their sites and away from children.
Andy Burrows: Just to add to that, I would absolutely echo that we are delighted that part 5 of the Bill, with measures around commercial pornography, has been introduced. One of our outstanding areas of concern, which applies to pornography but also more broadly, is around clause 26, the children’s access assessment, where the child safety duties will apply not to all services but to services where there is a significant number of child users or children comprise a significant part of the user base. That would seem to open the door to some small and also problematic services being out of scope. We have expressed concerns previously about whether OnlyFans, for example, which is a very significant problem as a user-generated site with adult content, could be out of scope. Those are concerns that I know the Digital, Culture, Media and Sport Committee has recognised as well. We would very much like to see clause 26 removed from the Bill, which would ensure that we have a really comprehensive package in this legislation that tackles both commercial pornography and user-generated material.
I think Lynn Perry is back. Are you with us, Lynn? [Interruption.] No—okay. We will move on to Maria Miller.
Q
Katy Minshall: That is absolutely the case and it has been documented by numerous organisations and research. Social media mirrors society and society has the problems you have just described. In terms of how we ensure intersectionality in our policies and approaches, we are guided by our trust and safety council, which is a network of dozens of organisations around the world, 10 of which are here in the UK, and which represents different communities and different online harms issues. Alongside our research and engagement, the council ensures that when it comes to specific policies, we are constantly considering a range of viewpoints as we develop our safety solutions.
Q
Katy Minshall: At the very least, there must be tighter definitions. I am especially concerned when it comes to the news publisher exemption. The Secretary of State has indicated an amendment that would mean that services like Twitter would have to leave such content up while an appeals process is ongoing. There is no timeline given. The definition in the Bill of a news publisher is, again, fairly vague. If Ben and I were to set up a news website, nominally have some standards and an email address where people could send complaints, that would enable it to be considered a news publisher under the Bill. If we think about some of the accounts that have been suspended from social media over the years, you can absolutely see them creating a news website and saying, “I have a case to come back on,” to Twitter or TikTok or wherever it may be.
Ben Bradley: We share those concerns. There are already duties to protect freedom of expression in clause 19. Those are welcome. It is the breadth of the definition of journalistic and democratic content that is a concern for us, particularly when it comes to things like the expedited and dedicated appeals mechanism, which those people would be able to claim if their content was removed. We have already seen people like Tommy Robinson on the far right present themselves as journalists or citizen journalists. Giving them access to a dedicated and expedited appeals mechanism is an area of concern.
There are different ways you could address that, such as greater clarity in those definitions and removing subjective elements. At the minute, the test is whether or not a user considers their content to be journalistic; it is not an objective criterion but rests on their belief about their own content.
Also, if you look at something like the dedicated and expedited appeals mechanism, could you hold that in reserve, so that if a platform were found to be failing in its duties to journalistic content or in its freedom of expression duties, Ofcom could say, as it can in other areas of the Bill, “Okay, we believe that you need to create this dedicated mechanism, because you have failed to protect those duties”? That would, I think, minimise the risk of exploitation of that mechanism.