Online Safety Bill (Seventh sitting) Debate
Kim Leadbeater (Labour - Spen Valley), debate with the Department for Digital, Culture, Media & Sport
Public Bill Committees

It is categorically not the Government’s position that this problem is too big to fix. In fact, the whole purpose of this piece of groundbreaking and world-leading legislation is to fix a problem of such magnitude. The point my right hon. Friend was making about the hypothecation of fines to support user advocacy is a somewhat different one, which we will come to in due course, but there is nothing in the Bill to prevent individual groups from assisting individuals with making specific complaints to individual companies, as they are now entitled to do in law under clauses 17 and 18.
The point about an ombudsman is a slightly different one—if an individual complaint is made to a company and the individual complainant is dissatisfied with the outcome of their individual, particular and personal complaint, what should happen? In the case of financial services, if, for example, someone has been mis-sold a mortgage and they have suffered a huge loss, they can go to an ombudsman who will bindingly adjudicate that individual, single, personal case. The point that I am making is that having hundreds of thousands or potentially millions of cases being bindingly adjudicated on a case-by-case basis is not the right way to tackle a problem of this scale. The right way to tackle the problem is to force the social media companies, by law, to systemically deal with all of the problem, not just individual problems that may end up on an ombudsman’s desk.
That is the power in the Bill. It deals at a systems and processes level, it deals on an industry-wide level, and it gives Ofcom incredibly strong enforcement powers to make sure this actually happens. The hon. Member for Pontypridd has repeatedly called for a systems and processes approach. This is the embodiment of such an approach and the only way to fix a problem of such magnitude.
I associate myself with the comments of the right hon. Member for Basingstoke. Surely, if we are saying that this is such a huge problem, that is an argument for greater stringency and having an ombudsman. We cannot say that this is just about systems. Of course it is about systems, but online harms—we have heard some powerful examples of this—are about individuals, and we have to provide redress and support for the damage that online harms do to them. We have to look at systemic issues, as the Minister is rightly doing, but we also have to look at individual cases. The idea of an ombudsman and greater support for charities and those who can support victims of online crime, as mentioned by the hon. Member for Aberdeen North, is really important.
I thank the hon. Lady for her thoughtful intervention. There are two separate questions here. One is about user advocacy groups helping individuals to make complaints to the companies. That is a fair point, and no doubt we will debate it later. The ombudsman question is different; it is about whether to have a right of appeal against decisions by social media companies. Our answer is that, rather than having a third-party body—an ombudsman—effectively acting as a court of appeal against individual decisions by the social media firms, because of the scale of the matter, the solution is to compel the firms, using the force of law, to get this right on a systemic and comprehensive basis.
I would have been quite happy to move the amendment, but I do not think the Opposition would have been terribly pleased with me if I had stolen it. I have got my name on it, and I am keen to support it.
As I have said, I met the NSPCC yesterday, and we discussed how clause 31(3) might work, should the Minister decide to keep it in the Bill and not accept the amendment. There are a number of issues with the clause, which states that the child user condition is met if
“a significant number of children”
are users of the service, or if the service is
“likely to attract a significant number of users who are children”.
I do not understand how that could work. For example, a significant number of people who play Fortnite are adults, but a chunk of people who play it are kids. If some sort of invisible percentage threshold is applied in such circumstances, I do not know whether that threshold will be met. If only 20% of Fortnite users are kids, and that amounts only to half a million children, will that count as enough people to meet the child access assessment threshold?
Fortnite is huge, but an appropriate definition is even more necessary for very small platforms and services. With the very far-right sites that we have mentioned, it may be that only 0.5% of their users are children, and that may amount only to 2,000 children—a very small number. Surely, because of the risk of harm if children access these incredibly damaging and dangerous sites that groom people for terrorism, they should have a duty to meet the child access requirement threshold, if only so that we can tell them that they must have an age verification process—they must be able to say, “We know that none of our users are children because we have gone through an age verification process.” I am keen for children to be able to access the internet and meet their friends online, but I am keen for them to be excluded from these most damaging sites. I appreciate the action that the Government have taken in relation to pornographic content, but I do not think that this clause allows us to go far enough in stopping children accessing the most damaging content that is outwith pornographic content.
The other thing that I want to raise is about how the number of users will be calculated. The Minister made it very clear earlier on, and I thank him for doing so, that an individual does not have to be a registered user to be counted as a user of a site. People can be members of TikTok, for example, only if they are over 13. TikTok has some hoops in place—although they are not perfect—to ensure that its users are over 13, and to be fair, it does proactively remove users that it suspects are under 13, particularly if they are reported. That is a good move.
My child is sent links to TikTok videos through WhatsApp, however. He clicks on the links and is able to watch the videos, which will pop up in the WhatsApp mini-browser thing or in the Safari browser. He can watch the videos without signing up as a registered user of TikTok and without using the platform itself—the videos come through Safari, for example, rather than through the app. Does the Minister expect that platforms will count those people as users? I suggest that the majority of people who watch TikTok by those means are doing so because they do not have a TikTok account. Some will not have accounts because they are under 13 and are not allowed to by TikTok or by the parental controls on their phones.
My concern is that, if the Minister does not provide clarity on this point, platforms will count just the number of registered users, and will say, “It’s too difficult for us to look at the number of unregistered users, so in working out whether we meet the criteria, we are not even going to consider people who do not access our specific app or who are not registered users in some way, shape or form.” I have concerns about the operation of the provisions and about companies using that “get out of jail free” card. I genuinely believe that the majority of those who access TikTok other than through its platform are children and would meet the criteria. If the Minister is determined to keep subsection (3) and not accept the amendment, I feel that he should make it clear that those users must be included in the counting by any provider assessing whether it needs to fulfil the child safety duties.
I agree with the hon. Lady’s important point, which feeds into the broader question of volume versus risk—no matter how many children see something that causes harm and damage, one is one too many—and the categorisation of service providers into categories 1, 2A and 2B. The depth of the risk is the problem, rather than the number of people who might be affected. The hon. Lady also alluded to age verification—I am sure we will come to that at some point—which is another can of worms. The important point, which she made well, is about volume versus risk. The point is not how many children see something; even if only a small number of children see something, the damage has been done.
I absolutely agree. In fact, I have tabled an amendment to widen category 1 to include sites with the highest risk of harm. The Minister has not said that he agrees with my amendment specifically, but he seems fairly amenable to increasing and widening some duties to include the sites of highest risk. I have also tabled another new clause on similar issues.
I am glad that these clauses are in the Bill—a specific duty in relation to children is important and should happen—but as the shadow Minister said, clause 31(3) is causing difficulty. It is causing difficulty for me and for organisations such as the NSPCC, which is unsure how the provisions will operate and whether they will do so in the way that the Government would like.
I hope the Minister will answer some of our questions when he responds. If he is not willing to accept the amendment, will he give consideration to how the subsection could be amended in the future—we have more stages, including Report and scrutiny in the other place—to ensure that there is clarity and that the intention behind the provision is followed through, rather than remaining an intention that is never actually translated into law?