Online Safety Bill Debate
Jeremy Wright (Conservative - Kenilworth and Southam)
I am afraid I cannot agree with the hon. Lady that the fines would be a drop in the ocean. These are very substantial amounts of money. In relation to individual director liability, I completely understand where the right hon. Member for Barking (Dame Margaret Hodge) is coming from, and I support a great deal of what she says. However, there are difficulties with the amendment. Does the hon. Member for Pontypridd (Alex Davies-Jones) accept that it would be very odd to end up in a position in which the only individual director liability attached to information offences, meaning that, as long as an individual director was completely honest with Ofcom about their wrongdoing, they would attract no individual liability?
The right hon. Gentleman and I have some form on this matter going back a number of years. The amendment is in the tradition that this House has followed of passing legislation to protect journalists, their sources and their material. I make this offer again to the Minister: the NUJ is happy to meet and discuss how the matter can be resolved effectively through the tabling of an amendment in the other place or discussions around codes of practice. However, I emphasise to the Minister that, as we have found previously, the stronger protection is through a measure in the Bill itself.
I rise to speak to amendments 1 to 9 and new clause 1 in my name and the names of other hon. and right hon. Members. They all relate to the process of categorisation of online services, particularly the designation of some user-to-user services as category 1 services. There is some significance in that designation. In the Bill as it stands, perhaps the greatest significance is that only category 1 services have to concern themselves with so-called “legal but harmful” content as far as adults are concerned. I recognise that the Government have advertised their intention to modify the Bill so that users are instead offered mechanisms by which they can insulate themselves from such content, but that requirement, too, would apply only to category 1 services. There are also other obligations to which only category 1 services are subject: duties to protect content of democratic importance and journalistic content, and extra duties to assess the impact of their policies and safety measures on the rights to freedom of expression and privacy.
Category 1 status matters. The Bill requires Ofcom to maintain a register of services that qualify as category 1 based on threshold criteria set out in regulations under schedule 11 to the Bill. As schedule 11 stands, the Secretary of State must make those regulations, specifying threshold conditions, which Ofcom must then apply to designate a service as category 1. Those conditions are based only on the number of users of the service and its functionalities, which are defined in clause 189.
Amendments 2 to 8 would replace the word “functionalities” with the word “characteristics”. That term is defined in amendment 1 to include not only functionalities—in other words, what can be done on the platform—but also other aspects of the service: its user base, its business model, and its governance and other systems and processes. Incidentally, that definition of “characteristics” already appears in the Bill, in clause 84, which deals with risk profiles, so it is a definition that the Government have used themselves.
Categorisation is about risk, so the amendments ask more of platforms and services where the greatest risk is concentrated; but the greatest risk will not always be concentrated in the functionality of an online service. A service’s user base and business model, for example, will also reveal significant risk in some cases. I suggest that broader criteria should be available to Ofcom to enable it to categorise. I also argue that the greatest risk is not always concentrated on the platforms with the most users. Amendment 9 would change schedule 11 from its current wording, which requires a service to meet both a scale threshold and a functionality threshold before it can be designated as category 1, to require only one or the other.
The presence of very harmful content on smaller platforms is an issue that has been discussed many times during consideration of the Bill. It could arise organically or deliberately, with harmful content migrating to smaller platforms to escape more onerous regulatory requirements. Amendment 9 would resolve that problem by allowing Ofcom to designate a service as category 1 based on its size or its functionalities—or, better yet, its broader characteristics.
I do not want to presume too much, but I think the Government have some sympathy with my position, based on the indicative amendments they have published for the further Committee stage they would like this Bill to have. I appreciate entirely that we are not discussing those amendments today, but I hope, Madam Deputy Speaker, that you will permit me to make some brief reference to them, as some of them are on exactly the same territory as my amendments here.
Some of the amendments that the Government have published would add the words “any other characteristics” to the schedule 11 provisions on threshold conditions for categorisation, and would define them in a very similar way to my amendment 1. The Government may ask whether that answers my concerns, and the answer is, “Nearly.” I welcome the Government adding other characteristics not just to the threshold criteria themselves but to the research Ofcom will carry out on how threshold conditions should be set in the first place. However, I am afraid that they do not propose to change paragraph 1(4) of schedule 11, which requires regulations made on threshold conditions to include,
“at least one specified condition about number of users and at least one specified condition about functionality.”
That means that to be category 1, a service must still be big.
I ask the Minister to consider again very carefully how we can meet the genuine concern about high harm on small platforms; the amendment that he is likely to bring forward in Committee will not yet do so comprehensively. I also observe in passing that the reference the Government make in those amendments to any other characteristics is to characteristics that the Secretary of State considers relevant, not those that Ofcom considers relevant—but that is perhaps a conversation for another day.
Secondly, I come on to the process of re-categorisation and new clause 1. It is broadly agreed in this debate that this is a fast-changing landscape; platforms can grow quickly, and the nature and scale of the content on them can change fast as well. If the Government are wedded to categorisation processes with an emphasis on scale, then the capacity to re-categorise a platform that is now category 2B but might become category 1 in the future will be very important.
That process is described in clause 83 of the Bill, but no timeframes or time limits for re-categorisation are set out. We can surely anticipate that some category 2B platforms might be reluctant to take on the additional obligations of category 1 status, and may not readily acquiesce in re-categorisation but instead dispute it, including through an appeal to the tribunal provided for in clause 139. That would mean that re-categorisation could take some time after Ofcom has decided to commence it and communicate it to the relevant service. New clause 1 is concerned with what happens in the meantime.
To be clear, I would not expect the powers that new clause 1 would create to be used often, but I can envisage circumstances in which they would be beneficial. Let us imagine that a general election is under way—some of us will do that with more pleasure than others. Category 1 services have a particular obligation to protect content of democratic importance, including, of course, by applying their systems and processes for moderating content even-handedly across all shades of political opinion. There will not be a more important time for that obligation than during an election.
Let us assume also that a service subject to ongoing re-categorisation, because in Ofcom’s opinion it now has considerable reach, is not applying that even-handedness to the moderation of content or even to its removal. Formal re-categorisation and Ofcom powers to enforce a duty to protect democratic content could be months away, but the election will be over in weeks, and any failure to correct disinformation against a particular political viewpoint will be difficult or impossible to fully remedy by retrospective penalties at that point.
New clause 1 would give Ofcom injunction-style powers in such a scenario to act as if the platform were a category 1 service, where that is,
“necessary to avoid or mitigate significant harm.”
It is analogous in some ways to the powers that the Government have already given to Ofcom to require a service to address a risk that it should have identified in its risk assessment but did not because that risk assessment was inadequate, and to do so before the revised risk assessment has been done.
Again, the Minister may say that there is an answer to that in a proposed Committee stage amendment to come, but I think the proposal being made is for a list of emerging category 1 services—a watchlist, as it were, of services on the borderline of category 1—and that list in itself will not speed up the re-categorisation process. It is the time that that process might take that gives rise to the potential problem that new clause 1 seeks to address.
I hope that my hon. Friend the Minister will consider the amendments in the spirit they are offered. He has probably heard me say before—though perhaps not, because he is new to this, although I do not think anyone else in the room is—that the right way to approach this groundbreaking, complex and difficult Bill is with a degree of humility. That is never an easy sell in this institution, but I none the less think that if we are prepared to approach this with humility, we will all accept, whether Front Bench or Back Bench, Opposition or Government, that we will not necessarily get everything right first time.
Therefore, these Report stages, in this Bill of all Bills, are particularly important to ensure that where we can offer positive improvements, we do so, and that the Government consider them in that spirit. We owe that to this process, but we also owe it to the families who have been present for part of this debate, who have lost far more than we can possibly imagine. We owe it to them to make sure that where we can make the Bill better, we do so, but that we do not lose the forward momentum that I hope it will now have.
I approach my contribution from the perspective of the general principle, the thread that runs through all the amendments on the paper today on safety, freedom of speech, illegal content and so on. That thread is how we deal with the harm landscape and the real-world impact of issues such as cyber-bullying, revenge porn, predatory grooming, self-harm and suicide forums.
There is a serious risk to children and young people, particularly women and girls, on which no debate has been allowed: the promulgation of gender ideology, pushed by Mermaids and other so-called charities, which has created a toxic online environment that silences genuine professional concern, amplifies unquestioned affirmation, and brands professional therapeutic concern, such as that of James Esses, a therapist and co-founder of Thoughtful Therapists, as transphobic. That approach, a non-therapeutic and affirmative model, has been promoted and fostered online.
The reality is that adolescent dysphoria is completely normal. It can be a response to disruption from adverse childhood experiences or trauma, it can be a feature of autism or personality disorders, or it can be a response to the persistence of misogynistic social attitudes. Dysphoria can present and manifest in many different ways, not just in relation to gender. If someone’s gender dysphoria persists even after therapeutic support, I am first in the queue to defend that person and to ensure that their wishes are respected and protected, but it is an absolute falsity to give young people information suggesting that there is a quick-fix solution.
It is not normal to resolve dysphoria with irreversible so-called puberty blockers and cross-sex hormones, or with radical, irreversible, mutilating surgery. Gender ideology is being reinforced everywhere online and, indeed, in our public services and education system, but it is anything but progressive. It attempts to stuff dysphoric or gender non-conforming young people into antiquated, regressive boxes of what a woman is and what a man is, and it takes no account of the fact that it is fine to be a butch or feminine lesbian, a femboy or a boy next door, an old duffer like me, an elite gay sportsman or woman, or anything in between.