Online Safety Bill Debate
Baroness Laing of Elderslie (Conservative - Life peer)
Debate with the Department for Digital, Culture, Media & Sport
Commons Chamber

Order. The House will see that a great many people still wish to speak. May I explain that there are two groups of amendments? We will finish debating this group at 4.30 pm, after which there will be some votes, and debate on the next group of amendments will last until 7 o’clock. By my calculations, there might be more time for speeches during the debate on the next group, so if anyone wishes to speak on that group rather than the current group, I would be grateful if they came and indicated that to me. Meanwhile, if everyone takes about eight minutes and no longer, everyone will have the opportunity to speak. I call Sir Jeremy Wright.
I shall speak to the amendments in my name and the names of other right hon. and hon. Members, to whom I am grateful for their support. I am also grateful to the organisations that helped me to work through some of the problems I am about to identify, including the Carnegie Trust, Reset and the Antisemitism Policy Trust.
On the first amendments I shall talk about, amendments 42 and 43, I have been able to speak to Lego, so I can honestly say that these amendments were put together with Lego. Let me explain. The focus of the Bill, quite rightly, is on safety, and there is no safety more important than the safety of children. In that respect, the Bill is clear: platforms must give the safety of children the utmost priority and pay close attention to ways to enhance it. In other parts of the Bill, however, there are countervailing duties—for example, in relation to freedom of speech and privacy—where, predominantly in relation to adults, we expect platforms to conduct a balancing exercise. It seems right to me to think about that in the context of children, too.
As I said, the emphasis is rightly on children’s safety, but the safest approach would be to prohibit children from any online activity at all. We would not regard such an approach as sensible, because there are benefits to children in being able to engage—safely, of course—in online activity and to use online products and services. It seems to me that we ought to recognise that in the language of the Bill. Amendment 42 would do that when consideration is given to the safety duties designed to protect children set out in clause 11, which requires that “proportionate measures” must be taken to protect children’s safety and goes on to explain what factors might be taken into account when deciding what is proportionate, by adding
“the benefits to children’s well-being”
of the product or service in that list of factors. Amendment 43 would do the same when consideration is given to the online safety objectives set out in schedule 4. Both amendments are designed to ensure that the appropriate balance is struck when judgments are taken by platforms.
Others have spoken about journalistic content, and I am grateful for what the Minister said about that, but my amendment 10 is aimed at the defect that I perceive in clause 16. The Bill gives additional protections and considerations to journalists, which is entirely justifiable, given the important role that journalism plays in our society, but those extra protections mean that it will be harder for platforms to remove potentially harmful content that is also journalistic content. We should be sure, therefore, that the right people get the benefit of that protection.
It is worth having a look at what clause 16 says and does. It sets out that a platform—a user-to-user service—in category 1 will have
“A duty to operate a service using proportionate systems and processes designed to ensure that the importance of the free expression of journalistic content is taken into account when making decisions about…how to treat such content (especially decisions about whether to take it down or restrict users’ access to it), and…whether to take action against a user generating, uploading or sharing such content.”
So it is important, because of the significance of those protections, that we get right the definitions of those who should benefit from them. Amendment 10 would amend clause 16(8), which states:
“For the purposes of this section content is “journalistic content”, in relation to a user-to-user service, if…the content is”
either
“news publisher content in relation to that service”—
the definition of which I will return to—
“or…regulated user-generated content in relation to that service”.
That is the crucial point. The content also has to be
“generated for the purposes of journalism”
and be linked to the UK.
The first problem here is that journalism is not defined in the Bill. There are definitions of journalism, but none appears in the text of this Bill. “UK-linked” does not narrow it down much, and “regulated user-generated content” is a very broad category indeed. Clause 16 as drafted offers the protection given to journalistic content not just to news publishers, but to almost everybody else who chooses to define themselves as a journalist, whether or not that is appropriate. I do not think that that is what the Bill is intended to do, or an approach that this House should endorse. Amendment 10 would close the loophole by removing the second limb, regulated user-generated content that is not news publisher content. Let me be clear: I do not think that that is the perfect answer to the question I have raised, but it is better than the Bill as it stands, and if the Government can come up with a way of reintroducing protections of this kind for types of journalistic content beyond news publisher content that clearly deserve them, I will be delighted and very much open to it. Currently, however, the Bill is defective and needs to be remedied.
That brings us to the definition of news publisher content, because it is important that if we are to give protection to that category of material, we are clear about what we mean by it. Amendments 11 and 12 relate to the definition of news publisher content that arises from the definition of a recognised news publisher in clauses 49 and 50. That matters for the same reason as I just set out: we should give these protections only to those who genuinely deserve them. That requires rigorous definition. Clause 50 states that if an entity is not named in the Bill, as some are, it must fulfil a set of conditions set out in subsection (2), which includes having a standards code and policies and procedures for handling and resolving complaints. The difficulty here is that in neither case does the Bill refer to any quality threshold for those two things, so having any old standards code or any old policy for complaints will apparently qualify. That cannot be right.
I entirely accept that inserting a provision that the standards code and the complaints policies and procedures should be both “suitable and sufficient” opens the question of whose job it becomes to decide what is suitable and sufficient. I am familiar with all the problems that may ensue, so again, I do not say that the amendment is the final word on the subject, but I do say that the Government need to look more carefully at what the value of those two items on the list really is if the current definition stands. If we are saying that we want these entities to have a standards code and a complaints process that provide some reassurance that they are worthy of the protections the Bill gives, it seems to me that meaningful criteria must apply, which currently they do not.
The powers of the Secretary of State have also been discussed by others, but I perhaps differ from their view in believing that there should be circumstances in which the Secretary of State should hold powers to act in genuine emergency situations. However, being able to direct Ofcom, as the Bill allows the Secretary of State to do, to modify a code of practice
“for reasons of public policy”
is far too broad. Amendment 13 would simply remove that capacity, with amendment 14 consequential upon it.
I accept that on 7 July the Secretary of State issued a written statement that helps to some extent on that point—it was referred to by my hon. Friend the Member for Croydon South (Chris Philp). First, it states that the Secretary of State would act only in “exceptional circumstances”, although it does not say who defines what exceptional circumstances are, leaving it likely that the Secretary of State would do so, which does not help us much. Secondly, it states the intention to replace the phrase
“for reasons of public policy”
with a list of circumstances in which the Secretary of State might act. I agree with my hon. Friend the Member for Solihull (Julian Knight) that that is still too broad. The proposed list comprises
“national security, public safety, public health, the UK’s international relations and obligations, economic policy and burden to business.”—[Official Report, 7 July 2022; Vol. 717, c. 69WS.]
The platforms we are talking about are businesses. Are we really saying that a burden on them would give the Secretary of State reason to say to Ofcom, the independent regulator, that it must change a code of practice? That clearly cannot be right. This is still too broad a provision. The progress that has been made is welcome, but I am afraid that more must be done to constrain this discretion. That is because, as others have said, the independence of the regulator is crucial not just to this specific part of the Bill but to the credibility of the whole regulatory and legislative structure here, and therefore we should not undermine it unless we have to.
May I join others in welcoming my hon. Friend the Member for Folkestone and Hythe (Damian Collins) to his place on the Front Bench? He brings a considerable amount of expertise. I also, although it is a shame he is not here to hear me say nice things about him, pay tribute, as others have, to my hon. Friend the Member for Croydon South (Chris Philp). I had the opportunity to work with him, his wonderful team of officials and wonderful officials at the Home Office on some aspects of this Bill, and it was a great pleasure to do so. As we saw again today, his passion for this subject is matched only by his grasp of its fine detail.
I particularly echo what my hon. Friend said about algorithmic promotion, because if we address that, alongside what the Government have rightly done on ID verification options and user empowerment, we would address some of the core wiring and underpinnings at an even more elemental level of online harm.
I want to talk about two subjects briefly. One is fraud, and the other is disinformation. Opposition amendment 20 refers to disinformation, but that amendment is not necessary because of the amendments that the Government are bringing to the National Security Bill to address state-sponsored disinformation. I refer the House in particular to Government amendment 9 to that Bill. That in turn amends this Bill—it is the link, or so-called bridge, between the two. Disinformation is a core part of state threat activity and it is one of the most disturbing, because it can be done at huge volume and at very low cost, and it can be quite hard to detect. When someone has learned how to change the way people think, that makes that part of their weaponry look incredibly valuable to them.
We often talk about this in the context of elections. I think we are actually pretty good—when I say “we”, I mean our country, some other countries and even the platforms themselves—at addressing disinformation in the context of the elections themselves: the process of voting, eligibility to vote and so on. However, first, that is often not the purpose of disinformation at election time and, secondly, most disinformation occurs outside election times. Although our focus on interference with the democratic process is naturally heightened coming up to big democratic events, it is actually a 365-day-a-year activity.
There are multiple reasons and multiple modes for foreign states to engage in that activity. In fact, in many ways, the word “disinformation” is a bit unsatisfactory because a much wider set of things comes under the heading of information operations. That can range from simple untruths to trying to sow many different versions of an event, particularly a foreign policy or wartime event, to confuse the audience, who are left thinking, “Oh well, whatever story I’m being told by the BBC, my newspaper, or whatever it is, they are all much of a muchness.” Those states are competing for truth, even though in reality, of course, there is one truth. Sometimes the aim is to big up their own country, or to undermine faith in a democracy like ours, or the effectiveness of free societies.
Probably the biggest category of information operations is when there is not a particular line to push at all, but rather the disinformer is seeking to sow division or deepen division in our society, often by telling people things that they already believe, but more loudly and more aggressively to try to make them dislike some other group in society more. The purpose, ultimately, is to destabilise a free and open society such as ours and that has a cancerous effect. We talk sometimes of disinformation being spread by foreign states. Actually, it is not spread by foreign states; it is seeded by foreign states and then spread usually by people here. So they create these fake personas to plant ideas and then other people, seeing those messages and personas, unwittingly pick them up and pass them on themselves. It is incredibly important that we tackle that for the health of our democracy and our society.
The other point I want to mention briefly relates to fraud and the SNP amendments in the following group, but also Government new clause 14 in this group. I strongly support what the Government have done, during the shaping of the Bill, on fraud; there have been three key changes on fraud. The first was to bring user-generated content fraud into the scope of the Bill. That is very important for a particularly wicked form of fraud known as romance fraud. The second was to bring fraudulent advertising into scope, which is particularly important for categories of fraud such as investment fraud and e-commerce. The third big change was to make fraud a priority offence in the Bill, meaning that it is the responsibility of the platforms not just to remove that content when they are made aware of it, but to make strenuous efforts to try to stop it appearing in front of their users in the first place. Those are three big changes that I greatly welcome.
There are three further things I think the Government will need to do on fraud. First, there is a lot of fraudulent content beyond categories 1 and 2A as defined in the Online Safety Bill, so we are going to have to find ways—proportionate ways—to make sure that that fraudulent content is suppressed when it appears elsewhere, but without putting great burdens on the operators of all manner of community websites, village newsletters and so on. That is where the DCMS online advertising programme has an incredibly important part to play.
The second thing is about the huge variety of channels and products. Telecommunications are obviously important, alongside online content, but even within online, as the so-called metaverse develops further, with the internet of things and the massive potential for defrauding people through deep fakes and so on, we need to be one step ahead of these technologies. I hope that in DCMS my hon. Friends will look to create a future threats unit that seeks to do that.
Thirdly, we need to make sure everybody’s incentives are aligned on fraud. At present, the banks reimburse people who are defrauded and I hope that rate of reimbursement will shortly be increasing. They are not the only ones involved in the chain that leads to people being defrauded and often they are not the primary part of that chain. It is only right and fair, as well as economically efficient, to make sure the other parts of the chain that are involved share in that responsibility. The Bill makes sure their incentives are aligned because they have to take proportionate steps to stop fraudulent content appearing in front of customers, but we need to look at how we can sharpen that up to make sure everybody’s incentives are absolutely as one.
This is an incredibly important Bill. It has been a long time coming and I congratulate everybody, starting with my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), my hon. Friend the Member for Croydon South (Chris Philp) and others who have been closely involved in creating it. I wish my hon. Friend the Minister the best of luck.