Committee stage & Committee Debate - 3rd sitting
Thursday 26th May 2022


Public Bill Committees
Online Safety Act 2023
Amendment Paper: Public Bill Committee Amendments as at 26 May 2022
The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

Q Thank you to the witnesses for joining us and giving us such thorough and clear responses to the various questions. I want to start on a topic that William Perrin and William Moy touched on—the exemption for recognised news publishers, set out in clause 50. You both said you have some views on how that is drafted. As you said, I asked questions on Tuesday about whether there are ways in which it could be improved to avoid loopholes—not that I am suggesting there are any, by the way. Mr Perrin and Mr Moy, could you elaborate on the specific areas where you think it might be improved?

William Moy: Essentially, the tests are such that almost anyone could pass them. Without opening the Bill, you have to have a standards code, which you can make up for yourself, a registered office in the UK and so on. It is not very difficult for a deliberate disinformation actor to pass the set of tests in clause 50 as they currently stand.

Chris Philp

Q How would you change it to address that, if you think it is an issue?

William Moy: This would need a discussion. I have not come here with a draft amendment—frankly, that is the Government’s job. There are two areas of policy thinking over the last 10 years that provide the right seeds and the right material to go into. One is the line of thinking that has been done about public benefit journalism, which has been taken up in the House of Lords Communications and Digital Committee inquiry and the Cairncross review, and is now reflected in recent Charity Commission decisions. Part of Full Fact’s charitable remit is as a publisher of public interest journalism, which is a relatively new innovation, reflecting the Cairncross review. If you take that line of thinking, there might be some useful criteria in there that could be reflected in this clause.

I hate to mention the L-word in this context, but the other line of thinking is the criteria developed in the context of the Leveson inquiry for what makes a sensible level of self-regulation for a media organisation. Although I recognise that that is a past thing, there are still useful criteria in that line of thinking, which would be worth thinking about in this context. As I said, I would be happy to sit down, as a publisher of journalism, with your officials and industry representatives to work out a viable way of achieving your political objectives as effectively as possible.

William Perrin: Such a definition, of course, must satisfy those who are in the industry, so I would say that these definitions need to be firmly industry-led, not simply by the big beasts—for whom we are grateful, every day, for their incredibly incisive journalism—but by this whole spectrum of new types of news providers that are emerging. I have mentioned my experience many years ago of explaining what a blog was to DCMS.

The news industry is changing massively. I should declare an interest: I was involved in some of the work on public-benefit journalism in another capacity. We have national broadcasters, national newspapers, local papers, local broadcasters, local bloggers and local Twitter feeds, all of which form a new and exciting news media ecosystem, and this code needs to work for all of them. I suppose that you would need a very deep-dive exercise with those practitioners to ensure that they fit within this code, so that you achieve your policy objective.

Chris Philp

Q Okay, thank you. I am not sure that I can take anything specific away from that. Perhaps that illustrates the difficulty of legislating. The clause, as drafted, represents the best efforts, thus far, to deal with an obviously difficult and complicated issue.

We heard some commentary earlier—I think from Mr Moy—about the need to address misinformation, particularly in the context of a serious situation such as the recent pandemic. I think you were saying that there was a meeting, in March or April 2020, for the then Secretary of State and social media firms to discuss the issue and what steps they might take to deal with it. You said that it was a private meeting and that it should perhaps have happened more transparently.

Do you accept that the powers conferred in clause 146, as drafted, do, in fact, address that issue? They give the Secretary of State powers, in emergency situations—a public health situation or a national security situation, as set out in clause 146(1)—to address precisely that issue of misinformation in an emergency context. Under that clause, it would happen in a way that was statutory, open and transparent. In that context, is it not a very welcome clause?

William Moy: I am sorry to disappoint you, Minister, but no, I do not accept that. The clause basically attaches to Ofcom’s fairly weak media literacy duties, which, as we have already discussed, need to be modernised and made harms-based and safety-based.

However, more to the point, the point that I was trying to make is that we have normalised a level of censorship that was unimaginable in previous generations. A significant part of the pandemic response was, essentially, some of the main information platforms in all of our day-to-day lives taking down content in vast numbers and restricting what we can all see and share. We have started to treat that as a normal part of our lives, and, as someone who believes that the best way to inform debate in an open society is freedom of expression, which I know you believe, too, Minister, I am deeply concerned that we have normalised that. In fact, you referred to it in your Times article.

I think that the Bill needs to step in and prevent that kind of overreach, as well as the triggering of unneeded reactions. In the pandemic, the political pressure was all on taking down harmful health content; there was no countervailing pressure to ensure that the systems did not overreach. We therefore found ridiculous examples, such as police posts warning of fraud around covid being taken down by the internet companies’ automated systems because those systems were set to, essentially, not worry about overreach.

That is why we are saying that we need, in the Bill, a modern, open-society approach to misinformation. That starts with it recognising misinformation in the first place. That is vital, of course. It should then go on to create a modern, harms-based media literacy framework, and to prefer content-neutral and free-speech-based interventions over content-restricting interventions. That was not what was happening during the pandemic, and it is not what will happen by default. It takes Parliament to step in and get away from this habitual, content-restriction reaction and push us into an open-society-based response to misinformation.

William Perrin: Can I just add that it does not say “emergency”? It does not say that at all. It says “reasonable grounds” that “present a threat”—not a big threat—under “special circumstances”. We do not know what any of that means, frankly. With this clause, I get the intent—that it is important for national security, at times, to send messages—but this has not been done in the history of public communication before. If we go back through 50 or 60 years, even 70 years, of Government communication, the Government have bought adverts and put messages transparently in place. Apart from D-notices, the Government have never sought to interfere in the operations of media companies in quite the way that is set out here.

If this clause is to stand, it certainly needs a much higher threshold before the Secretary of State can act—such as who they are receiving advice from. Are they receiving advice from directors of public health, from the National Police Chiefs’ Council or from the national security threat assessment machinery? I should declare an interest; I worked in there a long time ago. It needs a higher threshold and greater clarity, but you could dispense with this by writing to Ofcom and saying, “Ofcom, you should have regard to these ‘special circumstances’. Why don’t you take actions that you might see fit to address them?”

Many circumstances, such as health or safety, are national security issues anyway if they reach a high enough level for intervention, so just boil it all down to national security and be done with it.

Professor Lorna Woods: If I may add something about the treatment of misinformation more generally, I suspect that if it is included in the regime, or if some subset such as health misinformation is included in the regime, it will be under the heading of “harmful to adults”. I am picking up on the point that Mr Moy made that the sorts of interventions will be more about friction and looking at how disinformation is incentivised and spread at an earlier stage, rather than reactive takedown.

Unfortunately, the measures that the Bill currently envisages for “harmful but legal” seem to focus more on the end point of the distribution chain. We are talking about taking down content and restricting access. Clause 13(4) gives the list of measures that a company could employ in relation to priority content harmful to adults.

I suppose that you could say, “Companies are free to take a wider range of actions”, but my question then is this: where does it leave Ofcom, when it is trying to assess compliance with a safety duty, if a company is doing something that is not envisaged by the Act? For example, taking bot networks offline, if that is thought to be a key factor in the spreading of disinformation—I see that Mr Moy is nodding. A rational response might be, “Let’s get rid of bot networks”, but that, as I read it, does not seem to be envisaged by clause 13(4).

I think that is an example of a more general problem. With “harmful but legal”, we would want to see less emphasis on takedown and more emphasis on friction, but the measures listed as envisaged do not go that far up the chain.

The Chair

Minister, we have just got a couple of minutes left, so perhaps this should be your last question.

Chris Philp

Q Yes. On clause 13(4), the actions listed there are quite wide, given that they include not just “taking down the content”—as set out in clause 13(4)(a)—but also

“(b) restricting users’ access to the content;

(c) limiting the recommendation or promotion of the content;

(d) recommending or promoting the content.”

I would suggest that those actions are pretty wide, as drafted.

One of the witnesses—I think it was Mr Moy—talked about what were essentially content-agnostic measures to impede virality, and used the word “friction”. Can you elaborate a little bit on what you mean by that in practical terms?

William Moy: Yes, I will give a couple of quick examples. WhatsApp put a forwarding limit on WhatsApp messages during the pandemic. We knew that WhatsApp was a vector through which misinformation could spread, because forwarding is so easy. They restricted it to, I think, six forwards, and then you were not able to forward the message again. That is an example of friction. Twitter has a note whereby if you go to retweet something but you have not clicked on the link, it says, “Do you want to read the article before you share this?” You can still share it, but it creates that moment of pause for people to make a more informed decision.

Chris Philp

Q Thank you. Would you accept that the level of specificity that you have just outlined there is very difficult, if not impossible, to put in a piece of primary legislation?

William Moy: But that is not what I am suggesting you do. I am suggesting you say that this Parliament prefers interventions that are content-neutral or free speech-based, and that inform users and help them make up their own minds, to interventions that restrict what people can see and share.

Chris Philp

Q But a piece of legislation has to do more than express a preference; it has to create a statutory duty. I am just saying that that is quite challenging in this context.

William Moy: I do not think it is any more challenging than most of the risk assessments, codes of practice and so on, but I am willing to spend as many hours as it takes to talk through it with you.

The Chair

Order. I am afraid that we have come to the end of our allotted time for questions. On behalf of the Committee, I thank the witnesses for all their evidence.

Examination of Witnesses

Danny Stone MBE, Stephen Kinsella OBE and Liron Velleman gave evidence.

--- Later in debate ---
The Chair

Would any other witness like to contribute? No.

Chris Philp

Q Thank you again to the witnesses for joining us this morning. I will start with Stephen Kinsella. You have spoken already about some of the issues to do with anonymity. Can you share with the Committee your view on the amendments made to the Bill, when it was introduced a couple of months ago, to give users choices over self-verification and the content they see? Do you think they are useful and helpful updates to the Bill?

Stephen Kinsella: Yes. We think they are extremely helpful. We welcome what we see in clause 14 and clause 57. There is thus a very clear right to be verified, and an ability to screen out interactions with unverified accounts, which is precisely what we asked for. The Committee will be aware that we have put forward some further proposals. I would really hesitate to describe them as amendments; I see them as shading-in areas—we are not trying to add anything. We think that it would be helpful, for instance, when someone is entitled to be verified, that verification status should also be visible to other users. We think that should be implicit, because it is meant to act as a signal to others as to whether someone is verified. We hope that would be visible, and we have suggested the addition of just a few words into clause 14 on that.

We think that the Bill would benefit from a further definition of what it means by “user identity verification”. We have put forward a proposal on that. It is such an important term that I think it would be helpful to have it as a defined term in clause 189. Finally, we have suggested a little bit more precision on the things that Ofcom should take into account when dealing with platforms. I have been a regulatory lawyer for nearly 40 years, and I know that regulators often benefit from having that sort of clarity. There is going to be negotiation between Ofcom and the platforms. If Ofcom can refer to a more detailed list of the factors it is supposed to take into account, I think that will speed the process up.

One of the reasons we particularly welcomed the structure of the Bill is that there is no wait for detailed codes of conduct because these are duties that we will be executing immediately. I hope Ofcom is working on the guidance already, but the guidance could come out pretty quickly. Then there would be the process of—maybe negotiating is the wrong word—to-and-fro with the platforms. I would be very reluctant to take too much on trust. I do not mean on trust from the Government; I mean on trust from the platforms—I saw the Minister look up quickly then. We have confidence in Government; it is the platforms we are a little bit wary of. I heard the frustration expressed on Tuesday.

Alex Davies-Jones

indicated assent.

Stephen Kinsella: I think you said, “If platforms care about the users, why aren’t they already implementing this?” Another Member, who is not here today, said, “Why do they have to be brought kicking and screaming?” Yet, every time platforms were asked, we heard them say, “We will have to wait until we see the detail of—”, and then they would fill in whatever thing is likely to come last in the process. So we welcome the approach. Our suggestions are very modest and we are very happy to discuss them with you.

--- Later in debate ---
Chris Philp

Q Yes, and thank you for the work that you have done on this issue, together with Siobhan Baillie, my hon. Friend the Member for Stroud, which the Government adopted. Some of the areas that you have referred to could be dealt with in subsequent Ofcom codes of practice, but we are certainly happy to look at your submissions. Thank you for the work that you have done in this area.

Danny, we have had some fairly extensive discussions on the question of small but toxic platforms such as 4chan and BitChute—thank you for coming to the Department to discuss them. I heard your earlier response to the shadow Minister, but do you accept that those platforms are already subject to duties in the Bill in relation to content that is illegal and content that is harmful to children?

Danny Stone: Yes, that is accurate. My position has always been that that is a good thing. The extent and the nature of the content that is harmful to adults on such platforms—you mentioned BitChute but there are plenty of others—require an additional level of regulatory burden and closer proximity to the regulator. Those platforms should have to account for it and say, “We are the platforms; we are happy that this harm is on our platform and”—as the Bill says—“we are promoting it.” You are right that it is captured to some degree; I think it could be captured further.

Chris Philp

Q I understand; thank you. Liron, in an earlier answer, you referred to the protections for content of democratic importance and journalistic content, which are set out in clauses 15 and 16. You suggested, and were concerned, that they could act as a bar to hateful, prohibited or even illegal speech being properly enforced against. Do you accept that clauses 15 and 16 do not provide an absolute protection for content of democratic importance or journalistic content, and that they do not exempt such content from the Bill’s provisions? They simply say that in discharging duties under the Bill, operators must use

“proportionate systems and processes…to ensure that…content of democratic”—

or journalistic—

“importance is taken into account”.

That is not an absolute protection; it is simply a requirement to take into account and perform a proportionate and reasonable balancing exercise. Is that not reasonable?

Liron Velleman: I have a couple of things to say on that. First, we and others in civil society have spent a decade trying to de-platform some of the most harmful actors from mainstream social media companies. What we do not want to see after the Bill becomes an Act are massive test cases where we do not know which way they will go and where it will be up to either the courts or social media companies to make their own decisions on how much regard they place in those exemptions at the same time as all the other clauses.

Secondly, one of our main concerns is the time it takes for some of that content to be removed. If we have a situation in which there is an expedited process for complaints to be made, and for journalistic content to remain on the platform for an announced time until the platform is able to take it down, that could move far outside the realms of that journalistic or democratically important content. Again, using the earlier examples, it does not take long for content such as a livestream of a terrorist attack to be up on the Sun or the Daily Mirror websites and for lots of people to modify that video to bypass content moderation, which can then be shared and used to recruit new terrorists and allow copycat attacks to happen, and can go into the worst sewers of the internet. Any friction that is placed on stopping platforms being able to take down some of that harm is definitely of particular concern to us.

Finally, as we heard on Tuesday, social media platforms—I am not sure I would agree with much of what they would say about the Bill, but I think this is true—do not really understand what they are meant to do with these clauses. Some of them are talking about flowcharts and whether this is a point-scoring system that says, “You get plus one for being a journalist, but minus two for being a racist.” I am not entirely sure that platforms will exercise the same level of regard. If, with some of the better-faith actors in the social media space, we have successfully taken down huge reams of the most harmful content and moved it away from where millions of people can see it to where only tens of thousands can see it, we do not want in any way to open up the risk that hundreds of people could argue that they should be back on platforms when they are currently not there.

Chris Philp

Q Okay, thank you. My last question touches on those issues and is for each of the panel in turn. Some people have claimed—I think wrongly—that the provisions in the Bill in some way threaten free speech. As you will have seen in the article I wrote in The Times earlier this week, I do not think, for a number of reasons, that that is remotely true, but I would be interested in hearing the views of each of the panel members on whether there is any risk to freedom of speech in the work that the Bill does in terms of protecting people from illegal content, harm to children and content that is potentially harmful to adults.

Danny Stone: My take on this—I think people have misunderstood the Bill—is that it ultimately creates a regulated marketplace of harm. As a user, you get to determine how harmful a platform you wish to engage with—that is ultimately what it does. I do not think that it enforces content take-downs, except in relation to illegal material. It is about systems, and in some places, as you have heard today, it should be more about systems, introducing friction, risk-assessing and showing the extent to which harm is served up to people. That has its problems.

The only other thing on free speech is that we sometimes take too narrow a view of it. People are crowded out of spaces, particularly minority groups. If I, as a Jewish person, want to go on 4chan, it is highly unlikely that I will get a fair hearing there. I will be threatened or bullied out of that space. Free speech has to apply across the piece; it is not limited. We need to think about those overlapping harms when it comes to human rights—not just free speech but freedom from discrimination. We need to be thinking about free speech in its widest context.

Chris Philp

Q Thank you. You made a very important point: there is nothing in the Bill that requires censorship or prohibition of content that is legal and harmless to children. That is a really important point.

Stephen Kinsella: I agree entirely with what Danny was saying. Of course, we would say that our proposals have no implications for free speech. What we are talking about is the freedom not to be shouted at—that is really what we are introducing.

On disinformation, we did some research in the early days of our campaign that showed that a vast amount of the misinformation and disinformation around the 5G covid conspiracy was spread and amplified by anonymous or unverified accounts, so they play a disproportionate role in disseminating that. They also play a disproportionate role in disseminating abuse, and I think you may have a separate session with Kick It Out and the other football bodies. They have some very good research that shows the extent to which abusive language is from unverified or anonymous accounts. So, no, we do not have any free speech concerns at Clean up the Internet.

Chris Philp

Q Good. Thank you, Stephen. Liron?

Liron Velleman: We are satisfied that the Bill adequately protects freedom of speech. Our key view is that, if people are worried that it does not, beefing up the universal protections for freedom of speech should be the priority, instead of what we believe are potentially harmful exemptions in the Bill. We think that freedom of speech for all should be protected, and we very much agree with what Danny said—that the Bill should be about enhancing freedom of speech. There are so many communities that do not use social media platforms because of the harm that exists currently on platforms.

On children, the Bill should not be about limiting freedom of speech, but a large amount of our work covers the growth of youth radicalisation, particularly in the far right, which exists primarily online and which can then lead to offline consequences. You just have to look at the number of arrests of teenagers for far-right terrorism, and so much of that comes from the internet. Part of the Bill is about moderating online content, but it definitely serves to protect against some of the offline consequences of what exists on the platforms. We would hope that if people are looking to strengthen freedom of speech, it is as a universalist principle in the Bill, applying to everyone rather than to some groups but not others.

Chris Philp

Good. Thank you. I hope the Committee is reassured by those comments on the freedom of speech question.

Alex Davies-Jones

Q I will use the small amount of time we have left to ask one question. A number of other stakeholders and witnesses have expressed concerns regarding the removal of a digital media literacy strategy from the Bill. What role do you see a digital media literacy strategy playing in preventing the kind of abuse that you have been describing?

Danny Stone: I think that a media literacy strategy is really important. There is, for example, UCL data on the lack of knowledge of the word “antisemitism”: 68% of nearly 8,000 students were unfamiliar with the term’s meaning. Dr Tom Harrison has discussed cultivating cyber-phronesis—this was also in an article by Nicky Morgan in the “Red Box” column some time ago—which is a method of building practical knowledge over time to make the right decisions when presented with a moral challenge. We are not well geared up as a society—I am looking at my own kids—to educate young people about their interactions, about what it means when they are online in front of that box and about to type something, and about what might be received back. I have talked about some of the harms people might be directed to, even through Alexa, but some kind of wider strategy, which goes beyond what is already there from Ofcom—during the Joint Committee process, the Government said that Ofcom already has its media literacy requirements—and which, as you heard earlier, updates it to make it more fit for purpose for the modern age, would be very appropriate.

Stephen Kinsella: I echo that. We also think that that would be welcome. When we talk about media literacy, we often find ourselves with the platforms throwing all the obligation back on to the users. Frankly, that is one of the reasons why we put forward our proposal, because we think that verification is quite a strong signal. It can tell you quite a lot about how likely it is that what you are seeing or reading is going to be true if someone is willing to put their name to it. Seeing verification is just one contribution. We are really talking about trying to build or rebuild trust online, because that is what is seriously lacking. That is a system and design failure in the way that these platforms have been built and allowed to operate.

Chris Philp

Q The shadow Minister’s question is related to the removal of what was clause 103 in the old draft of the Bill. As she said, that related to media literacy. Does the panel draw any comfort from three facts? First, there is already a media literacy duty on Ofcom under section 11 of the Communications Act 2003—the now deleted clause 103 simply provided clarification on an existing duty. Secondly, last December, after the Joint Committee’s deliberations, but before the updated Bill was published, Ofcom published its own updated approach to online media literacy, which laid out the fact that it was going to expand its media literacy programme beyond what used to be in the former clause 103. Finally, the Government also have their own media literacy strategy, which is being funded and rolled out. Do those three things—including, critically, Ofcom’s own updated guidance last December—give the panel comfort and confidence that media literacy is being well addressed?

Liron Velleman: If the Bill is seeking to make the UK the safest place to be on the internet, it seems to be the obvious place to put in something about media literacy. I completely agree with what Danny said earlier: we would also want to specifically ensure—although I am sure this already exists in some other parts of Ofcom and Government business—that there is much greater media literacy for adults as well as children. There are lots of conversations about how children understand use of the internet, but what we have seen, especially during the pandemic, is the proliferation of things like community Facebook groups, which used to be about bins and a fair that is going on this weekend, becoming about the worst excesses of harmful content. People have seen conspiracy theories, and that is where we have seen some of the big changes to how the far-right and other hateful groups operate, in terms of being able to use some of those platforms. That is because of a lack of media literacy not just among children, but among the adult population. I definitely would encourage that being in the Bill, as well as anywhere else, so that we can remove some of those harms.

Danny Stone: I think it will need further funding, beyond what has already been announced. That might put a smile on the faces of some Department for Education officials, who looked so sad during some of the consultation process—trying to ensure that there is proper funding. If you are going to roll this out across the country and make it fit for purpose, it is going to cost a lot of money.

The Chair

Thank you. As there are no further questions from Members, I thank the witnesses for their evidence. That concludes this morning’s sitting.

Ordered, That further consideration be now adjourned. —(Steve Double.)