Online Safety Bill (Thirteenth sitting) Debate
Dan Carden (Labour, Liverpool Walton)
Debate with the Department for Digital, Culture, Media & Sport
Public Bill Committees

Clause 129(4) states that the Secretary of State will be consulted in the process. What would be the Secretary of State’s powers in relation to that? Would she be able to overrule Ofcom in the writing of its guidance?
The hon. Member asks for my assistance in interpreting legislative language. Generally speaking, “consult” means what it suggests. Ofcom will consult the Secretary of State, as it will consult the ICO, to ascertain the Secretary of State’s opinion, but Ofcom is not bound by that opinion. Unlike the power in a previous clause—I believe it was clause 40—where the Secretary of State could issue a direct instruction to Ofcom on certain matters, here we are talking simply about consulting. When the Secretary of State expresses an opinion in response to the consultation, it is just that—an opinion. I would not expect it to be binding on Ofcom, but I would expect Ofcom to pay proper attention to the views of important stakeholders, which in this case include both the Secretary of State and the ICO. I hope that gives the hon. Member the clarification he was seeking.
I want to briefly agree with the sentiments of the Opposition Front Bench, especially about the strength of the committee and the lack of teeth that it currently has. Given that the Government have been clear that they are very concerned about misinformation and disinformation, it seems odd that they are covered in the Bill in such a wishy-washy way.
The reduction of the time from 18 months to six months would also make sense. We would expect the initial report the committee publishes in six months not to be as full as the ones it would publish after that. I do not see any issue with it being required to produce a report as soon as possible to assess how the Act is bedding in and beginning to work, rather than having to wait to assess it once the Act is, potentially, properly working. We want to be able to pick up any teething problems that the Act might have.
We want the committee to be able to say, “Actually, this is not working quite as we expected. We suggest that Ofcom operates in a slightly different way, or that the interaction with providers happens in a slightly different way.” I would rather that problems with the Act were tackled as early as possible, but we will not know about those problems if there is no proper review mechanism. There is no agreement, for example, for the committee to look at how the Act is operating. This is one of the few parts of the Bill where we have an agreement to a review, and it would make sense for it to happen as early as possible.
We agree that misinformation and disinformation are very important matters that really need to be tackled, but there is just not enough clout in the Bill to allow Ofcom to properly tackle these issues that are causing untold harm.
When I spoke at the very beginning of the Committee’s proceedings, I said that the legislation was necessary, that it was a starting point and that it would no doubt change and develop over time. However, I have been surprised at how little, considering all of the rhetoric we have heard from the Secretary of State and other Ministers, the Bill actually deals with the general societal harm that comes from the internet. This is perhaps the only place in the Bill where it is covered.
I am thinking of the echo chambers that are created around disinformation and the algorithms that companies use. I really want to hear from the Minister where he sees this developing and why it is so weak and wishy-washy. While I welcome that much of the Bill seeks to deal with the criminality of individuals and the harm and abuse that can be carried out over the internet, overall it misses a great opportunity to deal with the harmful impact the internet can have on society.
Let me start by speaking on the issue of disinformation more widely, which clearly is the target of the two amendments and the topic of clause 130. First, it is worth reminding the Committee that non-legislatively—operationally—the Government are taking action on the disinformation problem via the counter-disinformation unit of the Department for Digital, Culture, Media and Sport, which we have discussed previously.
The unit has been established to monitor social media firms and sites for disinformation and then to take action and work with social media firms to take it down. For the first couple of years of its operation, it understandably focused on disinformation connected to covid. In the last two or three months, it has focused on disinformation relating to the Russia-Ukraine conflict, in particular propaganda being spread by the Russian Government, which, disgracefully, has included denying responsibility for various atrocities, including those committed at Bucha. In fact, in cases in which the counter-disinformation unit has not got an appropriate response from social media firms, those issues have been escalated to me, and I have raised them directly with those firms, including Twitter, which has tolerated all kinds of disinformation from overt Russian state outlets and channels, including from Russian embassy Twitter accounts, which are of particular concern to me. Non-legislative action is being taken via the CDU.
I agree with the right hon. Member for Basingstoke that these are important clauses. I want to put them into the context of what we heard from Frances Haugen, who, when she spoke to Congress, said that Facebook consistently chose to maximise its growth rather than implement safeguards on its platforms. She said:
“During my time at Facebook, I came to realise a devastating truth: almost no one outside of Facebook knows what happens inside Facebook. The company intentionally hides vital information from the public, from the U.S. government, and from governments around the world.”
When we consider users’ experiences, I do not think it is good enough just to look at how the user engages with information. We need far more transparency about how the companies themselves are run. I would like to hear the Minister’s views on how this clause, which looks at users’ experiences, can go further in dealing with the harms at source, with the companies themselves, and in making sure a light is shone on their practices.
I welcome the support of the hon. Member for Pontypridd for these clauses. I will turn to the questions raised by my right hon. Friend the Member for Basingstoke. First, she asked whether Ofcom has to publish these reports so that the public, media and Parliament can see what they say. I am pleased to confirm that Ofcom does have to publish the reports; section 15 of the Communications Act 2003 imposes a duty on Ofcom to publish reports of this kind.
Secondly, my right hon. Friend asked about educating the public on issues pertinent to these reports, which is what we would call a media literacy duty. Again, I confirm that, under the Communications Act, Ofcom has a statutory duty to promote media literacy, which would include matters that flow from these reports. In fact, Ofcom published an expanded and updated set of policies in that area at the end of last year, which is why the old clause 103 in the original version of this Bill was removed—Ofcom had already gone further than that clause required.
Thirdly, my right hon. Friend asked about the changes that might happen in response to the findings of these reports. Of course, it is open to Ofcom—indeed, I think this Committee would expect it—to update its codes of practice, which it can do from time to time, in response to the findings of these reports. That is a good example of why it is important for those codes of practice to be written by Ofcom, rather than being set out in primary legislation. It means that when some new fact or circumstance arises or some new bit of research, such as the information required in this clause, comes out, those codes of practice can be changed. I hope that addresses the questions my right hon. Friend asked.
The hon. Member for Liverpool, Walton asked about transparency, referring to Frances Haugen’s testimony to the US Senate and her disclosures to The Wall Street Journal, as well as the evidence she gave this House, both to the Joint Committee and to this Committee just before the Whitsun recess. I have also met her bilaterally to discuss these issues. The hon. Gentleman is quite right to point out that these social media firms, of which he used Facebook as an example, although there are others, are extremely secretive about what they say in public, to the media and even to representative bodies such as the United States Congress. That is why, as he says, it is extremely important that they are compelled to be a lot more transparent.
The Bill contains a large number of provisions compelling or requiring social media firms to make disclosures to Ofcom as the regulator. However, it is important to have public disclosure as well. It is possible that the hon. Member for Liverpool, Walton was not in his place when we came to the clause in question, but if he turns to clause 64 on page 56, he will see that it includes a requirement for Ofcom to give every provider of a relevant service a notice compelling them to publish a transparency report. I hope he will see that the transparency obligation that he quite rightly refers to—it is necessary—is set out in clause 64(1). I hope that answers the points that Committee members have raised.
Question put and agreed to.
Clause 132 accordingly ordered to stand part of the Bill.
Clause 133 ordered to stand part of the Bill.
Clause 134
OFCOM’s statement about freedom of expression and privacy
Question proposed, That the clause stand part of the Bill.