Online Safety Bill Debate
Lord Bethell (Conservative - Excepted Hereditary)
Department for Digital, Culture, Media & Sport
(1 year, 5 months ago)
Lords Chamber

My Lords, Amendments 233 and 234 from the noble Lord, Lord Knight of Weymouth, were well motivated, so I will be brief. I just have a couple of queries.
First, we need to consider the criteria for who is considered worthy of the privileged status of receiving Ofcom approval as a researcher. We are discussing researchers as though they are totally reliable and trustworthy. We might even think that, if they are academic researchers, they are bound to be.

However, there was an interesting example earlier this week of confirmation bias leading to mistakes, when King’s College had to issue a correction to its survey data that was used in the BBC’s “Marianna in Conspiracyland”. King’s College admitted that it had wildly overestimated the numbers of those reading the conspiracy newspaper The Light, and wildly overestimated the numbers of those attending what it dubbed conspiracy demonstrations. By the way, BBC Verify has so far failed to verify the mistake it repeated. I give this example not as a glib point but because we cannot just say that, because researchers are accredited elsewhere, they should simply be allowed in. I also think that the requirement to give the researchers
“all such assistance as they may reasonably require to carry out their research”
sounds like a potentially very time-consuming and expensive effort.
The noble Lord, Lord Allan of Hallam, raised points around “can’t” or “won’t”, and whether this means researchers “must” or “should”, and who decides whether it is ethical that they “should” in all instances. Ethical questions have been raised here. Questions of privacy are not trivial. Studying individuals as specimens of “badthink” or “wrongthink” might appear in this Committee to be in the public interest, but without people’s consent it can be quite damaging. We have to decide which questions serve the public interest sufficiently that consent could be overridden in that way.
I do not think this is a slam-dunk, though it looks like a sensible point. I do not doubt that all of us want more research, and good research, and data we can use in arguments, whatever side we are on, but it does not mean we should just nod something through without at least pausing.
My Lords, I declare an interest as a trustee of the International Centre for the Study of Radicalisation at the War Studies department of King’s College London. That is somewhere that conducts research using data of the kind addressed in this group, so I have a particular interest in it.
We know from the kind of debates that the noble Lord, Lord Knight, referred to that it is widely accepted that independent researchers benefit hugely from access to relevant information from service providers when researching online safety matters. That is why my Amendment 234, supported by the noble Lords, Lord Clement-Jones and Lord Knight, aims to introduce a mandatory duty for regulated platforms to give approved researchers access to that data.
As the noble Lord, Lord Knight, said, there are three ways in which this would be done. First, the timeframe for Ofcom’s report would be accelerated; secondly, proposed new Clause 147 would allow Ofcom to appoint the researchers; and, thirdly, proposed new Clause 148 would require Ofcom to write a code of practice on data access, setting up the fundamental principles for data access—a code which, by the way, should answer some of the concerns quite reasonably voiced by the noble Baroness, Lady Fox.
The internet is absolutely the most influential environment in our society today, but it is a complete black box, and we have practically no idea what is going on in some of the most important parts of it. That has a terrible impact on our ability to devise sensible policies and mitigate harm. Instead, we have a situation where the internet companies decide who accesses data, how much of it and for what purposes.
In answer to his point, I can tell the noble Lord, Lord Allan, who they give the data to—they give it to advertisers. I do not know if anyone has bought advertising on the internet, but it is quite a chilling experience. You can find out a hell of a lot about quite small groups of people if you are prepared to pay for the privilege of trying to reach them with one of your adverts: you can find out what they are doing in their bedrooms, what their mode of transport is to get to work, how old they are, how many children they have and so on. There is almost no limit to what you can find out about people if you are an advertiser and you are prepared to pay.
In fact, only the companies themselves can see the full picture of what goes on on the internet. That puts society and government at a massive disadvantage and makes policy-making virtually impossible. Noble Lords should be in no doubt that these companies deliberately withhold valuable information to protect their commercial interests. They obfuscate and confuse policymakers, and by withholding data they protect their reputations from criticism about the harms they cause. One notable outcome of that strategy is that it has taken years for us to be here today debating the Online Safety Bill, precisely because policy-making around the internet has been so difficult and challenging.
A few years ago, we were making some progress on this issue. I used to work with the Institute for Strategic Dialogue using CrowdTangle, a Facebook product. It made a big impact. We were working on a project on extremism, and having access to CrowdTangle revolutionised our understanding of how the networks of extremists that were emerging in British politics were coming together. However, since then, platforms have gone backwards a long way and narrowed their data-sharing. The noble Lord, Lord Knight, mentioned that CrowdTangle has essentially been closed down, and Twitter has basically stopped providing its free API for researchers—it charges for some access but even that is quite heavily restricted. These retrograde steps have severely hampered our ability to gather the most basic data from otherwise respectable and generally law-abiding companies. It has left us totally blind to what is happening on the rest of the internet—the bit beyond the nice bit; the Wild West bit.
Civil society plays a critical role in identifying harmful content and bad behaviour. Organisations such as the NSPCC, the CCDH, the ISD—which I mentioned—the Antisemitism Policy Trust and King’s College London, with which I have a connection, prove that their work can make a really big difference.
It is not as though other parts of our economy or society have the same approach. In fact, in most parts of our world there is a mixture of public, regulator and expert access to what is going on. Retailers, for instance, publish what is sold in our shops. Mobile phones, hospitals, banks, financial markets, the broadcast media—they all give access, both to the public and to their regulators, to a huge amount of data about what is going on. Once again, internet companies are claiming exceptional treatment—that has been a theme of debates on the Online Safety Bill—as if what happens online should, for some reason, be different from what happens in the rest of the world. That attitude is damaging the interests of our country, and it needs to be reversed. Does anyone think that the FSA, the Bank of England or the MHRA would accept this state of affairs in their regulated market? They absolutely would not.
Greater access to and availability of data and information about systems and processes would hugely improve our understanding of the online environment and thereby protect the innovation, progress and prosperity of the sector. We should not have to wait for Ofcom to identify new issues and then appoint experts to look at them closely; there should be a broader effort to stay in touch with what is going on on the internet. It is the nature of regulation that Ofcom will rely heavily on researchers and civil society to help enforce the Online Safety Bill, but this can be achieved only if researchers have sufficient access to data.
As the noble Lord, Lord Allan, pointed out, legislators elsewhere are making progress. The EU’s Digital Services Act gives a broad range of researchers access to data, including civil society and non-profit organisations dedicated to public interest research. The DSA sets out a framework for vetting and access procedures in detail, as the noble Baroness, Lady Fox, rightly pointed out, creating an explicit role for new independent supervisory authorities and digital services co-ordinators to manage that process.
Under Clause 146, Ofcom must produce a report exploring such access within two years of that section of the Bill coming into effect. That is too long, and there is no obligation on the regulator or service providers to take matters any further. No arguments have been put forward for this extended timeframe or for the resulting uncertainty. In contrast, the arguments for speeding up the process are extremely persuasive, and I invite my noble friend the Minister to address them.