Baroness Barker's contribution to the Online Safety Act 2023

Thu 22nd Jun 2023
Online Safety Bill
Lords Chamber

Committee stage: Part 2

Lord Knight of Weymouth (Lab)

My Lords, my noble friend Lord Stevenson, who tabled this amendment, unfortunately cannot be with us today as he is off somewhere drinking sherry, I hope.

This is an important set of amendments about researchers’ access to data. As I have previously said to the Committee, we need to ensure that Ofcom has the opportunity to be as trusted as possible in doing its job, so that we can give it as much flexibility as we can, and so that it can deal with a rapidly changing environment. As I have also said on more than one occasion, in my mind, that trust is built by the independence of Ofcom from Secretary of State powers; the ongoing and post-legislative scrutiny of Parliament, which is not something that we can deal with in this Bill; and, finally, transparency—and this group of amendments goes to that very important issue.

The lead amendment in this group, Amendment 230 in my noble friend Lord Stevenson’s name, seeks to accelerate the process relating to Ofcom’s report on researchers’ access to information. Instead of simply requiring a report within two years of Clause 146 being brought into force, this amendment would require an interim report within three months, with a final report to follow two years later. Although it is the lead amendment in the group, I do not think it is the most significant because, in the end, it does not do much about the fundamental problem that we want to deal with in this group, which is the need to do better than just having a report. We need to ensure that there really is access by independent researchers.

Amendments 233 and 234 are, I think, of more significance. These proposed new clauses would assist independent researchers in accessing information and data from providers of regulated services. Amendment 233 would allow Ofcom itself to appoint researchers to undertake a variety of research. Amendment 234 would require Ofcom to issue a code of practice on researchers’ access to data; again, this is important so that the practical and legal difficulties for both researchers and service providers can be overcome through negotiation and consultation by Ofcom. Amendment 233A from the noble Lord, Lord Allan, which I am sure he will speak to in a moment, is helpful in clarifying that no data protection breach would be incurred by allowing the research access.

In many ways, there is not a huge amount more to say. When Melanie Dawes, the head of Ofcom, appeared before the Joint Committee on 1 November 2021—all that time ago—she said that

“tightening up the requirement to work with external researchers would be a good thing in the Bill”.

It is therefore a disappointment that, when the Bill was finally published after the Joint Committee’s consideration of the draft, there was not something more significant and more weighty than just a report. That is what we are trying to address, particularly now that we see, as an example, that Twitter is charging more than £30,000 a month for researchers’ access. That is quite a substantial rate in order for researchers to be able to do their work in respect of that platform. Others are restricting or obscuring some of the information that people want to be able to see.

This is a vital set of measures if this Bill is to be effective. These amendments go a long way towards where we want to get to on this; for the reasons I have set out around ensuring that there is transparency, they are vital. We know from the work of Frances Haugen that the platforms themselves are doing this research. We need that out in the open; we need Ofcom to be able to see it through independent researchers; and we need others to be able to see it so that Parliament and others can continue to hold these platforms to account. Given that the Minister is in such a positive mood, I look forward to his positive response.

The Deputy Chairman of Committees (Baroness Barker) (LD)

My Lords, I must advise the Committee that if Amendment 230 is agreed to then I cannot call Amendment 231 because of pre-emption.

Lord Allan of Hallam (LD)

My Lords, we are reaching the end of our Committee debates, but I am pleased that we have some time to explore these important questions raised by the noble Lord, Lord Knight of Weymouth.

I have an academic friend who studies the internet. When asked to produce definitive answers about how the internet is impacting on politics, he politely suggests that it may be a little too soon to say, as the community is still trying to understand the full impact of television on politics. We are rightly impatient for more immediate answers to questions around how the services regulated by this Bill affect people. For that to happen, we need research to be carried out.

A significant amount of research is already being done within the companies themselves—both more formal research, often done in partnership with academics, and more quick-fix commercial analyses where the companies do their own studies of the data. These studies sometimes see the light of day through publication or quite often through leaks; as the noble Lord, Lord Knight, has referred to, it is not uncommon for employees to decide to put research into the public domain. However, I suggest that this is a very uneven and suboptimal way for us to get to grips with the impact of these services. The public interest lies in there being a much more rigorous and independent body of research work, which, rightly, these amendments collectively seek to promote.

The key issues that we need to address head-on, if we are actively to promote more research, lie within the data protection area. That has motivated my Amendment 233A—I will explain the logic of it shortly—and is the reason why I strongly support Amendment 234.

A certain amount of research can be done without any access to personal data, bringing together aggregated statistics of what is happening on platforms, but the reality is that many of the most interesting research questions inevitably bring us into areas where data protection must be considered. For example, looking at how certain forms of content might radicalise people will involve looking at what individual users are producing and consuming and the relationships between them. There is no way of avoiding that for most of the interesting questions around the harms we are looking at. If you want to know whether exposure to content A or content B led to a harm, there is no way to do that research without looking at the individual and the specifics.

There is a broad literature on how anonymisation and pseudonymisation techniques can be used to try to make those datasets a little safer. However, even if the data can be made safe from a technical point of view, that still leaves us with significant ethical questions about carrying out research on people who would not necessarily consent to it and may well disagree with the motivation behind the sorts of questions we may ask. We may want to see how misinformation affects people and steers them in a bad direction; that is our judgment, but the judgment of the people who use those services and consume that information may well be that they are entirely happy and there is no way on earth that they would consent to be studied by us for something that they perceive to be against their interests.

Those are real ethical questions that have to be asked by any researcher looking at this area. That is what we are trying to get to in the amendments—whether we can create an environment with that balance of equity between the individual, who would normally be required to give consent to any use of their data, and the public interest. We may determine that, for example, understanding vaccine misinformation is sufficiently important that we will override that individual’s normal right to choose whether to participate in the research programme.

My Amendment 233A is to Amendment 233, which rightly says that Ofcom may be in a position to say that, for example, understanding vaccine misinformation is in the overriding public interest and that we need research into it. If it decides to do that and the platforms transfer data to those independent researchers, because we have said in the amendment that they must, the last thing we want is for the platforms to feel that, if there is any problem further down the track, there will be comeback on them. That would be against the principle of natural justice, given that they have been instructed to hand the data over, and could also act as a barrier.