My Lords, I shall also speak to Amendment 198 in my name and register my support for the amendments in the name of the noble Lord, Lord Bethell, to which I have added my name. Independent research access is a very welcome addition to the Bill by the Government. It was a key recommendation of the pre-legislative scrutiny committee on the Online Safety Bill in 2021, and I know that I speak for many colleagues in the academic field, as well as many civil society organisations, who are delighted by its swift and definitive inclusion in the Bill.
The objective of these amendments is not to derail the Government’s plans but rather to ensure that they happen, and to make the regime work for children, for the UK’s world-class academic institutions and for our stellar civil society organisations, so that we can all do high-quality research about emergent threats to children and to society more broadly.
Amendment 197 would ensure that the provisions in Clause 123 are acted on by removing the Government’s discretion as to whether or not they introduce regulations. It would also impose a deadline of 12 months for the Government to do so. I have said this before, but I have learnt the hard way that good intentions and warm words from the Dispatch Box are a poor substitute for clear provisions in law. A quick search of the Bill reveals that there are 119 uses of the word “must” and 262 uses of the word “may”. Clearly, they are being used to create different obligations or expectations. The Minister may say that this amendment is not needed and that, for all intents and purposes, we can take the word “may” as a “must” or a “will”, but I would prefer to see it in black and white. In fact, if the Government have reserved discretion on this point, I would like to understand exactly what that means for research.
Amendment 198 seeks to ensure that the regulations will enable independent researchers to research how online risks and harms impact different groups, especially vulnerable users, including children. We have already discussed the fact that online harms are not experienced equally by users: those who are most vulnerable offline are often the most vulnerable online. In an earlier debate, I talked about the frustrations experienced when tech companies do not report data according to age groups. By failing to do so, they can hide the reality that children are disproportionately impacted by certain risks and harms. This amendment would ensure that children and other vulnerable groups can be studied in isolation, rather than leaving independent researchers to pick through generalised datasets to uncover where harm is amplified and for whom.
I will leave the noble Lord, Lord Bethell, to explain his amendments, but I will just say why it is so important that we have a clear path to researcher access. It is fundamental to the success of the online safety regime.
Many will remember Frances Haugen, the Facebook whistleblower, who revealed the extent to which Meta knew, through its own detailed internal research, how harmful its platforms actually are to young people. Meta’s own research showed that:
“We make body image issues worse for one in three girls”.
Some 32% of teen girls said that, when they have felt bad about their bodies, Instagram has made them feel worse. Were it not for a whistleblower, this research would never have been made public.
After a series of evidence disclosures to US courts as a result of the legal action by attorneys-general at state level, we have heard whistleblowers suggest, in evidence given to the EU, that there will be a new culture in some Silicon Valley firms: no research and no emails. If you have something to say, you will have to say it in person so that it cannot be used against them in court. The irony of that is palpable, given the struggle that we are having about user privacy, but it points to the need for our research regime to be watertight. If the companies are not looking at the impact of their own services, we must. I hope that the Government continue their leadership on this issue and accept the amendments in the spirit in which they are put forward.
I have another point that I want the Minister to clarify. I apologise, because I raised this in a private meeting but I have forgotten the answer. Given the number of regulatory investigations, proceedings and civil litigations in which tech companies are engaged, I would like some comfort about the legal exemption in these clauses. I want to understand whether it applies only to advice from and between lawyers or exempts data that may negatively impact companies’ defence or surface evidence of safety failures or deficiencies. The best way that I have of explaining my concern is: if it is habitual for tech companies to cc a lawyer in all their communications on product safety, trust and safety, and so on, would that give them legal privilege?
Finally, I support the noble Lord, Lord Clement-Jones, in his desire for a definition of independent researchers. I would be interested to hear what the Minister has to say on that. I beg to move.
My Lords, I will speak to my Amendments 198A and 198C to 198F. I also support Amendments 197, 198 and 198B, to which I have added my name, all of which address the issue of data for researchers.
As was put very thoughtfully by the noble Baroness, Lady Kidron, platforms are not making decisions about their services with due regard to product safety or with independent oversight. Ofcom’s work enforcing the Online Safety Act will, in some part, shift that significantly towards accountability, but it makes no provision at the moment for researchers’ data access, despite civil society and academic researchers having been at the forefront of highlighting online harms for a decade. The anecdotes that the noble Baroness just gave were powerful testimony to the importance of that. We are, in fact, flying completely blind, making policy and, in this Room, legislation without data, facts and insight about the performance of the platforms and algorithms that we seek to address. Were it not for the whistleblowers, we would not have anything to go on, and we cannot rely on whistleblowers to guide our hands.
Rectifying this omission is in the Bill, and I am enormously grateful to the Minister and to my noble friend Lord Camrose for their role in putting it there. It is particularly important because the situation with data for researchers has deteriorated considerably, even in the last 18 months, with Meta shutting down CrowdTangle and X restricting researchers’ access to its API. The noble Baroness, Lady Kidron, spoke about what the whistleblowers think, and they think that this is going to get a lot worse in the future.
I welcome the inclusion of these provisions in the Bill. They will be totally transformational for this sector, bringing a level of access to serious analysts and academics so that we can better understand the impact of the digital world, for both good and bad. A good example of the importance of robust research to inform policy-making was the Secretary of State’s recent announcement that the Government were launching a
“research project to explore the impact of social media on young people’s wellbeing and mental health”.—[Official Report, Commons, 20/11/24; col. 250.]
That project will not be very effective if the researchers cannot access the data, so I very much hope that these provisions will be in force before the Government start spending money on it.
To have the desired effect, we need to ensure that the data for researchers regime, as described in the Bill, is truly effective and cannot be easily brushed off. That is why the Government need to accept the amendments in this group: to bring some clarity and to close loopholes in the scheme as it is outlined in the Bill.
I will briefly summarise the provisions in the amendments in my name. First, we need to make researcher access regulations enforceable in the same way as other requirements in the Online Safety Act. The enforcement provisions in that Act were strengthened considerably as it passed through this House, and I believe that the measures for data for researchers need to be given the same rocket boosters. Amendment 198D would mean that regulated services are required to adhere to the regime and would give Ofcom the power to take proper remedial action if regulated services are obfuscating or non-compliant.
Secondly, we need to ensure that any contractual provision of use, such as a platform’s terms of service, is unenforceable if it would prevent
“research into online safety matters”,
as defined in the regulations. This is an important loophole that needs to be closed. It will protect UK researchers carrying out public interest research from nefarious litigation over terms of service violations as platforms seek to obstruct access to data. We have seen this practice in other areas.
Thirdly, we need to clarify that researchers carrying out applicable research into online safety matters in the UK will be able to access information under the regime, regardless of where they are located. This is a basic point. Amendment 198E would bring the regime in line with the Digital Services Act of the EU and allow the world’s best researchers to study potential harm to UK users.
Ensuring robust researcher access to data contributes to a great ecosystem of investigation and scrutiny that will help to ensure the effective application of the law, while also guarding against overreach in moderating speech. It is time to back UK civil society and academic researchers to ensure that policy-making and regulatory enforcement are as informed as possible. That is why I ask the Minister to support these measures.