Data (Use and Access) Bill [HL]

Lord Bethell Excerpts
Tuesday 19th November 2024


Lords Chamber
Lord Bethell (Con)

My Lords, it is a privilege to follow the noble Lord, Lord Vaux. I agree with him completely that this is a better Bill. It is a real tribute to the Minister, and I thank her for how she introduced it.

I start with some good news. This Bill plugs a long-standing gap in our data provisions very handsomely, in providing data for researchers. It has been a real problem for civil society that we have had no reach into the affairs and behaviours of our tech companies: no true understanding of what the activities of their services are, how their algorithms are designed and how they behave, or even what kind of audiences they reach. The most basic questions cannot be answered because we do not have any legal reach or insight into what they are up to. That has been a long-standing inhibitor of accountability and, frankly, of good policy-making.

There were a number of attempts to bring this provision into legislation in the Online Safety Act and the previous data Bill. I am really pleased to see in Clause 123 the provisions on information for research into online safety matters, which meet all the requests of those who were pressing for them. I pay tribute to the Minister for ensuring that this is in the Bill, as promised. I pay tribute also to my noble friend Lord Camrose, who got these provisions into the previous Bill. Unfortunately, that Bill was pulled at the last minute, but I offer some respect to him too on that point.

This is such an important provision. It should not be overshadowed by the other important contents of this Bill. Can the Minister use the opportunity of the passing of this Bill to flesh out some of the provisions in this important clause? I would like to find a forum to understand how researchers can apply for data access; how the privacy protection measures will be applied; how the government consultation will be put together; and how the grounds for an application might work. There are opportunities for those who resist transparency to use any of these measures to throw a spanner in the works. We owe it to ourselves, having seen these provisions put into the Bill, to ensure that they work as well as intended.

That gap having been plugged, I would like to address two others that are still standing. The first is the importance of preventing AI-trained algorithms from creating CSAM, which the noble Baroness, Lady Kidron, spoke movingly about. I pull that out of the many things that she mentioned because it is such a graphic example of how we really must draw a red line under some of the most egregious potential behaviours of AI. I am fully aware that we do not, as a country, want to throw obstacles in the way of progress and the very many good things that AI might bring us. There is a tension between the European and American approaches, on which we seek a position of balance. But if we cannot stop AI from creating images and behaviours around CSAM, my goodness, what can we stop? Therefore, I ask the Minister to answer the question: why can we not bring such a provision on to the face of the Bill? I will strongly support any efforts to do so.

Lastly, following the comments from the noble and learned Lord, Lord Thomas, on data processing, I flag the very important issue of transfers of data to areas where there is no clear adequacy and, in fact, no legal system for implementing the rule of law necessary to stand up standard contractual clauses. Your Lordships will be aware that in countries such as China and Russia the rule of law is very lightly applied to matters of data. Protecting British citizens’ data, when it goes to such countries, should be the responsibility of any Government, but that is a very difficult thing to provide for. Huge amounts of data are now travelling across borders to countries where we really do not have any legal reach. The explosion of BYD cars in the UK is an example of the sheer quantity of data that is going overseas. Genomic information processed on Chinese genomic machines is an example of where some of that data is even more sensitive. It is a big gap in our data protection laws that we do not have a mechanism for fully accounting for the legal handling of that data. I brought an amendment to the previous Bill, Amendment 111, which I urge the Minister to look at if she would like to understand this issue more carefully. I give fair warning that I will seek to move a version of that amendment to this Bill.