Data Protection and Digital Information (No. 2) Bill

Kit Malthouse Excerpts
2nd reading
Monday 17th April 2023

Commons Chamber

Julia Lopez

I am delighted, in turn, to hear my hon. Friend call me the Secretary of State—I am grateful for the promotion, even if it is not a reality. I know how passionate he feels about open data, which is a subject we have discussed before. As I said earlier, I am pleased that the Under-Secretary of State for Business and Trade is present, because this morning he announced that a new council will be driving forward this work. As my hon. Friend knows, this is not necessarily about legislation being in place—I think the Bill gives him what he wants—but about that sense of momentum, and about onboarding new sectors into this regime and not being slow in doing so. As he says, a great deal of economic benefit can be gained from this, and we do not want it to be delayed any further.

Kit Malthouse (North West Hampshire) (Con)

Let me first draw attention to my entry in the Register of Members’ Financial Interests. Let me also apologise for missing the Minister’s opening remarks—I was taken by surprise by the shortness of the preceding statement and had to rush to the Chamber.

May I take the Minister back to the subject of compliance costs? I understand that the projected simplification will result in a reduction in those costs, but does she acknowledge that a new regime, or changes to the current regime, will kick off an enormous retraining exercise for businesses, many of which have already been through that process recently and reached a settled state of understanding of how they should be managing data? Even a modest amount of tinkering instils a sense among British businesses, particularly small businesses, that they must put everyone back through the system, at enormous cost. Unless the Minister is very careful and very clear about the changes being made, she will create a whole new industry for the next two or three years, as every data controller in a small business—often doing this part time alongside their main job—has to be retrained.

Julia Lopez

We have been very cognisant of that risk in developing our proposals. As I said in my opening remarks, we do not wish to upset the apple cart and create a compliance headache for businesses, which would be entirely contrary to the aims of the Bill. A small business that is currently compliant with the GDPR will continue to be compliant under the new regime. However, we want to give businesses flexibility in how they deliver that compliance, so that, for instance, they do not have to employ a data protection officer.

--- Later in debate ---
Damian Collins (Folkestone and Hythe) (Con)

I am delighted to speak in support of this long-awaited Bill. It is a necessary piece of legislation to learn the lessons from GDPR and look at how we can improve the system, both to make it easier for businesses to work with and to give users and citizens the certainty they need about how their data will be processed and used.

In bringing forward new measures, the Bill in no way suggests that we are looking to move away from our data adequacy agreements with the European Union. Around the world, in North America, Europe, Australia and elsewhere in the Far East, we see Governments looking at developing trusted systems for sharing and using data and for allowing businesses to process data across international borders, knowing that those systems may not be exactly the same but that they work to the same standards and with similar levels of integrity. That is clearly the direction in which the whole world wants to move, and we should play a leading role in that.

I want to talk briefly about an important area of the Bill: striking the balance between data rights and data safety and what the Bill refers to as the “legitimate interest” of a particular business. I should also note that this Bill, while important in its own right, sits alongside other legislation—some of it to be introduced in this Session and some of it already well on its way through the parliamentary process—dealing with other aspects of the digital world. The regulation of data is one aspect of digital regulation; data is in some ways the fuel that powers the digital experience, and it is relevant to other areas of digital life as well.

To take one example, we have already established and implemented the age-appropriate design code for children, which principally addresses the way data is gathered from children online and used to design services and products that they use. As this Bill goes through its parliamentary stages, it is important that we understand how the age-appropriate design code is applied as part of the new data regime, and that the safeguards set out in that code are guaranteed through the Bill as well.

There has been a lot of debate, as has already been mentioned, about companies such as TikTok. There is a concern that engineers who work for TikTok in China, some of whom may be members of the Chinese Communist party, have access to UK user data that may not be stored in China, but is accessed from China, and are using that data to develop products. There is legitimate concern about oversight of that process and what that data might be used for, particularly in a country such as China.

However, there is also a question about the data itself, because one reason the TikTok app is being withdrawn from Government devices around the world is that it is incredibly data-acquisitive. It does not just analyse how people use TikTok and from that create data profiles of users to determine what content to recommend to them, although that is a fundamental part of the experience of using it; it also gathers, as other big apps do, data from what people do on other apps on the same device. People may not realise that they have given consent, and it is certainly not informed consent, for companies such as TikTok to access data from what they do on other apps, not just when they are on TikTok.

It is a question of having trusted systems for how data can be gathered, and giving users the right to opt out of such data systems more easily. Some users might say, “I’m quite happy for TikTok or Meta to have that data gathered about what I do across a range of services.” Others may say, “No, I only want them to see data about what I do when I am using their particular service, not other people’s.”

The Online Safety Bill is one of the principal ways in which we are seeking to regulate AI now. There is debate within the tech sector; a letter was published recently, co-signed by a number of tech executives, including Elon Musk, calling for a six-month pause in the development of AI systems, particularly large language models. That suggests a problem in the near future of very sophisticated data systems that can make decisions faster than a human can analyse them.

People such as Eric Schmidt have raised concerns about AI in defence systems, where an aggressive system could make decisions faster than a human could respond to them, so that an AI system would be needed to respond in turn, potentially with no human oversight. That is a frightening scenario, and one in which we might want to consider moratoriums and agreements, as we have in other areas of warfare such as the use of chemical weapons, that we will not allow such systems to be developed because they are so difficult to control.

If we look at the application of that sort of technology closer to home, and at some of the cases most often referenced in debates on the Online Safety Bill, for example the tragic death of the teenager Molly Russell, we see that what drove the behaviour of concern was data gathered about a user being used to make recommendations to that person that endangered their life. The Online Safety Bill seeks to regulate that practice by creating codes and responsibilities for businesses, but such behaviour is possible only because of the collection of data and the decisions the company makes on how that data is processed.

This is where the Bill also links to the Government’s White Paper on AI, and this is particularly important: there must be an onus on companies to demonstrate that their systems are safe. The onus must not be solely on the user to demonstrate that they have somehow suffered as a consequence of a system’s design. Companies should have to demonstrate that they are designing systems with people’s safety and rights in mind—be they their rights as workers and citizens, or their rights to certain safeguards and protections over how their data is used.

Companies creating datasets should be able to demonstrate to the regulator what data they have gathered, how models are being trained on that data and what the data is being used for. That should be easy for the regulator to see, and if the regulator has concerns up front, it should be able to raise them with the company. We must try to create that shift, particularly for AI systems, in how systems are tested before they are deployed, with both safety and the principles set out in the legislation in mind.

Kit Malthouse

My hon. Friend makes a strong point about safety being designed in, but a secondary area of concern for many people is discrimination—that is, the more data companies acquire, the greater their ability to discriminate. For example, in an insurance context, we allow companies to discriminate on the basis of experience or behaviour; if someone has had a lot of crashes or speeding fines, we allow discrimination. However, for companies that process large amounts of data, whether or not they are making automated decisions, no openly advertised line of acceptability is drawn. In future, datasets may come together that allow extreme levels of discrimination; if data science, psychometrics and genetic data were linked, for example, there is the possibility of significant discrimination in society. Does he think that, as well as safety, we should be emphasising that line in the sand?