Data Protection and Digital Information (No. 2) Bill (First sitting) Debate
Mark Eastwood (Conservative - Dewsbury)
I am not sure whether this is directly relevant to the Bill or adjacent to it, but I am an unpaid member of the board of the Centre for Countering Digital Hate, which does a lot of work looking at hate speech in the online world.
Given that one of today’s witnesses is from Prospect, I wish to declare that I am a member of that union.
I am a proud member of a trade union. I refer the Committee to my entry in the Register of Members’ Financial Interests.
Data Protection and Digital Information (No. 2) Bill (Second sitting) Debate
Mark Eastwood (Conservative - Dewsbury)

Q
Aimee Reed: It is not as easy as we would like it to be, and provision is not made in the Bill to make that easier. There are some discussions about it going into the Online Safety Bill and other areas. It could be easier. We would push harder in the future, but at the moment, getting parity across the other areas and around national security is a focus that we welcome.
Helen Hitching: I want to pick up on the fact that safeguards are not reducing. It is key for the agency to note that our safeguards are not being lowered because of this.
Q
Aimee Reed: I will answer that in respect of where we are now in national policing. It would be of considerable benefit if the guidance was clearer that we could share information without having to redact it, certainly pre-charge, to enable better and easier charging decisions—to be honest—within the Crown Prosecution Service. It would also reduce the current burden on officers: you can think about the volume of data they have to hand over, and it can be video, audio, transcripts—it is not just witness statements, as it used to be 20 or 30 years ago. Reducing that burden would be significant for frontline officers and unleash them to be able to do other things.
Q
Aimee Reed: It certainly would. It is not that we cannot do that now; I just think the guidance could be clearer. It would put it into sharper relief if we could release that burden from policing to the CPS and the CPS felt confident that that was within the rules.
Helen Hitching: The agency agrees with that—there would be the same impact.
Q
Aimee Reed: It is not so much about specific datasets; it is about synchronisation and the speed with which you can exchange data that enables you to make better decisions. Because the Data Protection Act is split into three parts, and law enforcement quite rightly has a section all of its own, you cannot utilise data analytics across each of the parts. Does that make sense? If we wanted to do something with Driver and Vehicle Licensing Agency data and automatic number plate recognition data, we could not join together those two large datasets to enable mass analysis because there would be privacy rights considerations. If we want to search datasets from other parts of that Act, we have to do that in quite a convoluted administrative way so that perhaps we can share within law enforcement. It is more about the speed of exchange.
Q
Andrew Pakes: We would like to see more. We are worried that the current legislation, because of things such as DPIAs (data protection impact assessments), drops that level of standards, which means that the UK could end up trading on a lower standard than other countries, and that worries us.
Mary Towers: We are also concerned about the change to the test for international data transfers, which might make the requirements less restrictive. There is a change from adequacy to a more risk-based assessment process in terms of international data transfers. Again, we have very similar concerns to Andrew about the use of technologies rooted in international companies and the inevitable international transfers of data, and workers essentially losing control over and knowledge of what is happening with their data beyond the workplace.
In addition, I would like to make a point about the importance of transparency of source code, and of ensuring that international trade deals do not restrict that transparency, which would mean that workers could not access information about source code once data and AI-powered tools are rooted in other countries.
Q
Mary Towers: I will give my statistics very quickly. Our polling revealed that approximately 60% of workers perceived that some form of monitoring was taking place in their workplace. The CEO of IBM told Bloomberg last week that 30% of non-customer-facing roles, including HR functions, could be replaced by AI and automation in the next five years.
A recent report from the European Commission’s Joint Research Centre—the “Science for Policy” report on the platformatisation of work—found that 20% of German people and 35% of Spanish people are subject to algorithmic management systems at the moment. Although that is obviously not UK-based, it gives you a very recent insight into the extent of algorithmic management across Europe.
Andrew Pakes: And that matches our data. Around a third of our members say that they are subject to some form of digital monitoring or tracking. That has grown, particularly with the rise of hybrid and flexible working, which we are in favour of. This is a problem we wish to solve, rather than something to stop, in terms of getting it right.
Over the past two years, we have increasingly seen people being performance managed or disciplined based on data collected from them, whether that is from checking in and out of buildings, their use of emails, or not being in the right place based on tracking software. None of the balances we want should restrict the legitimate right of managers to manage, but there needs to be a balance within that. We know that using this software incorrectly can micromanage people in a way that is bad for their wellbeing.
The big international example, which I will give very quickly, is that if you look at a product like Microsoft—a global product—employers will buy it. My work computer has Office 365 on it. Employers get it on day one. The trouble with these big products is that, over time, they add new products and services. There was an example where Microsoft did bring in a productivity score, which could tell managers how productive and busy their teams were. They rowed back on that, but we know that with these big, global software projects—this is the point of DPIAs—it is not just a matter of consultation on day one.
The importance of DPIAs is that they stipulate that there must be regular reviews, because we know that the power of this technology transforms quickly. The danger is that we make life miserable for people who are good, productive workers and cause more problems for employers. It would be better for all of us to solve it through good legislation than to arm up the lawyers and solve it through the courts.
I am afraid that we are subject to chronological monitoring, so we must bring this session to an end. I thank our two representatives very much indeed for their evidence this afternoon; we are grateful for your time. We will now move on to our 10th panel.
Examination of Witnesses
Alexandra Sinclair, Ms Laura Irvine and Jacob Smith gave evidence.