Data Protection and Digital Information (No. 2) Bill (Second sitting) Debate
Stephanie Peacock (Labour, Barnsley South)
Public Bill Committees

Q
Jonathan Sellors: I am not sure I am the expert on this particular topic, because my experience is more research-based than in IT systems embedded in clinical care.
Tom Schumacher: I am also not as intimately familiar with that issue, but I would say that interoperability is absolutely critical. One of the challenges we experience with our technologies—I assume this is also the case for your health providers—is the ability to have high-quality data that means the same thing in different systems. That is a challenge that will be improved, but it is really a data challenge more than a privacy challenge. That is how I see it.
Q
Jonathan Sellors: I think it is a thoroughly useful clarification of what constitutes research. It is essentially welcome, because it was not entirely clear under the provisions of the General Data Protection Regulation what the parameters of research were, so this is a helpful clarification.
Tom Schumacher: I completely concur: it is very useful. I would say that a couple of things really stand out. One is that it makes it clear that private industry and other companies can participate in research. That is really important, particularly for a company like Medtronic because, in order to bring our products through to help patients, we need to conduct research, have real-world data and be able to present that to regulators for approval. It will be extremely helpful to have that broader definition.
The other component of the definition that is quite helpful is that it makes it explicit that technology development and other applied research constitutes research. I know there is a lot of administrative churn trying to figure out what constitutes research and what does not, and I think this is a really helpful piece of clarification.
Q
Tom Schumacher: Maybe I can give an example. One of the businesses we purchased is a business based in the UK called Digital Surgery. It uses inter-body videos to try to improve the surgery process and create technologies to aid surgeons in prevention and care. One of the challenges has been: to what extent is the use of surgery videos to create artificial intelligence and a better outcome for patients “research”? Ultimately, it was often the case that a particular site or hospital would agree, but it created a lot of churn, activity and work back and forth to explain exactly what was to be done. I think this will make it much clearer and easier for a hospital to say, “We understand this is an appropriate research use” and to be in a position to share that data according to all the protections that the GDPR provides around securing and de-identifying the data and so on.
Jonathan Sellors: I think our access test, which we apply to all our 35,000 users, is to ensure they are bona fide researchers conducting health-related research in the public interest. We quite often get asked whether the research they are planning to conduct is legitimate research. For example, a lot of genetic research, rather than being based on a particular hypothesis, is hypothesis-generating—they look at the data first and then decide what they want to investigate. This definition definitely helps clear up quite a few—not major, but minor—confusions that we have. They arise quite regularly, so I think it is a thoroughly helpful development to be able to point to something with this sort of clarity.
Welcome, gentlemen. We will now hear from Harry Weber-Brown, chief engagement officer at ZILO, and Phillip Mind, director of digital technology and innovation at UK Finance. We have until 2.50 pm for this session. I now invite the witnesses to please introduce themselves to the Committee for the record, starting with Mr Weber-Brown.
Harry Weber-Brown: Thank you very much. My name is Harry Weber-Brown, chief engagement officer for ZILO Technology Ltd, which is a start-up based in London. I have previously worked for the Investing and Saving Alliance. I have much experience in both smart data, which is dealt with in part 3 of the Bill, and digital identity, which relates to digital verification services in part 2.
Phillip Mind: Good afternoon. I am Phillip Mind, director of digital technology and innovation at UK Finance, a trade body representing over 300 organisations in the bank and finance community. Like Harry, my expertise resides more in parts 2 and 3 of the Bill, although I have a few insights into part 1.
Q
Phillip Mind: The banking community is supportive of the Bill, which is enabling of a digital economy. The data protection reforms reduce compliance burdens on business, which is very welcome. The provisions on digital identity are enabling, and we see digital identity as an essential utility for customers in the future. The provisions on smart data extend an open data regime to other sectors. We already have an open banking regime, and we are keen for that to extend to other sectors. It offers real opportunities in terms of innovative products and services, but we would caution the Committee that there is significant cost and complexity in those measures.
Harry Weber-Brown: The Bill is key to retaining the UK’s place as a hub for technical innovation, and in particular for investment in fintech. It is critical also to make sure the UK remains a global leader in data portability. Building on the work that Phillip just mentioned on open banking, which has over 7 million users among both consumers and small and medium-sized enterprises, it is critical that we make sure we are ahead of the competition.
For the financial services sector, the provisions on ID help to reduce costs for things like onboarding and reduce fraud for things like authorised push payments. It also delivers a better customer experience, so you do not have to rummage around to find your passport every time you want to set up a new account or need to verify yourself to a financial service firm.
Smart data is an opportunity for us to extend ourselves as the world leader in open finance, building on the work of not only open banking but the pensions dashboard, which is yet to be launched but is another open finance scheme. The opportunity to widen this out and give consumers more control over their ability to share data is critical for the customer, the economy and the financial services industry.
Q
Phillip Mind: In the banking industry we have open banking, which allows customers to choose and consent to allow an authorised third party provider access to their account to provide products and services—access to see the data. It also allows—again, with customer choice and consent—customers to allow a third party provider to make payments on their behalf. That has been hugely enabling. It has enabled growth in all sorts of innovative products and services and growth in fintech in the UK. As Harry mentioned, there are over 7 million active customers at the moment, but it does come with a cost; it is not a free good. Making that service available has involved cost and complexity.
In extending the provisions to other sectors through secondary legislation, it is really important that we are cognisant of the impacts and the unintended consequences. Many sectors have pre-existing data-sharing arrangements, many of which are commercial, and it is important that we understand the relative costs and benefits and how they fall among different participants in the market. My caution to the Committee and to Government is to go into those smart data schemes with eyes open.
Q
Phillip Mind: Clauses 62 and 64 make provision for the Secretary of State and Treasury to consult on smart data schemes. We think that those provisions could be strengthened. We see a need for impact assessments, cost-benefit analysis and full consultation. The Bill already allows for a post-implementation review, and we would advise that too.
Harry Weber-Brown: I think the other one to call out is the pensions dashboard, which has been driven out of the Money and Pensions Service. Although it has not actually launched yet, it has brought the life assurance industry on side to develop free access to information. The consumer can see all their pensions holdings in a single place, which will then help them to make better financial decisions.
I think my former employer, the Investing and Saving Alliance, was working on an open savings, investments and pensions scheme. Obviously, that is not mandatory, but this is where the provision for secondary legislation is absolutely imperative to ensure that you get a wide scope of firms utilising this. At the moment, it is optional, but firms are still lining up and wanting to use it. There is a commitment within the financial services industry to do this, but having the legislation in place—secondary legislation, in particular—will ensure that they all do it to the same standards, both technical and data, and have a trust framework that wraps around it. That is why it is so imperative to have smart data.
Q
Harry Weber-Brown: In part 2 or part 3 of the Bill? The digital verification services or smart data?
Welcome, Mr Rosser. We have just 15 minutes, until 3.05 pm, for this session. Would you kindly introduce yourself to the Committee for the record?
Keith Rosser: My name is Keith Rosser. I am the chair of the Better Hiring Institute.
Q
Keith Rosser: Employers have been making hiring decisions using digital identity since 1 October, so we are a live case study. The biggest impact so far has been on the speed at which employers are able to hire staff and on the disconnection between where people live and the location of their job. For example, people in a digital identity scheme could apply for work, get a job and validate who they are without ever necessarily having to go and meet the employer. It is really important across the regions, from St Austell to Glasgow, that we are opening up job opportunities across the UK, including in some of our urban areas—West Bromwich, Barnsley and others—where people get greater job opportunities from where they live because they are not tied to where the employer is. It has had a profound effect already.
We recently looked at a study of 70,000 hires or people going through a hiring process, and 83%—some 58,000—opted to take the digital identity route. They did it in an average time of three minutes and 30 seconds. If we compare that with having to meet an employer and go through a process to provide your physical documents, there is a saving of around a week. If we think about making UK hiring the fastest globally, which is our ambition, people can start work a week earlier and pay taxes earlier, and we are cutting waiting lists and workloads. There is a huge positive impact.
In terms of employers making those hiring decisions, technology is so much better than people at identifying whether a document is genuine and the person is who they say they are. In that case study, we found that 200 of the 70,000 people going through the process had fake documents or fraudulently obtained genuine documents. The question is, would the human eye have spotted that prior to the implementation of digital identity? I am certain that it would not have done. Digital identity is really driving the potential for UK hiring to be a shining example globally.
Q
Keith Rosser: From that 70,000 example, we have not seen evidence yet that public trust has been negatively impacted. There are some very important provisions in the Bill that will go a long way towards assuring that. One is the creation of a governance body, which we think is hugely important. There has to be a monitoring of standards within the market. It also introduces the idea of certifying companies in the market. That is key, because in this market right now 30% of DVSs—nearly one in three companies—are not certified. The provision to introduce certification is another big, important move forward.
We also found, through a survey, that we had about 25% fewer objections when a user, company or employer was working with a certified company. Those are two really important points. In terms of the provision on improving the fraud response, we think there is a real opportunity to improve what DVSs do to tackle fraud, which I will probably talk about later.
Q
Keith Rosser: I have every reason to believe that organisations not certified will not be meeting anywhere near the standards that they should be meeting under a certified scheme. That appears really clear. They certainly will not be doing as much as they need to do to tackle fraud.
My caveat here is that across the entire market, even the certified market, I think that there is a real need for us to do more to make sure that those companies are doing far more to tackle fraud, share data and work with Government. I would say that uncertified is a greater risk, certainly, but even with certified companies we must do more to make sure that they are pushed to meet the highest possible standards.
Welcome and thank you. Aimee Reed?
Aimee Reed: Hello, everybody. This is also my first appearance in front of a Bill Committee. I am the Director of Data at the Metropolitan Police Service. For my sins, I also volunteer to lead all 43 forces on data; I am chair of the national police data board. I am here today in that capacity as well.
Q
Aimee Reed: It is a big requirement across all 43 forces, largely because, as I am sure you are aware, we are operating on various aged systems. Many of the technology systems across the policing sector do not have the capacity to log section 62 requirements, so police officers are having to record extra justification in spreadsheets alongside the searches and release of information that they deliver. So the requirement is a considerable burden across all the forces.
Q
“detecting, investigating or preventing crime”,
to quote the new definition, aid the tackling of serious crime in the UK?
Helen Hitching: Sorry—could you repeat that?
Sure. My understanding of the legislation in front of us is that if the Bill becomes law,
“detecting, investigating or preventing crime”
will be listed as a recognised legitimate interest and therefore be subject to separate, or slightly amended, data rules. How will that change help tackle serious crime in the UK?
Helen Hitching: I think it will bring a level of simplicity across the data protection environment and make sure that we can share data with our policing colleagues and other services in a more appropriate way. It will make the whole environment less complex.
Q
Helen Hitching: Yes, it will aid it. Again, it brings in the ability to put the data protection framework on the same level, so we can share data in an easier fashion and make it less complex.
Q
Helen Hitching: The agency does not believe that those safeguards will be lowered. We will still not be able to share data internationally with countries that do not have the same standards that are met by the UK. It will provide greater clarity about which regimes should be used and at which point. The standards will not reduce.
We now come to our ninth panel. We welcome Andrew Pakes, who is director of communications and research at Prospect, and Mary Towers, who is the policy officer at the Trades Union Congress. We have until 3.55 pm for this session. I invite the witnesses to introduce themselves to the Committee for the record—ladies first.
Mary Towers: Hi, and thanks very much for inviting the TUC to give evidence today. My name is Mary Towers. I am an employment rights policy officer at the TUC, and I have been leading a project at the TUC looking at the use of AI in the employment relationship for the past couple of years.
Andrew Pakes: Hello, everyone. Thank you for inviting Prospect to give evidence today. My name is Andrew Pakes. I am one of the deputy general secretaries and the research lead for Prospect union, which represents scientific, technical and professional workers. I am also a member of the OECD’s AI expert panel, representing trade unions.
Q
Andrew Pakes: We were already seeing a huge change in the use of digital technology prior to the pandemic. The pandemic itself, not least through all the means that have kept many of us working from home, has transformed that. Our approach as a trade union is to embrace technology. We believe that our economy and the jobs our members do can be made better and more productive through the good deployment of technology to improve jobs.
We also think there is a downside to it all, and everything there needs to be weighed and balanced. Alongside the advance in innovation and technology that has brought benefits to the UK, we have seen a rise in the darker or less savoury side of that: namely, the rise of surveillance software; the ability of software to follow us, including while working from home, and to micromanage us and track people; and the use of technology in performance management—the so-called people analytics or HR management, which is largely an unregulated area.
If you ask me which legislation this should sit in, I would probably say an employment-type Bill, but this is the legislation we have and the Government’s choice. We would definitely like to see checks and balances at least retained in the new legislation compared with GDPR, but maybe they should be enhanced to ensure that there is some form of social partnership and that working people have a say over how technology is introduced and implemented in their workspaces.
Q
Andrew Pakes: There is increasing evidence that while technology has allowed many of us to remain connected to our workspaces—many of us can now take our work anywhere—the downside is that our work can follow us everywhere. It is about the balance of digital disconnection and the ability to switch off from work. I am probably preaching to the wrong crowd, because MPs are constantly on their phones and other technology, but many of us are able to put that away, or should do, because we are contracted workers and have a different relationship with our workplace in terms of how that balance is struck. We very much focus on wellbeing and on information and consultation, ensuring that people are aware of the information that is collected on us.
One of the troubling factors that we and the TUC have picked up is that consistently, in opinion polls and research that is done, working people do not have confidence or knowledge about what level of data is being collected and used on them. When we see the increasing power of technology through AI and automated decisions, anxiety in the workplace is best allayed by transparency, in the first place, and, we would obviously argue, a level of social partnership and negotiation over how technology is introduced.
Q
Andrew Pakes: Absolutely. What strikes me about the legislation you are considering is that just about all our major competitors—who are more productive and more advanced, often in innovation, including the United States—are choosing a path of greater scrutiny and accountability for AI and automated decision making. There is a concern that in this legislation we are taking an alternative path that makes us stand out in the international economy, which is about diluting existing protections we have within GDPR to a lower level. That raises concerns.
We have particular concerns about automated technology, but also about the clauses on the reduction of powers around data protection impact assessments. We think the risk is that the legislation could open the back door to the increase in dodgy surveillance and other forms of software coming into the UK market. I am worried about that for two reasons: first, because of the impact it has on individual workers and what is happening there; and secondly, because most of this technology—we have been part of a project that has tracked over 500 different surveillance software products currently on the international market—is designed largely for a US or Chinese market, with little knowledge of how it is being done.
What we know from the existing DPIA arrangements is that there is a check in the current rules that ensures employers carry out a consultation and examine where their products are taking their data from and what they have stored. Diluting that risks leaving us unsure where that data is being used and unsure of the power of this technology, and working people then end up with a worse deal than they currently have.
Q
Mary Towers: On the contrary, we would say that the Bill in fact reduces the collective rights of workers, particularly in relation to data protection impact assessments. As Andrew has mentioned, at the moment the right to a data protection impact assessment involves an obligation on an employer to consult with workers or their representatives. That is an absolutely key tool for trade unions to ensure that worker voice is represented when new technologies are introduced at work. Also, at the moment, missing from the Bill is the ability of trade unions to act as representatives for data subjects in a collective way. We say that that, too, is missing, could be added and would be an important role that unions could take on.
Another aspect missing from the Bill, which we say is a hugely missed opportunity, is a potential right that workers could have to have an equal right to their data that matches the right employers have over worker data. Once workers had that right, they could then collectivise their own data, which would enable them, for example, to pick up on any discriminatory patterns at work or pick up any problems with equal pay or the gender pay gap. We say that that right to collectivise data and redress the imbalance of power over data at work is really important.
The Bill misses entirely the opportunity to introduce those kinds of concepts, which are actually vital in the modern workplace, where data is everything. Data is about control; data is about influence; data is the route that workers have to establish fair conditions at work. Without that influence and control, there is a risk that only one set of interests is represented through the use of technology at work, and that technology at work, rather than being used to improve the world of work, is used to intensify work to an unsustainable level.
Q
Mary Towers: Yes. This is something that Andrew’s union, Prospect, has been really active in. It has produced some absolutely brilliant guidance that looks in detail at the importance of the process of data protection impact assessments and rolled out training for its trade union reps. Again, several of our other affiliates have undertaken that really important work, which is then being rolled out into the workplace to enable reps to make good use of that process.
I will, however, add the caveat that I understand from our affiliates that there is a very low level of awareness among employers about that obligation, about the importance of that process and about exactly what it involves. So a really important piece of awareness-raising work needs to be done there. We say it is vital to build on the existing rights in the UK GDPR, not dilute or remove them.
Q
Andrew Pakes: We would assert that under the law of GDPR, high risk in the legislation is, I think, in recital 39. I will correct that if I picked the wrong one. It talks about high risk as being decisions that can make material or non-material impact on people. If we now have software and algorithms or automated decisions that can hire and fire us—we have examples of that—and can decide who deserves a promotion or who can be disciplined, if that information can now be used to track individuals and decide whether someone is a good or bad worker, we would assert that that is a high risk. Anything that can actually affect both your standing in your workspace or your contractual relationship, which is essentially what employment is, or which has an impact on the trust and confidence the employer has in you and, equally, your trust and confidence back in the employer, that is a very clear definition of high risk.
What is important about the existing UK GDPR is that it recognises the nature of high risk but, secondarily, it recognises that data subjects themselves must be consulted and involved either directly or, where that is not practicable, through their representatives. Our worry is that the legislation that is tabled now dilutes that and opens up risk to bad practice.
Q
Mary Towers: The right to a data subject access request—again, like the DPIAs—is an absolutely crucial tool for trade unions in terms of establishing transparency over how their data is being used. Really, it provides a route for workers and unions to get information about what is going on in the workplace, how technologies operate and how they are operating in relation to individuals. It is a vital tool for trade unions.
What we are concerned about is that the new test specified in the Bill will provide employers with very broad discretion to decide when they do not have to comply with a data subject access request. The use of the term “vexatious or excessive” is a potential barrier to providing the right to an access request and provides employers with a lot of scope to say, for example, “Well, look, you have made a request several times. Now, we are going to say no.” However, there may be perfectly valid reasons why a worker might make several data subject access requests in a row. One set of information that is revealed may then lead a worker to conclude that they need to make a different type of access request.
We say that it is really vital to preserve and protect the right for workers to access information. Transparency as a principle is something that, again, goes to really important issues. For example, if there is discriminatory operation of a technology at work, how does a worker get information about that technology and about how the algorithm is operating? Data subject access requests are a key way of doing that.
Q
Andrew Pakes: “If we get this right” is doing a lot of heavy lifting there; I will leave it to Members to decide the balance. That should be the goal. There is a wonderful phrase from the Swedish trade union movement that I have cited before: “Workers should not be scared of the new machines; they should be scared of the old ones.” There are no jobs, there is no prosperity and there is no future for the kind of society that our members want Britain to be that does not involve innovation and the use of new technology.
The speed at which technology is now changing and the power of this technology compared with previous periods of economic change make us believe that there has to be a good, robust discussion about the balance of checks and balances in the process. We have seen in larger society—whether through A-level results, the Post Office or other things—that the detriment to the individuals impacted is significant if legislators get that balance wrong. I agree with the big principle and I will leave you to debate that, but we would certainly urge that checks and balances need to be balanced, not one-sided.
Mary Towers: Why does respect for fundamental rights have to be in direct conflict with growth and innovation? There is not necessarily any conflict there. Indeed, in a workplace where people are respected, have dignity at work and are working in a healthy way, that can only be beneficial for productivity and growth.
Q
Alexandra Sinclair: Thank you for the question. In order for the public to have trust and buy-in to these systems overall, so that they can benefit from them, they have to believe that their data is being used fairly and lawfully. That requires knowing which criteria are being used when making a decision, whether those criteria are relevant, and whether they are discriminatory or not. The first step to accountability is always transparency. You can know a decision is fair or lawful only if you know how the decision was made in the first place.
Q
Alexandra Sinclair: Currently the Government have their algorithmic reporting transparency standard—I think I have got that right; they keep changing the acronym. Currently on that system there are about six reports of the use of automated decision-making technology in government. The Public Law Project decided to create a parallel register of the evidence that we could find for automated decision making in government. Our register includes over 40 systems in use right now that involve partly automated decisions about people. It would be great if the Government themselves were providing that information.
Q
“There are clear benefits to organisations, individuals and society in explaining algorithmic decision-making”
in the public sector. Do you think that measures in the Bill achieve that? Do they unlock benefits and explain the Government’s algorithmic decision making to the public?
Alexandra Sinclair: No, and I think they do not do that for three reasons, if I have the time to get into this. The changes to subject access requests, to data protection impact assessments and to the prohibition on article 22 are the key issues that we see. The reason why we are particularly worried about subject access requests and data protection impact assessments is that they are the transparency provisions. They are how you find out information about what is happening. A subject access request is how you realise any other right in the Bill. You can only figure out if an error has been made about your data, or object to your data, if you know how your data is being used in the first place.
What we are worried about with the Bill is that you currently have an almost presumptive right to your data under a subject access request, but the change in the Bill changes the standard from the current “manifestly unfounded or excessive” to “vexatious or excessive”. It also gives a whole load of factors that data controllers are now allowed to take into account when declining your request for your own data. Furthermore, under the proposal in the Bill they do not have to give you the reason why they declined your request for the data. We think that is really problematic for individuals. You have got this information asymmetry there, and it is going to be really difficult for you to prove that your request was not vexatious or excessive if you do not even know why it was denied in the first place.
If we think about some examples that we have been talking about in Committee today, in a lot of the Uber and Ola-led litigation, where individuals were able to show that their employment rights had been unfairly treated, they were able to find out about that through subject access requests. Another example is the London Met police’s gangs matrix. The Information Commissioner’s Office did a review of that matrix and found that the system did not even clearly distinguish between victims and perpetrators of crime, and the only way for individuals to access the matrix and check if the information held on them is accurate is through a subject access request. That is our first concern with the Bill.
Our second concern is the changes to data protection impact assessments. The first thing to note is that they already have to apply only in high-risk processing situations, so we do not think that they are an undue or onerous burden on data controllers because they are already confined in their scope. What a data protection impact assessment does—this is what we think is beneficial about it—is not to be a brake on processing, but to force data controllers to think though the consequences of processing operations. It asks data controllers to think, “Where is that data coming from? What is the data source? Where is that data being trained? For what purpose is that data being used?” The new proposal under the Bill for data protection impact assessments significantly waters down those obligations: essentially, the only requirement is to account for the purposes of the data. So instead of explaining how the data is being used, you are only required to state that purpose.
We think that has two problems. First, data controllers will not be thinking through all the harms and consequences before they deploy a system. Secondly, if individuals affected by those systems want to get information about how their data was processed and what happened, there will be a lot less information on that impact assessment for them to assess the lawfulness of that processing.
My final critique of the Bill is this. We would say that the UK is world-leading in terms of article 22—other states are certainly looking to the UK—and it is a strange time to be looking to roll back protections. I do not know if Committee members have heard about how Australia recently experienced the Robodebt scandal, on which there is a royal commission at the moment. In that case, the system was a solely automated debt discrepancy system that ended up making over 500,000 incorrect decisions, telling people that they had committed benefit fraud when they had not. Australia is having to pay millions of dollars in compensation to those individuals and to deal with the human cost of that decision. The conversation in Australia right now is, “Maybe we should have article 22. Maybe this wouldn’t have happened if we had had a prohibition on solely automated decision making.” When other states are looking to beef up their AI protections, we need to think carefully about looking to roll them back.
Q
Jacob, what measures do you think should be in place to ensure that data protection legislation balances the need to protect national security with the need to uphold human rights? Does the Bill strike the right balance?
Jacob Smith: Thanks for the question. To take the second part first, we argue that the Bill does not strike the right balance between protecting national security and upholding data and privacy rights. We have three main concerns with how the Bill sets out that balance at the moment, and they come from clauses 24 to 26.
We have this altered regime of national security certificates for when law enforcement is taking measures in the name of national security, and we have this new regime of designation notices. When law enforcement and the security services are collaborating, the notices allow the law enforcement body working in that collaboration to benefit from the more relaxed rules that are generally reserved for the intelligence services.
From our perspective, there are three main concerns. First, we are not quite sure why these amendments are necessary. Under human rights law, for an interference with somebody’s data or privacy rights to be lawful, it needs to be necessary, and that is quite a high standard. It is not enough that it would be more convenient or more efficient for law enforcement to have access to the data; it has to meet a high standard of strict necessity. Looking through the Second Reading debate, the impact assessment and the European convention on human rights analysis, there is no reference to anything that would be akin to necessity. It is all, “It would be easier for law enforcement to have these extra powers. It would be easier if law enforcement were potentially able to use people’s personal data in more ways than they are at the moment.” But that is not the necessity standard.
The second concern is the lack of safeguards in the Bill. Another thing that human rights law—particularly article 8 of the ECHR—focuses on is the need for additional safeguards to prevent the misuse of legislation that allows public bodies to interfere with people’s privacy rights. As the Bill stands, we have very weak safeguards where national security certificates and designation notices are in place. There is an opportunity, at least on the face of the Bill, for both those measures to be challenged before the courts. However, the issue is that the Secretary of State has almost a monopoly over deciding whether those notices and certificates get published. So although, on the face of the Bill, an individual may be able to challenge a national security certificate or a designation notice that has affected them in some way, in practice they will not be able to do that if they do not know that it exists.
Finally, one overarching issue is the expansive powers for the Secretary of State. One thing that we advocate is increased independent oversight. In the Bill, the Secretary of State has an extremely broad role in authorising law enforcement bodies to process personal data in ways that would otherwise be unlawful, going further than the existing regimes under the Data Protection Act 2018. Those are our three broad concerns in that regard. Ultimately, we do not see that the right balance has been struck.
Q
Ms Irvine: We have concerns about the proposed changes and their potential impact on the independence of the Information Commissioner. I was able to listen to John Edwards speaking this morning, and I noted that he did not share those concerns, which I find surprising. The ICO is tasked with producing statutory codes of conduct, which are incredibly useful for my clients and for anyone working in this sector. The fact that the Secretary of State can, in effect, overrule these is concerning, and it must be seen as a limit on the Information Commissioner’s independence.
That leads to a concern that we have in relation to the adequacy decision that is in place between the EU and the United Kingdom. Article 52 of the GDPR states very clearly that a supervisory authority must have complete independence. The provisions relating to the independence of the Commissioner—the potential interference of the Secretary of State in law is enough to undermine independence—are therefore of concern to us.
Alexandra Sinclair: We would just say that it is not typical for an independent regulator to have its strategic objectives set by a Minister, or for a Minister to set those priorities without necessarily consulting. We consider that the ICO, as the subject matter expert, is probably best placed to set them.
Jacob Smith: From our perspective, the only thing to add is that one way to improve the clauses on national security certificates and designation notices would be to give the ICO an increased role in oversight and monitoring, for instance. Obviously, if there are concerns about its independence, we would want to consider other mechanisms.
Q
Ms Irvine: Certainly. There are terms that have been used in data protection law since the 1984 Act. They were used again in the 1998 Act, echoed under the GDPR and included in all the guidance that has come from the Information Commissioner’s Office over recent years. In addition to that, there is case law that has interpreted many of those terms. Some of the proposed changes in the Bill introduce unexpected and unusual terms that will require interpretation. Even then, once we have guidance from the Information Commissioner, that guidance is sometimes not as helpful as interpretation by tribunals and courts, which is pretty sparse in this sector. The number of cases coming through the courts is limited—albeit that there is a lot more activity in the sector than there used to be. The result is simply more questions and more uncertainty.
For my business clients, that is a great difficulty, and I certainly spend a lot of time advising clients on how I believe a matter—a phrase—will be interpreted, because I have knowledge of how data protection law works in general. That draws on my experience of acting for businesses and organisations, particularly in the third sector. Smaller bodies will often be challenged by a lack of knowledge and expertise, and that is the difficulty of introducing in legislation brand-new terms that are not familiar to practitioners, far less to the organisations asked to implement the changes.
Q
Alex Lawrence-Archer: There is a group of changes in the Bill that, perhaps in ways that were unintended or at least not fully thought through, quite seriously undermine the protection of individuals’ privacy and data rights. A few of the most concerning are the change to the definition of personal data, recognised legitimate interests, purpose limitation and changes to the test for the exercise of data subject rights—I could go on. You will have heard about many of those today. It amounts to an undermining of data rights that seems out of proportion to the relatively modest gains in terms of reduced bureaucracy for data controllers.
Q
Alex Lawrence-Archer: It is quite difficult to predict, because it is complicated, but it is foundational to the regime of data protection. One of the issues is that in seeking to relieve data controllers of certain bureaucratic requirements, we are tinkering with these really foundational concepts such as lawful basis and the definition of personal data.
Two things could happen, I think. Some quite bad-faith arguments could be run to take quite a lot of processing outside the scope of the data protection regime. Although I doubt that those arguments would succeed, there is an additional issue; it is quite complicated to explain, but I will try. If it is unlikely but possible that an individual might be re-identified from a pseudonymised dataset—it could happen if there were a hack, say, but it is unlikely—that processing would not, as the Bill is drafted, benefit from the protection of the regime. The data would not be considered personal data, because it would not be likely that the individual could be identified from the dataset. That is a real problem, because pseudonymisation is very common with large datasets. There are real risks there that would not be dealt with.
Q
Alex Lawrence-Archer: Under the current regime, that is a bit like asking, “How long is a piece of string?” It can take quite a long time. There are certain practices that the ICO follows in terms of requiring individuals to complain to the controller first. Some controllers are good; some are quick, but some are not. You might have a lot of back and forth about data access at the beginning, but other controllers might hand over your data really quickly. However, you could be looking at anything up to, say, 10 to 12 months.
Q
Alex Lawrence-Archer: Yes. You have heard from lots of people about the changes to the standard to be applied when any of the rights in chapter 3 are exercised by a data subject, and that includes the right of access. I think it is very likely that many more exercises of the right of access will be refused, at least initially. I think there will be many more complaints about the right of access, and there is likely to be satellite litigation about those complaints as well, because you cannot find out what has gone on with your data and rectify a problem unless you have access to copies of it.
So, what you might find in many cases is a two-stage process whereby, first, you must resolve a complaint, maybe even a court case, about your right to access the data and then, and only then, can you figure out what has actually been going on with it and resolve the underlying unlawfulness in the processing. Effectively, therefore, it is a doubling of the process for the individual.
Q
Alex Lawrence-Archer: The new definitions, particularly the list of factors to be taken into consideration in determining whether the test is met, provide a lot of breathing room for controllers, whether or not they have good intentions, to make arguments that they do not need to comply with the right of access. If you are looking not to comply or if you have an incentive not to, as many controllers do, that does not necessarily mean that you are acting in bad faith; you might just not want to hand over the data and think that you are entitled not to do so. If you are looking not to comply, you will look at the Act and see lots of hooks that you can hang arguments on. Ultimately, that will come back to individuals who are just trying to exercise their rights and who will be engaged in big arguments with big companies and their lawyers.
Q
Alex Lawrence-Archer: The age-appropriate design code was a real success for the UK in terms of its regulation and its reputation internationally. It clarified the rights that children have in relation to the processing of their personal data. However, those rights are only helpful if you know what is happening to your personal data, and if and when you find out that you can exercise your rights in relation to that processing.
As I have said, what the Bill does—again, perhaps inadvertently—is undermine in a whole host of ways your ability to know what is happening with your personal data and to do something about it when you find out that things have gone wrong. It seems to me that on the back of a notable success with the AADC, we are now, with this Bill, moving in rather a different direction on the protection of personal data.
Looking at the even longer term, there will be some slightly more nuanced changes if and when the AADC comes to be amended or redrafted, because of the changes to the role of the ICO, the factors it has to take into account and its independence, which again you have already heard about. So you could, in the long term, see a new version of the AADC that is potentially more business-friendly because of this Bill.