Good afternoon, my Lords. As usual, I begin by reminding your Lordships that if there is a Division in the Chamber while we are sitting, the Committee will adjourn as soon as the Division Bells are rung and resume after 10 minutes.
Once more unto the breach, my Lords—as opposed to “my friends”.
I will also speak to Amendments 112 to 114, 116 and 130. New Article 45B(2) lists conditions that the Secretary of State must consider when deciding whether a third country provides an adequate level of protection for data subjects. It replaces the existing conditions in Article 45(2)(a) to (c) of the UK GDPR, removing important considerations such as the impact of a third country’s laws and practices in relation to national security, defence, public security, criminal law and public authority access to personal data on the level of protection provided to UK data subjects.
Despite this shorter list of conditions to consider, the Secretary of State is none the less required to be satisfied that a third country provides a level of protection that is not materially lower than the UK’s. It is plain that such an assessment cannot be made without considering the impact of these factors on the level of protection for UK data in a third country. It is therefore unclear why the amendment that the Government have made to Article 45 is necessary, beyond a desire for the Government to draw attention away from such contentious and complicated issues.
It may be that through rewriting Article 45 of the UK GDPR, the Government’s intention is that assimilated case law on international data transfers is no longer relevant. If that is the case, that would be a substantial risk for UK data adequacy. Importantly, new Article 45B(2) removes the reference to the need for an independent data protection regulator in the relevant jurisdiction. This, sadly, is consistent with the theme of diminishing the independence of the ICO, which is one of the major concerns in relation to the Bill, and it is also an area where the European Commission has expressed concern. The independence of the regulator is a key part of the EU data adequacy regime and is explicitly referenced in Article 8 of the Charter of Fundamental Rights, which guarantees the right to protection of personal data. Amendment 111 restores the original considerations that the Secretary of State must take into account.
Amendments 112 and 113 would remove the Secretary of State’s proposed powers in Schedules 5 and 6 to assess other countries’ suitability for international transfers of data, and confer these on the new information commission instead. In the specific context of HIV—the provenance of these amendments lies in the National AIDS Trust’s suggestions—it is unlikely that the Secretary of State or their departmental officials will have the specialist knowledge to assess whether transferring data related to an individual’s HIV status to a third country risks harm to that individual. Given that the activities of government departments are political by their nature, the Secretary of State’s decisions on the suitability of transfers to third countries may not be viewed as objective by individuals whose personal data is transferred. Many people living with HIV feel comfortable reporting breaches of data protection law in relation to their HIV status to the Information Commissioner’s Office because of its position as an independent regulator, so the National AIDS Trust and others recommend that the Bill confer these regulatory powers on the new information commission it creates, as this may inspire greater public confidence.
As regards Amendment 114, paragraph 5 of Schedule 5 should contain additional provisions to mandate annual review of the data protection test for each third country to which data is transferred internationally to ensure that the data protection regime in that third country is secure and that people’s personal data, such as their HIV status, will not be shared inappropriately. HIV is criminalised in many countries around the world, and the transfer to these countries of personal data such as an individual’s HIV status could put an individual living with HIV, their partner or their family members at real risk of harm. This is because HIV stigma is incredibly pronounced in many countries, which fosters a real risk of HIV-related violence. Amendment 114 would mandate this annual review.
As regards Amendment 116, new Article 47A(4) to (7) gives the Secretary of State a broad regulation-making power to designate new transfer mechanisms for personal data being sent to a third country in the absence of adequacy regulations. Controllers would be able to rely on these new mechanisms, alongside the existing mechanisms in Article 46 of the UK GDPR, to transfer data abroad. In order to designate new mechanisms, which could be based on mechanisms used in other jurisdictions, the Secretary of State must be satisfied that these are
“capable of securing that the data protection test set out in Article 46 is met”.
The Secretary of State must be satisfied that the transfer mechanism is capable of providing a level of protection for data subjects that is not materially lower than under the UK GDPR and the Data Protection Act. The Government have described this new regulation-making power as a way to future-proof the UK’s GDPR international transfers regime, but they have not been able to point to any transfer mechanisms in other countries that might be suitable to be recognised in UK law, and nor have they set out examples of how new transfer mechanisms might be created.
In addition to not having a clear rationale to take the power, it is not clear how the Secretary of State could be satisfied that a new mechanism is capable of providing the appropriate level of protection for data subjects. This test is meant to be a lower standard than the test for controllers seeking to rely on a transfer mechanism to transfer overseas, which requires them to consider that the mechanism provides the appropriate level of protection. It is not clear to us how the Secretary of State could be satisfied of a mechanism’s capability without having a clear sense of how it would be used by controllers in reality. That is the reason for Amendment 116.
As regards Amendment 130, Ministers have continued all the adequacy decisions that the EU had made in respect of third countries when the UK stopped being subject to EU treaties. The UK also conferred data adequacy on the EEA, but all this was done on a transitional basis. The Bill now seeks to continue those adequacy decisions, but no analysis appears to have been carried out as to whether these jurisdictions confer an adequate level of protection of personal data. This is not consistent with Section 17B(1) of the DPA 2018, which states that the Secretary of State must carry out a review of whether the relevant country that has been granted data adequacy continues to ensure an adequate level of protection, and that these reviews must be carried out at intervals of not more than four years.
In the EU, litigants have twice brought successful challenges against adequacy decisions. Those decisions were deemed unlawful and quashed by the European Court of Justice. It appears that this sort of challenge would not be possible in the UK because the adequacy decisions are being continued by the Bill and therefore through primary legislation. Any challenge to these adequacy decisions could result only in a declaration of incompatibility under the Human Rights Act; it could not be quashed by the UK courts. This is another example of how leaving the EU has diminished the rights of UK citizens compared with their EU counterparts.
As well as tabling those amendments, I support and have signed Amendment 115 in the names of the noble Lords, Lord Bethell and Lord Kirkhope, and I look forward to hearing their arguments in relation to it. In the meantime, I beg to move.
My Lords, I rise with some temerity. This is my first visit to this Committee to speak. I have popped in before and have been following it very carefully. The work going on here is enormously important.
I am speaking to Amendment 115, thanks to the indulgence of my noble friend Lord Bethell, who is the lead name on that amendment but has kindly suggested that I start the discussions. I also thank the noble Lord, Lord Clement-Jones, for his support. Amendment 115 has one clear objective: to prevent the transfer of UK user data to jurisdictions where data rights cannot be enforced and there is no credible right of redress. The word “credible” is important in this amendment.
I thank my noble friend the Minister for his letter of 11 April, which he sent to us to try to mop up a number of issues. In particular, in one paragraph he referred to the question of adequacy, which may also touch on what the noble Lord, Lord Clement-Jones, has just said. The Secretary of State’s powers are also referred to, but I must ask: how, in a fast-moving or unique situation, can all the factors referred to in this long and comprehensive paragraph be considered?
The mechanisms of government and government departments must be thorough and in place to discharge satisfactorily what are, I think, somewhat grand intentions. I say that from a personal point of view, because I was one of those who drafted the European GDPR—another reason I am interested in discussing these matters today—and I was responsible for the adequacy decisions with third countries. The word “adequacy” matters very much in this group, in the same way that we were unable to use “adequacy” when we dealt with the United States and had to look at “equivalence”. Adequacy can work only if one is working to similar parameters. If one is constitutionally looking at different parameters, as is the case in the United States, then the word “equivalence” becomes much more relevant, because, although administration and regulation cannot be carried out in quite the same way, an equivalence arrangement can be acceptable and can lead to the understanding of adequacy that we are looking for.
I have a marvellous note here, which I am sure noble Lords have already talked about. It says that every day we generate 181 zettabytes of personal data. I am sure noble Lords are all aware of zettabytes, but I will clarify. One zettabyte is 1,000 exabytes—which perhaps makes it simpler to understand—or, if you like, 1 billion trillion bytes. One’s mind just has to get around this, but this is data on our movements, finances, health and families, from our cameras, phones, doorbells and, I am afraid, even from our refrigerators—though Lady Kirkhope refuses point blank to have any kind of detector on her fridge door that will tell anybody anything about us or what we eat. Increasingly, it is also data from our cars. Our every moment is recorded—information relating to everything from shopping preferences to personal fitness to our anxieties, even, as they are displayed or discussed. It is stored by companies that we entrust with that data and we have a right to expect that such sensitive and private data will be protected. Indeed, one of the core principles of data protection, as we all know, is accountability.
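For scale, the arithmetic behind these units can be checked directly. This is a minimal sketch assuming decimal (SI) units, with the 181-zettabyte figure taken as quoted in the speech and the variable names purely illustrative:

```python
# Unit arithmetic for the data volumes quoted above (decimal SI units).
BYTES_PER_EXABYTE = 10**18
BYTES_PER_ZETTABYTE = 10**21

# One zettabyte is 1,000 exabytes...
assert BYTES_PER_ZETTABYTE == 1_000 * BYTES_PER_EXABYTE
# ...or, equivalently, "1 billion trillion bytes" (10**9 * 10**12 = 10**21).
assert BYTES_PER_ZETTABYTE == 10**9 * 10**12

# The figure quoted in the debate: 181 zettabytes of personal data a day.
daily_bytes = 181 * BYTES_PER_ZETTABYTE
print(f"{daily_bytes:.2e} bytes per day")  # -> 1.81e+23 bytes per day
```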
Article 79 of the UK GDPR and Section 167 of our Data Protection Act 2018 provide that UK users must have the right to effective judicial remedy in the event of a data protection breach. Article 79 says that
“each data subject shall have the right to an effective judicial remedy where he or she considers that his or her rights under this Regulation have been infringed as a result of the processing of his or her personal data in non-compliance with this Regulation”.
My Lords, I will speak to Amendment 115 in my name. I start by saying a huge thanks to the noble Lord, Lord Clement-Jones, and my noble friend Lord Kirkhope, who have put everything so well and persuasively that I have almost nothing else to say in support. I am looking forward to the Minister throwing in the towel and accepting all the measures as suggested. Noble Lords have really landed it well.
I shall not go through the principle behind my amendment because, frankly, its benefit is so self-evident and clear that it does not need to be rehearsed in great detail. What I want to get across is the absolute and paramount urgency of the Government adopting this measure or a similar one. This is a terrific Bill; I thank the Minister for all the work that he and his team have done on it. I sat through Second Reading, although I did not speak on that day, when the Minister gave a persuasive account of the Bill; we are grateful for that.
However, this is a massive gap. It is a huge lacuna in the provisions of a Bill called a data protection Bill. It is a well-known gap in British legislation—and, by the way, in the legislation of lots of other countries. We could try to wait for an international settlement—some kind of Bretton Woods of data—where all the countries of the world put their heads together and try to hammer out an international agreement on data. That would be a wonderful thing but there is no prospect whatever of it in sight, so the time has come for countries to start looking at their own unilateral arrangements on the international transfer of data.
We have sought to duck this commitment by stringing together a Heath Robinson set of arrangements around transfer risk assessments and bilateral agreements with countries. This has worked to some extent—at least to the extent that there is a booming industry around data. We should not diminish that achievement but there are massive gaps and huge liabilities in that arrangement, as my noble friend Lord Kirkhope rightly described, particularly now that we are living in a new, polarised world where countries of concern deliberately seek to harvest our data for their own security needs.
There are three reasons why this has become not just a chronic issue that could perhaps be kicked down the road a bit but an acute issue that should be dealt with immediately in the Bill’s provisions. The first, which my noble friend hinted at, is the massive flood of new data coming our way. I had the privilege of having a look at a BYD car. It was absolutely awesome and, by the way, phenomenally cheap; if the Chinese taxpayer is okay with subsidising our cars, I would highly recommend them to everyone here. One feature of the car is a camera on the dashboard that looks straight at the driver’s face, reading their emotional state; for instance, if you look weary, it will prompt you to stop and have a coffee. That is a lovely feature but it is also mapping your face for hours and hours every year and, potentially, conveying that information to the algorithmic artificial intelligence run by the CCP in China—something that causes me huge personal concern. Lady Kirkhope may be worried about her fridge but I am very worried about my potential car. I embrace the huge global growth of data exchanges and technology’s benefits for citizens, taxpayers and voters, but this must be done in a well-curated field. The internet of things, which, as many noble Lords will know, is a term coined by Kevin Ashton, is another aspect of this.
Secondly, the kind of data being exchanged is becoming increasingly sensitive. I have mentioned the video in the BYD car; genomics data is another area of grave concern. I have an associate fellowship at King’s College London’s Department of War Studies, looking specifically at bioweapons and the transfer of genomic data. Some of this is on the horizon; it is not of immediate use from a strategic and national security point of view today but the idea that there could be, as in a James Bond film, some way of targeting individuals with poisons based on their genomic make-up is not beyond imagination.
The idea that you could create generalised bioweapons around genomics or seek to influence people based in part on insight derived from their genomic information is definitely on the horizon. We know that because China is doing some of this already; in the west of China, it is able to identify members of the Uighur tribes. In fact, China can say to someone, “We’re calling you up because we know that you’re the cousin of someone who is in prison today”, and this has happened. How does China know that? It has done it through the genomic tracking in its databases. China’s domestic use of data, through the social checking of genomic data and financial transactions, is a very clear precedent for the kinds of things that could be applied to the data that we are sharing with such countries.
Thirdly, there is the sensitivity of what uses the data is being put to. The geopolitics of the world are changing considerably. We now have what the Americans call countries of concern that are going out of their way to harvest and collect data on our populations. It is a stated element of their national mission to acquire data that could be used for national security purposes. These are today’s rivals but, potentially, tomorrow’s enemies.
For those three reasons, I very much urge the Minister to think about ways in which provisions on the international transfer of data could be added to the Bill. Other countries are certainly looking at the same; on 28 February this year, President Biden issued executive order 14117, which in many ways echoes the themes of our Amendment 115. It says clearly that there is an “unacceptable risk” to US national security from the large sharing of data across borders and asks the DoJ to publish a “countries of concern” list. That list has already been published and the countries on it are as the Committee would expect. It also seeks to define priority data. In other words, it is a proportionate, thoughtful and sensible set of measures to try to bring some kind of guard-rail to an industry where data transfer is clearly of grave concern to Americans. It looks particularly at genomic and financial transaction data but it has the capacity to be a little broader.
I urge the Minister to consider that this is now the time for unilateral action by the British Government. As my noble friend Lord Kirkhope said, if we do not do that, we may find ourselves being left behind by the EU, including the Irish, by the Americans and so on. There is an important spill-over effect from Britain acting sensibly that will do something to inspire and prod others into action. It is totally inappropriate to continue this pretence that British citizens are having their data suitably protected by the kind of commercial contracts that they are signing, which have no kind of redress or legal standing in the country of destination.
Lastly, the commercial point is very important. For those of us who seek to champion an open, global internet and a free flow of data while facilitating investment in that important trade, we must curate and care for it in a way that instils trust and responsibility, otherwise the whole thing will be blown up and people will start pulling wires out of the back of machines.
My Lords, I am very grateful to the noble Lords, Lord Clement-Jones, Lord Bethell and Lord Kirkhope, for tabling these amendments and for enabling us to have a good debate on the robustness of the proposed international data rules, which are set out in Schedules 5 and 7. Incidentally, I do not share the enthusiasm expressed by the noble Lord, Lord Bethell, for the rest of the Bill, but on this issue we are in agreement—and perhaps the other issues are for debate some other time.
I welcome the Committee back after what I hope was a good Easter break for everybody. I thank all those noble Lords who, as ever, have spoken so powerfully in this debate.
I turn to Amendments 111 to 116 and 130. I thank noble Lords for their proposed amendments relating both to Schedule 5, which reforms the UK’s general processing regime for transferring personal data internationally and consolidates the relevant provisions in Chapter 5 of the UK GDPR, and to Schedule 7, which introduces consequential and transitional provisions associated with the reforms.
Amendment 111 seeks to revert to the current list of factors under the UK GDPR that the Secretary of State must consider when making data bridges. With respect, this more detailed list is not necessary as the Secretary of State must be satisfied that the standard of protection in the other country, viewed as a whole, is not materially lower than the standard of protection in the UK. Our new list of key factors is non-exhaustive. The UK courts will continue to be entitled to have regard to CJEU judgments if they choose to do so; ultimately, it will be for them to decide how much regard to have to any CJEU judgment on a similar matter.
I completely understand the strength of noble Lords’ concerns about ensuring that our EU adequacy decisions are maintained. This is also a priority for the UK Government, as I and my fellow Ministers have repeatedly made clear in public and on the Floor of the House. The UK is firmly committed to maintaining high data protection standards, now and in future. Protecting the privacy of individuals will continue to be a national priority. We will continue to operate a high-quality regime that promotes growth and innovation and underpins the trustworthy use of data.
Our reforms are underpinned by this commitment. We believe they are compatible with maintaining our data adequacy decisions from the EU. We have maintained a positive, ongoing dialogue with the EU to make sure that our reforms are understood. We will continue to engage with the European Commission at official and ministerial levels with a view to ensuring that our respective arrangements for the free flow of personal data can remain in place, which is in the best interests of both the UK and the EU.
We understand that Amendments 112 to 114 relate to representations made by the National AIDS Trust concerning the level of protection for special category data such as health data. We agree that the protection of people’s HIV status is vital. It is right that this is subject to extra protection, as is the case for all health data and special category data. As I have said previously in this Committee, we have met the National AIDS Trust to discuss the best solutions to the problems it has raised. As such, I hope that the noble Lord, Lord Clement-Jones, will agree not to press these amendments.
Can the Minister just recap? He said that he met the trust, then swiftly moved on without saying what solution he is proposing. Would he like to repeat that, or at least lift the veil slightly?
The point I was making was only that we have met with it and will continue to do so in order to identify the best possible way to keep that critical data safe.
The Minister is not suggesting a solution at the moment. Is it in the “too difficult” box?
I doubt that it will be too difficult, but identifying and implementing the correct solution is the goal that we are pursuing, alongside our colleagues at the National AIDS Trust.
I am sorry to keep interrogating the Minister, but that is quite an admission. The Minister says that there is a real problem, which is under discussion with the National AIDS Trust. At the moment the Government are proposing a significant amendment to both the GDPR and the DPA, and in this Committee they are not able to say that they have any kind of solution to the problem that has been identified. That is quite something.
I am not sure I accept that it is “quite something”, in the noble Lord’s words. As and when the appropriate solution emerges, we will bring it forward—no doubt between Committee and Report.
On Amendment 115, we share the noble Lords’ feelings on the importance of redress for data subjects. That is why the Secretary of State must already consider the arrangements for redress for data subjects when making a data bridge. There is already an obligation for the Secretary of State to consult the ICO on these regulations. Similarly, when considering whether the data protection test is met before making a transfer subject to appropriate safeguards using Article 46, the Government expect that data exporters will also give consideration to relevant enforceable data subject rights and effective legal remedies for data subjects.
Our rules mean that companies that transfer UK personal data must uphold the high data protection standards we expect in this country. Otherwise, they face action from the ICO, which has powers to conduct investigations, issue fines and compel companies to take corrective action if they fail to comply. We will continue to monitor and mitigate a wide range of data security risks, regardless of provenance. If there is evidence of threats to our data, we will not hesitate to take the necessary action to protect our national security.
My Lords, we heard from the two noble Lords some concrete examples of where those data breaches are already occurring, and it does not appear to me that appropriate action has been taken. There seems to be a mismatch between what the Minister is saying about the processes and the day-to-day reality of what is happening now. That is our concern, and it is not clear how the Government are going to address it.
My Lords, in a way the Minister is acknowledging that there is a watering down taking place, yet the Government seem fairly relaxed about these issues. If something happens, the Government will do something or other, or the commissioner will. But the Government are proposing to water down Article 45, and that is the essence of what we are all talking about here. We are not satisfied with the current position, and watering down Article 45 will make it even worse; there will be more Yandexes.
The Minister mentioned prosecutions and legal redress in the UK from international data transfer breaches. Can he share some examples of that, maybe by letter? I am not aware of that being something with a long precedent.
A number of important points were raised there. Yes, of course I will share—
I am sorry to interrupt my noble friend, but the point I made—this now follows on from other remarks—was that these requirements have been in place for a long time, and we are seeing abuses. Therefore, I was hoping that my noble friend would be able to offer changes in the Bill that would put more emphasis on dealing with these breaches. Otherwise, as has been said, we look as though we are going backwards, not forwards.
As I said, a number of important points were raised there. First, I would not categorise the changes to Article 45 as watering down—they are intended to better focus the work of the ICO. Secondly, the important points raised with respect to Amendment 115 are points primarily relating to enforcement, and I will write to noble Lords setting out examples of where that enforcement has happened. I stress that the ICO is, as noble Lords have mentioned, an independent regulator that conducts the enforcement of this itself. What was described—I cannot judge for sure—certainly sounded like completely illegal infringements on the data privacy of those subjects. I am happy to look further into that and to write to noble Lords.
Amendment 116 seeks to remove a power allowing the Secretary of State to make regulations recognising additional transfer mechanisms. This power is necessary for the Government to react quickly to global trends and to ensure that UK businesses trading internationally are not held back. Furthermore, before using this power, the Secretary of State must be satisfied that the transfer mechanism is capable of meeting the new Article 46 data protection test. They are also required to consult the Information Commissioner and such other persons as they consider appropriate. The affirmative resolution procedure will also ensure appropriate parliamentary scrutiny.
I reiterate that the UK Government’s assessment of the reforms in the Bill is that they are compatible with maintaining adequacy. We have been proactively engaging with the European Commission since the start of the Bill’s consultation process to ensure that it understands our reforms and that we have a positive, constructive relationship. Noble Lords will appreciate that it is important that officials have the ability to conduct candid discussions during the policy-making process. However, I would like to reassure noble Lords once again that the UK Government take the matter of retaining our adequacy decisions very seriously.
Finally, Amendment 130 pertains to EU exit transitional provisions in Schedule 21 to the Data Protection Act 2018, which provide that certain countries are currently deemed as adequate. These countries include the EU and EEA member states and those countries that the EU had found adequate at the time of the UK’s exit from the EU. Such countries are, and will continue to be, subject to ongoing monitoring. As is the case now, if the Secretary of State becomes aware of developments such as changes to legislation or specific practices that negatively impact data protection standards, the UK Government will engage with the relevant authorities and, where necessary, amend or revoke data bridge arrangements.
For these reasons, I hope noble Lords will not press their amendments.
My Lords, I thank the Minister for his response, but I am still absolutely baffled as to why the Government are doing what they are doing on Article 45. The Minister has not given any particular rationale. He has given a bit of a rationale for resisting the amendments, many of which try to make sure that Article 45 is fully effective, that these international transfers are properly scrutinised and that we remain data adequate.
By the way, I thought the noble Lord, Lord Kirkhope, made a splendid entry into our debate, so I hope that he stays on for a number of further amendments—what a début.
The only point on which I disagreed with the noble Lord, Lord Bethell—as the noble Baroness, Lady Jones, said—was when he said that this is a terrific Bill. It is a terrifying Bill, not a terrific one, as we have debated. There are so many worrying aspects—for example, that there is no solution yet for sensitive special category data and the whole issue of these contractual clauses. The Government seem almost to be saying that it is up to the companies to assess all this and whether a country in which they are doing business is data adequate. That cannot be right. They seem to be abrogating their responsibility for no good reason. What is the motive? Is it because they are so enthusiastic about transfer of data to other countries for business purposes that they are ignoring the rights of data subjects?
The Minister resisted describing this as watering down. Why get rid of the list of considerations that the Secretary of State needs to have so that they are just in the mix as something that may or may not be taken into consideration? In the existing article they are specified. It is quite a long list and the Government have chopped it back. What is the motive for that? It looks like data subjects’ rights are being curtailed. We were baffled by previous elements that the Government have introduced into the Bill, but this is probably the most baffling of all because of the real importance of this—its national security implications and the existing examples, such as Yandex, that we heard about from the noble Lord, Lord Kirkhope.
Of course we understand that there are nuances and that there is a difference between adequacy and equivalence. We have to be pragmatic sometimes, but the question of whether the countries to which data is being transferred are adequate must be based on principle. This seems to me a prime candidate for Report. I am sure we will come back to it, but in the meantime I beg leave to withdraw.
My Lords, the issue of access to data for researchers is very familiar to all those involved in debates on the Online Safety Bill, now an Act. The issue is relatively simple and I am not going to spell it out in great detail. I will leave it to others to give more concrete examples.
The issue is that in the tech industry, there is a vast amount of data about the effect of social media and the impact on consumers of the technologies, algorithms and content that are in circulation. But there is a blackout when it comes to academics, epidemiologists, journalists or even parliamentarians who are trying to have a dig around to understand what is happening. What is happening on extremism or child safety? What is happening with fraud or to our national security? What is the impact on children of hours and hours spent on YouTube, Facebook, Snapchat and all the other technologies that are now consuming billions of hours of our time?
In other walks of life, such as the finance and retail sectors, there are open platforms where regulators, researchers and even the public can have a peek at what is going on inside. This is not commercial access; instead, it is trying to understand the impact on society and individuals of these very important and influential technologies. That kind of transparency absolutely underpins trust in these systems. The data is essential to policy-making and the surveillance is key to security.
What I want to convey is a sense that there is a very straightforward solution to this. There is a precedent, already being rolled out in the EU, that creates a good framework. Amendment 135 has been thoroughly discussed with the department in previous debates on the Online Safety Bill, and I thank the Minister and the Secretary of State for a number of meetings with parliamentarians and civil society groups to go through it. The idea of creating a data access pathway that has attached to it a clear validation system that secures the independence and privacy of researchers is relatively straightforward. Oversight by the ICO is something that we all agree gives it a sense of credibility and straightforwardness.
I want to try to convey to the Minister the importance of moving on this, because it has been discussed over several years. The regulator is certainly a supporter of the principle: Melanie Dawes, the CEO of Ofcom, gave testimony to the Joint Committee on the draft Online Safety Bill in which she said it was one of the things she felt was weak about that Bill and that she would like to have seen strengthened. It was therefore disappointing that there was not a chance to do that then, but there is a chance to do it now.
During the passage of the Online Safety Act, the Minister also made commitments from the Dispatch Box about returning to this subject during the passage of this Bill, so it feels like a good moment to be discussing this. There are 40 impressive civic society groups that have written in clear terms about the need for this, so there is a wide body of opinion in support. One reason why it is so urgent that we get this measure in the Bill—and do not kick the can down the road—is that it is currently getting harder and harder for researchers, academics and scientists to look into the impact of the actions of our technology companies.
Twitter/X has withdrawn almost all access to the kind of data that makes this research possible. Facebook has announced that it will be stopping the support of CrowdTangle, the very important facility it had created, which had become a very useful tool. The feedback from the Meta live content library that is its theoretical replacement has not been very positive; it is a clunky and awkward tool to use. TikTok is a total black box and we have no idea what is going on in there; and the action by Elon Musk against the Center for Countering Digital Hate, which he pursued in the courts over its analysis of data, gives a sense of the very aggressive tone from tech companies towards researchers who are trying to do what is widely considered to be very important work.
My Lords, I support Amendment 135 in the name of the noble Lord, Lord Bethell, to which I have added my name. He set out our struggle during the passage of the Online Safety Bill, when we made several attempts to get something along these lines into the Bill. It is worth actually quoting the Minister, Paul Scully, who said at the Dispatch Box in the other place:
“we have made a commitment to explore this … further and report back to the House in due course on whether further measures to support researcher access to data are required and, if so, whether they could also be implemented through other legislation such as the Data Protection and Digital Information Bill”.—[Official Report, Commons, 12/9/23; col. 806.]
When the Minister responds, perhaps he could update the House on that commitment and explain why the Government decided not to address it in the Bill. Although the Bill proposes a lessening of the protections on the use of personal data for research done by commercial companies, including the development of products and marketing, it does nothing to enable public interest research.
I would like to add to the list that the noble Lord, Lord Bethell, started, because as well as Melanie Dawes, the CEO of Ofcom, so too the United States National Academy of Sciences, the Lancet commission, the UN advisory body on AI, the US Surgeon General, the Broadband Commission and the Australian eSafety Commissioner have all in the last few months called for greater access to independent research.
I ask the noble Viscount to explain the Government’s thinking in detail, and I really do hope that we do not get more “wait and see”, because it does not meet the need. We have already passed online safety legislation that requires evidence, and by denying access to independent researchers, we have a perverse situation in which the regulator has to turn to the companies it is regulating for the evidence to create their codes, which, as the noble Viscount will appreciate, is a formula for the tech companies to control the flow of evidence and unduly temper the intent of the legislation. I wish to make most of my remarks on that subject.
In Ofcom’s consultation on its illegal harms code, the disparity between the harms identified and Ofcom’s proposed code caused deep concern. Volume 4 states the following at paragraph 14.12 in relation to content moderation:
“We are not proposing to recommend some measures which may be effective in reducing risks of harm. This is principally due to currently limited evidence”.
Further reading of volume 4 confirms that the lack of evidence is the given reason for failing to recommend measures across a number of harms. Ofcom has identified harms for which it does not require mitigation. This is not what Parliament intended and spectacularly fails to deliver on the promises made by Ministers. Ofcom can use its information-gathering powers to build evidence on the efficacy required to take a bolder approach to measures but, although that is welcome, it is unsatisfactory for many reasons.
First, given the interconnectedness between privacy, safety, security and competition, regulatory standards cannot be developed in silo. We have a thriving academic community that can work across different risks and identify solutions across different parts of the tech ecosystem.
Secondly, a regulatory framework in which standards are determined exclusively through private dialogue between the regulator and the regulated does not have the necessary transparency and accountability to win public trust.
Thirdly, regulators are overstretched and under-resourced. Our academics stand ready and willing to work in the public interest and in accordance with the highest ethical standards in order to scrutinise and understand the data held so very closely by tech companies, but they need a legal basis to demand access.
Fourthly, if we are to maintain our academic institutions in a post-Brexit world, we need to offer UK academics the same support as those in Europe. Article 40(4) of the European Union’s Digital Services Act requires platforms to
“provide access to data to vetted researchers”
seeking to carry out
“research that contributes to the detection, identification and understanding of systemic risks in the Union, as set out pursuant to Article 34(1), and to the assessment of the adequacy, efficiency and impacts of the risk mitigation measures pursuant to Article 35”.
It will be a considerable loss to the UK academic sector if its European colleagues have access to data that it does not.
Fifthly, by insisting on evidence but not creating a critical pathway to secure it, the Government have created a situation in which the lack of evidence could mean that Ofcom’s codes are fixed at what the tech companies tell it is possible in spring 2024, and will always be backward-looking. There is considerable whistleblower evidence revealing measures that the companies could have taken but chose not to.
I have considerable personal experience of this. For example, it was nearly a decade ago that I told Facebook that direct messaging on children’s accounts was dangerous, yet only now are we beginning to see regulation reflecting that blindingly obvious fact. That is nearly a decade in which something could have been done by the company but was not, and of which the regulator will have no evidence.
Finally, as we discussed on day one in Committee, the Government have made it easier for commercial companies to use personal data for research by lowering the bar for the collection of data and expanding the concept of research, further building the asymmetry that has been mentioned in every group of amendments we have debated thus far. It may not be very parliamentary language, but it is crazy to pass legislation and then obstruct its implementation by insisting on evidence that you have made it impossible to gather.
I would be grateful if the Minister could answer the following questions when he responds. Is it the Government’s intention that Ofcom codes be based entirely on the current practice of tech companies and that the regulator can demand only mitigations that exist currently, as evidenced by those companies? Do the Government agree that whistleblowers, NGO experts and evidence from user experience can be taken by regulators as evidence of what could or should be done? What route do the Government advise Ofcom to take to mitigate identified risks for which there are no current measures in place? For example, should Ofcom describe the required outcome and leave it to the companies to determine how they mitigate the risk, should it suggest mitigations that have been developed but not tried—or is the real outcome of the OSA to identify risk and leave that risk in place?
Do the Government accept that EU research done under the auspices of the DSA should be automatically considered as an adequate basis for UK regulators where the concerns overlap with UK law? Will the new measures announced for testing and sandboxing of AI models allow for independent research, in which academics, independent of government or tech, will have access to data? Finally, what measures will the Government take to mitigate the impact on universities of a brain drain of academics to Europe, if we do not provide equivalent legislative support to enable them to access the data required to study online safety and privacy? If the Minister is unable to answer me from the Dispatch Box, perhaps he will agree to write to me and place his letter in the Library for other noble Lords to read.
My Lords, there is little for me to say. The noble Lord, Lord Bethell, and the noble Baroness, Lady Kidron, have left no stone unturned in this debate. They introduced this amendment superbly, and I pay tribute to them and to Reset, which was with us all the way through the discussions on online harms at the Joint Committee on the draft Online Safety Bill, advocating for these important provisions.
As the noble Lord, Lord Bethell, said, there is a strong body of opinion out there. Insight from what might be called approved independent researchers would enable policy-making and regulatory innovation to keep pace with emerging trends and threats, which can span individual harms, matters of public safety and even national security. We have seen the kinds of harms taking place in social media, and it is absolutely vital that we understand what is happening under the bonnet of social media. It is crucial in detecting, identifying and understanding the systemic risks of online harms and non-compliance with law.
When we discussed the Online Safety Bill, it was a question of not just content but functionality. That was one of the key things. An awful lot of this research relates to that: how algorithms operate in amplifying content and some of the harms taking place on social media. The noble Lord, Lord Bethell, referred to X closing its API for researchers and Meta’s move to shut CrowdTangle. We are going into reverse, whereas we should be moving forward in a much more positive way. When the Online Safety Bill was discussed, we got the review from Ofcom, but we did not get the backup—the legislative power for Ofcom or the ICO to be able to authorise and accredit researchers to carry out the necessary research.
The Government’s response to date has been extremely disappointing, given the history behind this and the pressure and importance of this issue. This dates from discussions some way back, even before the Joint Committee met and heard the case for this kind of researcher access. This Bill is now the best vehicle by which to introduce a proper regime on access for researchers. As the noble Baroness, Lady Kidron, asked, why, having had ministerial assurances, are we not seeing further progress? Are we just going to wait until Ofcom produces its review, which will be at the tail end of a huge programme of work which it has to carry out in order to implement the Online Safety Act?
My Lords, I am grateful to the noble Lord, Lord Bethell, and his cosignatories for bringing this comprehensive amendment before us this afternoon. As we have heard, this is an issue that was debated at length in the Online Safety Act. It is, in effect, unfinished business. I pay tribute to the noble Lords who shepherded that Bill through the House so effectively. It is important that we tie up the ends of all the issues. The noble Lord made significant progress, but those issues that remain unresolved come, quite rightly, before us now, and this Bill is an appropriate vehicle for resolving those outstanding issues.
As has been said, the heart of the problem is that tech companies are hugely protective of the data they hold. They are reluctant to share it or to give any insight on how their data is farmed and stored. They get to decide what access is given, even when there are potentially illegal consequences, and they get to judge the risk levels of their actions without any independent oversight.
During the course of the Online Safety Bill, the issue was raised not only by noble Lords but by a range of respected academics and organisations representing civil society. They supported the cross-party initiative from Peers calling for more independent research, democratic oversight and accountability into online safety issues. In particular, as we have heard, colleagues identified a real need for approved researchers to check the risks of non-compliance in the regulated sectors of UK law by large tech companies—particularly those with large numbers of children accessing the services. This arose because of the increasing anecdotal evidence that children’s rights were being ignored or exploited. The noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, have given an excellent exposition of the potential and real harms that continue to be identified by the lack of regulatory action on these issues.
Like other noble Lords, I welcome this amendment. It is well-crafted, takes a holistic approach to the problem, makes the responsibilities of the large tech companies clear and establishes a systematic research base of vetted researchers to check compliance. It also creates important criteria for the authorisation of those vetted researchers: the research must be in the public interest, must be transparent, must be carried out by respected researchers, and must be free from commercial interests so that companies cannot mark their own homework. As has been said, it mirrors the provisions in the EU Digital Services Act and ensures comparable research opportunities. That is an opportunity for the UK to maintain its status as one of the top places in the world for expertise on the impact of online harms.
Since the Online Safety Act was passed, the Information Commissioner has been carrying out further work on the children’s code of practice. The latest update report says:
“There has been significant progress and many organisations have started to assess and mitigate the potential privacy risks to children on their platforms”.
That is all well and good but the ICO and other regulators are still reliant on the information provided by the tech companies on how their data is used and stored and how they mitigate risk. Their responsibilities would be made much easier if they had access to properly approved and vetted independent research information that could inform their decisions.
I am grateful to noble Lords for tabling this amendment. I hope that the Minister hears its urgency and necessity and that he can assure us that the Government intend to table a similar amendment on Report—as the noble Baroness, Lady Kidron, said, no more “wait and see”. The time has come to stop talking about this issue and take action. Like the noble Lord, Lord Clement-Jones, I was in awe of the questions that the noble Baroness came up with and do not envy the Minister in trying to answer them all. She asked whether, if necessary, it could be done via a letter but I think that the time has come on this and some other issues to roll up our sleeves, get round the table and thrash it out. We have waited too long for a solution and I am not sure that exchanges of letters will progress this in the way we would hope. I hope that the Minister will agree to convene some meetings of interested parties—maybe then we will make some real progress.
My Lords, as ever, many thanks to all noble Lords who spoke in the debate.
Amendment 135, tabled by my noble friend Lord Bethell, would enable researchers to access data from data controllers and processors in relation to systemic risks to the UK and non-compliance with regulatory law. The regime would be overseen by the ICO. Let me take this opportunity to thank both my noble friend for the ongoing discussions we have had and the honourable Members in the other place who are also interested in this measure.
Following debates during the passage of the Online Safety Act, the Government have been undertaking further work in relation to access to data for online safety researchers. This work is ongoing and, as my noble friend Lord Bethell will be aware, the Government are having ongoing conversations on this issue. As he knows, the online safety regime is very broad and covers issues that have an impact on national security and fraud. I intend to write to the Committee with an update on this matter, setting out our progress ahead of Report, which should move us forward.
While we recognise the benefits of improving researchers’ access to data—for example, using data to better understand the impact of social media on users—this is a highly complex issue with several risks that are not currently well understood. Further analysis has reiterated the complexities of the issue. My noble friend will agree that it is vital that we get this right and that any policy interventions are grounded in the evidence base. For example, there are risks in relation to personal data protection, user consent and the disclosure of commercially sensitive information. Introducing a framework to give researchers access to data without better understanding these risks could have significant consequences for data security and commercially sensitive information, and could potentially destabilise any data access regime as it is implemented.
In the meantime, the Online Safety Act will improve the information available to researchers by empowering Ofcom to require major providers to publish a broad range of online safety information through annual transparency reports. Ofcom will also be able to appoint a skilled person to undertake a report to assess compliance or to develop its understanding of the risk of non-compliance and how to mitigate it. This may include the appointment of independent researchers as skilled persons. Further, Ofcom is required to conduct research into online harms and has the power to require companies to provide information to support this research activity.
Moving on to the amendment specifically, it is significantly broader than online safety and the EU’s parallel Digital Services Act regime. Any data controllers and processors would be in scope if they have more than 1 million UK users or customers, if there is a large concentration of child users or if the service is high-risk. This would include not just social media platforms but any organisation, including those in financial services, broadcasting and telecoms as well as any other large businesses. Although we are carefully considering international approaches to this issue, it is worth noting that much of the detail about how the data access provisions in the Digital Services Act will work in practice is yet to be determined. Any policy interventions in this space should be predicated on a robust evidence base, which we are in the process of developing.
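To make that breadth concrete, the scope test as described might be sketched as follows. The one-million-user threshold comes from the debate; the field names, and the treatment of “large concentration of child users” and “high-risk” as simple flags, are illustrative assumptions rather than the amendment’s drafting:

```python
# Illustrative sketch of the scope criteria described above.
# Assumption: any ONE criterion suffices to bring a service into scope.
from dataclasses import dataclass

@dataclass
class Service:
    uk_users: int                    # UK users or customers
    high_child_concentration: bool   # "large concentration of child users"
    high_risk: bool                  # "the service is high-risk"

def in_scope(service: Service) -> bool:
    """Return True if the service would fall within the amendment's scope,
    on the reading given in the debate."""
    return (
        service.uk_users > 1_000_000
        or service.high_child_concentration
        or service.high_risk
    )

# Example: a mid-sized service with many child users is in scope
# even though it is below the one-million-user threshold.
print(in_scope(Service(uk_users=250_000,
                       high_child_concentration=True,
                       high_risk=False)))  # True
```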
The amendment would also enable researchers to access data to research systemic risks to compliance with any UK regulatory law that is upheld by the ICO, Ofcom, the Competition and Markets Authority, and the Financial Conduct Authority. The benefits and risks of such a broad regime are not understood and are likely to vary across sectors. It is also likely to be inappropriate for the ICO to be the sole regulator tasked with vetting researchers across the remits of the other regulators. The ICO may not have the necessary expertise to make this determination about areas of law that it does not regulate.
Ofcom already has the power to gather information that it requires for the purpose of exercising its online safety functions. This power applies to companies in scope of the duties and, where necessary, to other organisations or persons who may have relevant information. Ofcom can also issue information request notices to overseas companies as well as to UK-based companies. The amendment is also not clear about the different types of information that a researcher may want to access. It refers to data controllers and processors—concepts that relate to the processing of personal data under data protection law—yet researchers may also be interested in other kinds of data, such as information about a service’s systems and processes.
Although the Government continue to consider this issue—I look forward to setting out our progress between now and Report—for the reasons I have set out, I am not able to accept this amendment. I will certainly write to the Committee on this matter and to the noble Baroness, Lady Kidron, with a more detailed response to her questions—there were more than four of them, I think—in particular those about Ofcom.
Perhaps I could encourage the Minister to say at least whether he is concerned that a lack of evidence might be impacting on the codes and powers that we have given to Ofcom in order to create the regime. I share his slight regret that Ofcom does not have this provision that is in front of us. It may be that more than one regulator needs access to research data but it is the independents that we are talking about. We are not talking about Ofcom doing things and the ICO doing things. We are talking about independent researchers doing things so that the evidence exists. I would like to hear just a little concern that the regime is suffering from a lack of evidence.
I am thinking very carefully about how best to answer. Yes, I do share that concern. I will set this out in more detail when I write to the noble Baroness and will place that letter in the House of Lords Library. In the meantime, I hope that my noble friend will withdraw his amendment.
I am enormously grateful to the Minister for his response. However, it falls short of my hopes. Obviously, I have not seen the letter that he is going to send us, but I hope that the department will have taken on board the commitments made by previous Ministers during discussions on the Online Safety Bill and the very clear evidence that the situation is getting worse, not better.
Any hope that the tech companies would somehow have heard the debate in the House of Lords and that it would have occurred to them that they needed to step up to their responsibilities has, I am afraid, been dashed by their behaviours in the last 18 months. We have seen a serious withdrawal of existing data-sharing provisions. As we approach even more use of AI, the excitement of the metaverse, a massive escalation in the amount of data and the impact of their technologies on society, it is extremely sobering to think that there is almost no access to the black box of their data.
That was a very good conclusion to the response from the noble Lord, Lord Bethell—urging a Minister to lean in. I have not heard that expression used in the House before, but it is excellent because, faced with a Home Office Minister, I am sure that is the kind of behaviour that we can expect imminently.
Last time we debated issues relating to national security and data protection, the noble Lord, Lord Ashton, was the responsible Minister and I had the support of the noble Lord, Lord Paddick. Now I have the Minister all to myself on Amendments 135A to 135E and the stand part notices on Clauses 28 to 30. These Benches believe that, as drafted, these clauses fall foul of the UK’s obligations under the ECHR, because they give the Home Secretary too broad a discretion and do not create sufficient safeguards to prevent their misuse.
Under the case law of the European Court of Human Rights, laws that give unfettered or overly broad discretion to the Government to interfere with privacy will violate the convention, because the laws must be sufficiently specific to prevent abuses of power. This means they must make sure that, any time they interfere with the privacy of people in the UK, they obey the law, have a goal that is legitimate in a democratic society and do only what is truly necessary to achieve that goal. The court has repeatedly stressed that this is what the rule of law means; it is an essential principle of democracy.
Despite multiple requests from MPs, and from Rights and Security International in particular, the Government have also failed to explain why they believe that these clauses are necessary to safeguard national security. So far, they have explained only why these new powers would be “helpful” or would ensure “greater efficiency”. Those justifications do not meet the standard that the ECHR requires when the Government want to interfere with our privacy. They are not entitled to do just anything that they find helpful.
Under Clause 28(7), the Home Secretary would be able to issue a national security certificate to tell the police that they do not need to comply with many important data protection laws and rules that they would otherwise have to obey. For instance, a national security certificate would give the police immunity when they commit crimes by using personal data illegally. It would also exempt them from certain provisions of the Freedom of Information Act 2000. The Bill would expand what counts as an intelligence service for the purposes of data protection law—again, at the Home Secretary's wish. Clause 29 would allow the Home Secretary to issue a designation notice allowing law enforcement bodies, whenever they collaborate with the security services, to take advantage of the more relaxed rules in the Data Protection Act 2018 that are otherwise designed for the intelligence agencies.
Both the amended approach to national security certificates and the new designation notice regime would be unaccountable. The courts would not be able to review what the Government are doing and Parliament might therefore never find out. National security certificates are unchallengeable before the courts, meaning that the police and the Home Secretary would be unaccountable if they abused those powers. If the Home Secretary says that the police need to use these increased—and, in our view, unnecessary—powers in relation to national security, his word will be final. This includes the power to commit crimes.
As regards designation notices, the Home Secretary is responsible for approving and reviewing their use. Only a person who is directly affected by a designation notice will be able to challenge it, yet the Home Secretary would have the power to keep the notice secret, in which case how could anybody know that the police had been snooping on their lives under this law?
Clauses 28 to 30 could, in our view, further violate the UK’s obligations under the Human Rights Act 1998 and the European Convention on Human Rights because they remove the courts’ role in reviewing how the Government use their surveillance power. The European Court of Human Rights has ruled in the past that large aspects of the law previously governing the UK’s surveillance powers were unlawful because they gave the Government too much discretion and lacked important safeguards to prevent misuse. Clauses 28 to 30 could be challenged on similar grounds, and the court has shown that it is willing to rule on these issues. These weaknesses in the law could also harm important relationships that the UK has with the EU as regards data adequacy, a subject that we will no doubt discuss in further depth later this week.
The Government argue that the clauses create a simplified legal framework that would improve the efficiency of police operations when working with the intelligence services. This is far from meeting the necessity standard under the ECHR.
The Government have frequently used the Fishmongers’ Hall and Manchester Arena attacks to support the idea that Clauses 28 to 30 are desirable. However, a difference in data protection regimes was not the issue in either case; instead, the problem centred around failures in offender management, along with a lack of communication between the intelligence services and local police. The Government have not explained how Clauses 28 to 30 would have prevented either incident or why they think these clauses are necessary to prevent whatever forms of violence the Government regard as most likely to occur in the future. The Government have had sufficient opportunity to date to explain the rationale for these clauses, yet they have so far failed to do so. For these reasons, we are of the view that Clauses 28 to 30 should not stand part of the Bill.
However, it is also worth putting down amendments to try to tease out additional aspects of these clauses, so Amendments 135A and 135D would put proportionality back in. It is not clear why the word “proportionality” has been taken out of the existing legislation. Similarly, Amendment 135B attempts to put back in the principles that should underpin decisions. Those are the most troubling changes, since they seem to allow for departure from basic data protection principles. These were the principles that the Government, during the passage of the Data Protection Act 2018, assured Parliament would always be secure. The noble Lord, Lord Ashton of Hyde, said:
“People will always have the right to ensure that the data held about them is fair and accurate, and consistent with the data protection principles”.—[Official Report, 10/10/17; col. 126.]
Thirdly, on the introduction of oversight by a judicial commissioner for Clause 28 certificates, now seems a good time to do that. During the passage of the Data Protection Act through Parliament, there was much debate over the Part 2 national security exemption for general processing in Section 26 and the national security certificates in Section 27. We expressed concern then but, sadly, the judicial commissioner role was not included. This is a timely moment to suggest that again.
Finally, on increasing the oversight of the Information Commissioner under Amendment 135E, I hope that this will be an opportunity for the Minister, despite the fact that I would prefer to see Clauses 28 to 30 not form part of the Bill, to explain in greater detail why they are constructed in the way they are and why the Home Office believes that it needs to amend the legislation in the way it proposes. I beg to move.
My Lords, I come to this topic rather late and without the star quality in this area that has today been attributed to the noble Lord, Lord Kirkhope. I acknowledge both the work of Justice in helping me to understand what Clause 28 does and the work of the noble Lord, Lord Clement-Jones, in formulating the probing amendments in this group. I echo his questions on Clause 28. I will focus on a few specific matters.
First, what is the difference between the existing formulation for restricting data protection rights “when necessary and proportionate” to protect national security and the new formulation,
“when required to safeguard national security”?
What is the purpose of that change? Does “required” mean the same as “necessary” or something different? Do the restrictions not need to be proportionate any more? If so, why? Could we have a practical example of what the change is likely to mean in practice?
Secondly, why is it necessary to expand the number of rights and obligations from which competent law enforcement authorities can be exempted for reasons of national security? I can understand why it may for national security reasons be necessary to restrict a person's right to be informed, right of access to data or right to be notified of a data breach, as under the existing law, but Clause 28 would allow the disapplication of some very basic principles of data protection law—including, as I understand it, the right to have your data processed only for a specified, explicit and legitimate purpose, as well as the right not to have decisions made about you by solely automated methods.
Thirdly, as the noble Lord, Lord Clement-Jones, asked, why is it necessary to remove the powers of the Information Commissioner to investigate, to enter and inspect, and, where necessary, to issue notices? I appreciate that certificates will remain appealable to the Upper Tribunal by the person directly affected, applying judicial review principles, but that is surely not a substitute for review by the skilled and experienced ICO. Apart from anything else, the subject is unlikely even to know that they have been affected by the provisions, given that a certificate would exempt law enforcement from having to provide information to them. That is precisely why the oversight of a commissioner in the national security area is so important.
As for Clauses 29 and 30, I am as keen as anybody to improve the capabilities for the joint processing of data by the police and intelligence agencies. That was a major theme of the learning points from the London and Manchester attacks of 2017, which I helped to formulate in that year and on which I reported publicly in 2019. A joint processing regime certainly sounds like a good idea in principle but I would be grateful if the Minister could confirm which law enforcement competent authorities will be subject to this new regime. Are they limited to Counter Terrorism Policing and the National Crime Agency?
My Lords, we have heard some fine words from the noble Lord, Lord Clement-Jones, in putting the case for his Amendments 135A, 135B, 135C and 135D, which are grouped with the clause stand part debates. As he explained, they seek to test and probe why the Government have sought to extend the ability of the security and intelligence services to disapply basic data protection principles.
As well as disapplying current provisions, the new Government-drafted clause essentially disapplies the rights of data subjects and the obligations placed on competent authorities and processors. The Explanatory Notes say that this is to create a regime that
“ensures that there is consistency in approach”.
Clause 29 is designed to facilitate joint processing by the various agencies with a common regime. Like the noble Lord, Lord Anderson, I well understand why they might want to do that. The noble Lord, Lord Clement-Jones, has done the Committee a service in tabling these amendments because, as he said, during the passage of the 2018 Act assurances were given that law enforcement would always abide by basic data protection principles. On the face of it, that assurance no longer applies. Is this because it is inconvenient for the security and intelligence services? What are the Government seeking to do here?
Can the Minister explain from the Government’s perspective what has changed since 2018 that has led Ministers to conclude that those critical principles should be compromised? The amendments also seek to assert the importance of proportionality considerations when deciding whether national security exemptions apply. This principle is again raised in relation to the issuing of a national security certificate.
The noble Lord, Lord Clement-Jones, with Amendment 135E effectively poses the question of where the balance of oversight should rest. Should it be with the Secretary of State or the commissioner? All that new Clause 29 does is oblige the Secretary of State to consult the commissioner with the expectation that the commissioner then makes public a record of designation orders. However, it strips out quite a lot of the commissioner’s current roles and responsibilities. We should surely have something more convincing than that to guarantee transparency in the process. We on these Benches will take some convincing that the Government have got the right balance in regard to the interests of national security and the security services. Why, for instance, is Parliament being sidelined in the exercise of the Secretary of State’s powers? Did Ministers give any consideration to reporting duties and obligations so far as Parliament is concerned? If not, why not?
Labour does not want to see national security compromised in any way, nor do we want to undermine the essential and vital work that our intelligence services have to perform to protect us all. However, we must also ensure that we build confidence in our security and intelligence services by making them properly accountable, as the noble Lord, Lord Clement-Jones, argued, and that the checks and balances are sufficient and the right ones.
The noble Lord, Lord Anderson, got it right in questioning the change of language, and I want to better understand from the Minister what that really means. But why extend the range of exemptions? We could do with some specific reasons why that change is being made. Why has the Information Commissioner's role been so fundamentally changed with regard to these clauses and the exemptions?
We will, as always, listen carefully to the Minister’s reply before we give further thought to this framework on Report, but we are very unhappy with the changes that are taking away some of the fundamental protections that were in place before, and we will need quite a lot of convincing on these government changes.
My Lords, I thank the noble Lord, Lord Clement-Jones, for his amendments and thank the other noble Lords who spoke in this short debate. These amendments seek to remove Clauses 28, 29 and 30 in their entirety, or, as an alternative, to make amendments to Clauses 28 and 29. I will first speak to Clause 28, and if I fail to answer any questions I will of course guarantee to write.
Clause 28 replaces the current provision under the law enforcement regime for the protection of national security data with a revised version that mirrors the existing exemptions available to organisations operating under the UK GDPR and intelligence services regimes. It is also similar to what was available to law enforcement agencies under the 1998 Data Protection Act. It is essential that law enforcement agencies can properly protect data where required for national security reasons, and they should certainly be able to apply the same protections that are available to other organisations.
The noble Lord, Lord Clement-Jones, asked whether the exemption was in breach of a person’s Article 8 rights, but the national security exemption will permit law enforcement agencies to apply an exemption to the need to comply with certain parts of the law enforcement data protection regime, such as the data protection principles or the rights of the data subject. It is not a blanket exemption and it will be able to be applied only where this is required for the purposes of safeguarding national security—for instance, in order to prevent the tipping-off of a terror suspect. It can be applied only on a case-by-case basis. We do not, therefore, believe that the exemption breaches the right to privacy.
In terms of the Government taking away the right to lodge a complaint with the commissioner, that is not the case—the Government are not removing that right. Those rights are being consolidated under Clause 44 of this DPDI Bill. We are omitting Article 77 as Clause 44 will introduce provisions that allow a data subject to lodge a complaint with a controller.
In terms of how data subjects themselves will know how to complain to the Information Commissioner, all organisations, including law enforcement agencies, are required to provide certain information to individuals who suspect that their personal information is being processed unlawfully, including their right to make a complaint to the Information Commissioner and, where applicable, the contact details of the organisation's data protection officer or, in line with other amendments under the Bill, the organisation's senior responsible individual.
Amendments 135A and 135D seek to introduce a proportionality test in relation to the application of the national security exemption and the issuing of a ministerial certificate for law enforcement agencies operating under Part 3 of the Data Protection Act. The approach we propose is consistent with the similar exemptions for the UK GDPR and intelligence services, which all require a controller to evaluate on a case-by-case basis whether an exemption from a provision is required for the purpose of safeguarding national security.
Amendment 135B would remove the ability for law enforcement agencies to apply the national security exemption to data protection principles, whereas the approach we propose is consistent with the other data protection regimes and will provide for exemption from the data protection principles in Chapter 2—where required and on a case-by-case basis—but not from the requirement for processing to be lawful and the safeguards which apply to sensitive data.
The ability to disapply certain principles laid out in Chapter 2 is crucial for the efficacy of the national security exemption. This is evident in the exemptions in the UK GDPR and Part 4, which disapply similar principles. To remove the ability to apply the national security exemption to any of the data protection principles for law enforcement agencies only would undermine their ability to offer the same protections as those processing under the other data protection regimes.
Not all the principles laid out in Chapter 2 can be exempted from; for example, law enforcement agencies are still required to ensure that all processing is lawful and cannot exempt from the safeguards that apply to sensitive data. There are safeguards in place to ensure that the exemption is used correctly by law enforcement agencies. Where a data subject feels that the national security exemption has not been applied correctly, the legislation allows them to complain to the Information Commissioner and, ultimately, to the courts. Additionally, the reforms require law enforcement agencies to appoint a senior responsible individual whose tasks include monitoring compliance with the legislation.
Amendment 135C would make it a mandatory requirement for a certificate to be sought from and approved by a judicial commissioner whenever the national security exemption is to be invoked by law enforcement agencies only. This bureaucratic process does not apply to organisations processing under the other data protection regimes; forcing law enforcement agencies to apply for a certificate every time they need to apply the exemption would be unworkable as it would remove their ability to act quickly in relation to matters of national security. For these reasons, I hope that the noble Lord, Lord Clement-Jones, will not press his amendments.
On Clauses 29 and 30 of the Bill, currently only the intelligence services can operate under Part 4 of the Data Protection Act. This means that, even when working together, the intelligence services and law enforcement cannot work on a single shared dataset but must instead transfer data back and forth, applying the provisions of their applicable data protection regimes, which creates significant friction. Removing barriers to joint working was flagged as a recommendation following the Manchester Arena inquiry, as the noble Lord, Lord Anderson, noted, and following Fishmongers' Hall, where closer working was also recommended.
Clauses 29 and 30 enable qualifying competent authorities and an intelligence service jointly to process data under a single data protection regime in authorised, specific circumstances to safeguard national security. In order to jointly process data in this manner, the Secretary of State must issue a designation notice to authorise it. A notice can be granted only if the Secretary of State is satisfied that the processing is required for the purpose of safeguarding national security and following consultation with the ICO.
Amendment 135E would make the ICO the final arbiter of whether a designation notice is granted by requiring it to—
May I just intrude on the Minister’s flow? As I understand it, there is a possibility that relatives of the families affected by the Manchester Arena bombing will take to court matters relating to the operation of the security services, including relating to intelligence that it is felt they may have had prior to the bombing. How will this new regime, as set out in the Bill, affect the rights of those who may seek to hold the security services to account in the courts? Will their legal advisers ever be able to discover materials that might otherwise be exempt from public view?
That is a very good question but the noble Lord will understand that I am somewhat reluctant to pontificate about a potential forthcoming court case. I cannot really answer the question, I am afraid.
But understanding the impact on people’s rights is important in the context of this legislation.
As I say, it is a good question but I cannot comment further on that one. I will see whether there is anything that we can commit to in writing and have a further chat about this subject but I will leave it for now, if I may.
Amendment 135E would make the ICO the final arbiter of whether a designation notice is granted by requiring it to judge whether the notice is required for the purposes of the safeguarding of national security. It would be wholly inappropriate for the ICO to act as a judge of national security; that is not a function of the ICO in its capacity as regulator and should be reserved to the Secretary of State. As is generally the case with decisions by public bodies, the decision of the Secretary of State to grant a designation notice can be challenged legally; this is expressly provided for under new Section 82E, as is proposed to be included in the DPA by Clause 29.
On the subject of how a data subject is supposed to exercise their rights if they do not know that their data is being processed under a notice subject to Part 4, the ICO will publish designation notices as soon as is reasonably practical. Privacy information notices will also be updated if necessary to enable data subjects to identify a single point of contact should they wish to exercise their rights in relation to data that might be processed under a designation notice. This single point of contact will ease the process of exercising their data rights.
The noble Lord, Lord Anderson, asked which law enforcement agencies this will apply to. That will be set out separately in the subsequent affirmative SI. I cannot be more precise than that at the moment.
For these reasons, I hope that the noble Lord, Lord Clement-Jones, will be prepared to withdraw his amendment.
The Minister left us on a tantalising note. He was unable to say whether the law enforcement organisations affected by these clauses will be limited to Counter Terrorism Policing and the NCA or whether they will include others as well. I am rather at a loss to think who else might be included. Do we really have to wait for the affirmative regulations before we can be told about that? It seems pretty important. As the Minister knows well, there are quite a few precedents—following some recent ones—for extending to those bodies some of the privileges and powers that attach to the intelligence agencies. I suspect that a number of noble Lords might be quite alarmed if they felt that those powers or privileges were being extended more widely—certainly without knowing, or at least having some idea, in advance to whom they might be extended.
While I am on my feet and causing mischief for the Minister, may I return to the rather lawyerly question that I put to him? I do not think I had an answer about the formulation in new Section 78A, which talks about an exemption applying
“if exemption from the provision is required for the purposes of safeguarding national security”.
What does “required” mean? Does it simply mean the same as “necessary”—in which case, why not stick with that? Or does it mean something else? Does it mean that someone has required or requested it? It could be a pretty significant difference and this is a pretty significant ambiguity in the Bill. If the Minister is not willing to explain it now, perhaps he will feel able to write to us to explain exactly what is meant by replacing the well-worn phrase “necessary and proportionate” with “required”.
I thank the noble Lord for that. It is a lawyerly question and, as he knows, I am not a lawyer. With respect, I will endeavour to write and clarify on that point, as well as on his other good point about the sorts of authorities that we are talking about.
Perhaps the same correspondence could cover the point I raised as well.
My Lords, I am immensely grateful to the noble Lords, Lord Anderson and Lord Bassam, for their interventions. In particular, given his background, if the noble Lord, Lord Anderson, has concerns about these clauses, we all ought to have concerns. I am grateful to the Minister for the extent of his unpacking—or attempted unpacking—of these clauses but I feel that we are on a slippery slope here. I feel some considerable unease about the widening of the disapplication of principles that we were assured were immutable only six years ago. I am worried about that.
We have had some reassurance about the right to transparency, although perhaps data subjects will find out what is happening only when it is convenient. The right to challenge was also mentioned by the Minister but he has not really answered the question about whether the Home Office has looked seriously at the implications as far as the human rights convention is concerned, which is the reason for the stand part notice. The Minister did not address that matter at all; I do not know why. I am assuming that the Home Office has looked at the clauses in the light of the convention but, again, he did not talk about that.
The only assurance the Minister has really given is that it is all on a case-by-case basis. I do not think that that is much of a reassurance. On the proportionality point made by the noble Lord, Lord Anderson, I think that we are going to be agog waiting for the Minister's correspondence on that, but it is such a basic issue. There were two amendments specifically on proportionality but we have not really had a reply on that issue at all, in terms of why it should have been eliminated by the legislation. So a feeling of unease prevails. I do not even feel that the Minister has fully unpacked the issue of joint working; I think that the noble Lord, Lord Anderson, did that more. We need to know more about how that will operate.
The final point that the Minister made gave even greater concern—to think that there will be an SI setting out the bodies that will have the powers. We are probably slightly wiser than when we started out with this group of amendments, but only slightly and we are considerably more concerned. In the meantime, I beg leave to withdraw the amendment.
My Lords, I shall speak to Amendment 137 in my name. I apologise to the Committee that I was unable to speak in the Second Reading debate on this Bill, which seems a long time ago now.
This is a discrete amendment designed to address an extremely burdensome and potentially unnecessary redaction exercise that arises when the police are preparing a case file for submission to the Crown Prosecution Service for a charging decision. The amendment was originally tabled in the House of Commons by Jane Hunt MP; both of us would like to thank the Police Federation of England and Wales for its assistance in briefing us and in preparing the draft clause.
Perhaps it would be helpful to say by way of background that the existing data protection legislation requires our police forces to spend huge amounts of time and resources, first, in going through the information that has been gathered by investigating officers to identify every single item of personal data contained in that information; secondly, in deciding whether it is necessary or, in many cases, strictly necessary for the CPS to consider each item of personal data when making a charging decision; and, thirdly, in redacting every item of personal data that does not meet this test. I ask noble Lords to imagine, with things such as body cams being worn by the police, how much personal data is being collected these days every time officers respond to incidents. The Police Federation and the National Police Chiefs' Council estimate that the national cost of this redaction exercise is approximately £5,642,900 per annum and that, since 1 January 2021, 365,000 policing hours have been consumed by it.
In his Budget last month, the Chancellor of the Exchequer asked for ideas to improve public sector productivity, so it will come as no surprise to the Minister that the Police Federation has rushed to submit this idea as one of those suggestions for how we might solve that productivity puzzle. I want to share one example of what this redaction requirement means in practice. This came from a detective constable in Suffolk who was attached to a regional crime unit. They said that the case they were involved with was
“a multi-million pound fraud offence from Suffolk with 115 victims. After a five year investigation two persons were charged (in Oct 2023) however, these charges would have been brought far sooner had the CPS not insisted that all used and unused material in the case be provided and redacted prior to the actual charges being brought. The redactions took six months to complete and at times both officers and civilian staff were deployed full time to accommodate”
this exercise. Due to the nature of the investigation, the victims in this case were elderly and some had, sadly, passed away over the years.
While the detective constable accepted that the investigation itself was lengthy, they
“were able to manage the expectations of the victims by providing routine updates on the progress of the case”.
However:
“It was more difficult to explain come early 2023 that documents in the case then had to be redacted before the CPS would allow us to charge the suspects. The fact that documents of varying sizes (some several pages in length) of the unused material had to be redacted prior to charge, when these documents may or may not be served and ultimately would be served secondary to the used items is difficult to understand for the officers let alone explaining this to victims who are losing interest and respect for both the Police and CPS. Anyone would question why we were spending time redacting documents that MAY NEVER be served. It is … easy to say redact everything! In turn the additional months redacting affected the court process, delaying that also. Victims are questioning whether they will be alive to see”
the conclusion of the process. While the delay was
“not solely down to the redaction demands, a more targeted redaction process after charge is more logical and cost effective for all”.
The redaction exercise is potentially unnecessary for any given case file because the CPS decides to charge in only approximately 75% of cases. In the 25% of cases where the CPS decides not to charge, the unredacted file could simply be deleted by the CPS. Where the CPS decides to charge, the case file could then be returned to the police to carry out the redaction exercise before there is any risk of the file being disclosed to any person or body other than the CPS.
The simple and practical solution, as the Police Federation has put forward, is for the police to carry out the redaction exercise in relation to any given case file only after the CPS has taken the decision to charge. I should be clear that what is being proposed here does not remove any substantive protection of the personal data in question. It does not remove the obligation to review and redact the personal data contained in material in a case file; it simply provides for that review and redaction to be conducted by the police after, rather than before, a charging decision has been made by the CPS.
The law enforcement directive on which the relevant part of the Data Protection Act 2018 was based would have permitted this when that Act was passed. Part 3 of the 2018 Act implemented that directive and makes provision for data processing by “competent authorities”, including police forces and the Crown Prosecution Service, for defined “law enforcement purposes”. However, although recital 4 to the law enforcement directive emphasised:
“The free flow of personal data between competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences … should be facilitated while ensuring a high level of protection of personal data”,
Part 3 of the 2018 Act contains no provisions at all to facilitate the free flow of personal data between the police and the CPS.
The effect of the proposed new clause as set out in this amendment would be, first, to exempt the police from complying with the first data protection principle—except in so far as that principle requires processing to be fair—and from the third data protection principle, when the police are undertaking processing that consists of preparing for submission and submitting to the CPS a case file seeking a charging decision. Secondly, the amendment would exempt the CPS from the first and third data principles to the same extent when it makes that charging decision. Thirdly, it would require the CPS to return the case file to the police if a decision to charge is made, after which the data protection principles will apply in full to any subsequent processing.
I appreciate—particularly with the Minister here—that the Home Office is really in the driving seat here. We understand that the Home Office objections to this amendment seem to boil down to the belief that it will only partially resolve the problem, because the legal requirements around sharing of data are broader than just the first and third data principles, and that there are other relevant provisions not addressed by this drafting. It is of course absolutely open to the Minister and the Home Office to say that they support the broad principles of this draft clause, while suggesting that the drafting of this particular amendment should identify some other relevant provisions, and it would be helpful if they did that rather than just objecting to the whole amendment as put forward.
My Lords, the noble Baroness, Lady Morgan, has done us a service by raising this issue. My question is about whether the advice given to date about redaction is accurate. I have not seen the Home Office’s guidance or counsel’s analysis. I have taken advice on the Police Federation’s case—I received an email and I was very interested in what it had to say, because we all want to make sure that the bureaucracy involved in charging and dealing with the CPS is as minimal as possible within the bounds of data protection law.
Section 35(2)(b) of the Data Protection Act simply requires the police to ensure that their processing is necessary for the performance of their tasks. You would have thought that sending an investigation file to the CPS to decide whether to charge a suspect seems necessary for the performance of that task. Some of that personal data may end up not being relevant to the charge or any trial, but that is a judgment for the CPS and the prosecutor. It does not mean, in the view of those I have consulted, that the file has to be redacted at vast taxpayer cost before the CPS or prosecutor have had a chance to see the investigation’s file. When you look at sensitive data, the test is “strictly necessary”, which is a higher test, but surely the answer to that must be that officers should collect this information only where they consider it relevant to the case. So this can be dealt with through protocols about data protection, which ensure that officers do not collect more sensitive data than is necessary for the purposes of the investigation.
Similarly, under Section 37, the question that the personal data must be adequate, relevant and not excessive in relation to the purpose for which it is processed should not be interpreted in such a way that this redaction exercise is required. If an officer thinks they need to collect the relevant information for the purpose of the investigation, that seems to me—and to those advising me—in broad terms to be sufficient to comply with the principle. Conversely, if officers are collecting too much data, the answer is that they should be trained to avoid doing this. If officers really are collecting more information than they should be, redactions cannot remedy the fact that the collection was unlawful in the first place. The solution seems to be to stop them collecting that data.
I assume—maybe I am completely wrong—that the Minister will utter “suitable guidance” in response to the noble Baroness’s amendment and say that there is no need to amend the legislation, but, if there is no need to do so, I hope that they revise the guidance, because the Police Federation and its members are clearly labouring under a misapprehension about the way the Act should be interpreted. It would be quite a serious matter if that has taken place for the last six years.
My Lords, we should be very grateful to the noble Baroness, Lady Morgan of Cotes, for her amendment. I listened very carefully to her line of argument and find much that we can support in the approach. In that context, we should also thank the Police Federation of England and Wales for a particularly useful and enlightening briefing paper.
We may well be suffering under the law of unintended consequences in this context; it seems to have hit quite hard and acted as a barrier to the sensible processing and transfer of data between two parts of the law enforcement machinery. It is quite interesting coming off the back of the previous debate, when we were discussing making the transfer of information and intelligence between different agencies easier and having a common approach. It is a very relevant discussion to have.
I do not think that the legislation, when it was originally drafted, could ever have been intended to work in the way the Police Federation has set out. The implementation of the Data Protection Act 2018, in so far as law enforcement agencies are concerned, is supposed to be guided by recital 4, which the noble Baroness read into the record and which makes good sense.
As the noble Baroness explained, the Police Federation's argument seems quite a sensible approach: the DPA makes no provision at all designed to facilitate, in effect, the free flow of information; the CPS should be able to hold all the relevant data prior to making the charging decision; and redaction should take place only after a decision on charging has been made. As she argued, this would significantly lighten the burden on police investigating teams and enable the decision on charging to be more broadly informed.
So this is a piece of simplification that we can all support. The case has been made very well. If it helps speed up charging and policing processes, which I know the Government are very concerned about, as all Governments should be, it seems a sensible move—but this is the Home Office. We do not always expect the most sensible things to be delivered by that department, but we hope that they are.
I thank all noble Lords for their contributions—I think. I thank my noble friend Lady Morgan of Cotes for her amendment and for raising what is an important issue. Amendment 137 seeks to permit the police and the Crown Prosecution Service to share unredacted data with one another when making a charging decision. Perhaps to the surprise of the noble Lord, Lord Bassam, we agree: we must reduce the burden of redaction on the police. As my noble friend noted, this is very substantial and costly.
We welcome the intent of the amendment. However, as my noble friend has noted, we do not believe that, as drafted, it would achieve the stated aim. Fully removing the redaction burden would require amending more than just the Data Protection Act.
The Government are, however, committed to reducing the burden on the police, but it is important that we get it right and that the solution is comprehensive. We consider that the objective which my noble friend is seeking would be better achieved through other means, including improved technology and new, simplified guidance to prevent over-redaction, as all speakers, including the noble Lord, Lord Clement-Jones, noted.
The Home Office provided £960,000 of funding for text and audio-visual multimedia redaction in the 2023-24 financial year. Thanks to that funding, police forces have been able to procure automated text redaction tools, the trials of which have demonstrated that they could save up to 80% of the time spent by the police on this redaction. Furthermore, in the latest Budget, the Chancellor announced an additional £230 million of funding for technology to boost police productivity. This will be used to develop, test and roll out automated audio-visual redaction tools, saving thousands more hours of police time. I would say to my noble friend that, as the technology improves, we hope that the need for it to be supervised by individuals will diminish.
I can also tell your Lordships’ House that officials from the Home Office have consulted with the Information Commissioner’s Office and have agreed that a significant proportion of the burden caused by existing pre-charge redaction processes could be reduced safely and lawfully within the current data protection framework in a way that will maintain standards and protections for individuals. We are, therefore, actively working to tackle this issue in the most appropriate way by exploring how we can significantly reduce the redaction burden at the pre-charge stage through process change within the existing legislative framework. This will involve creating simplified guidance and, obviously, the use of better technology.
Is the Minister almost agreeing with some of my analysis in that case?
No, I think I was agreeing with my noble friend’s analysis.
I thank all noble Lords for their contributions. We acknowledge this particular problem and we are working to fix it. I would ask my noble friend to withdraw her amendment.
My Lords, I thank my noble friend the Minister for his response. I also thank the noble Lords, Lord Clement-Jones and Lord Bassam, for their support. I hope that those watching from outside will be heartened by what they have heard. I think there is general agreement that this problem should be simplified, and the burden taken off policing.
I am interested to hear about redaction but, with bodycams and images, as well as the mass of data on items such as mobile phones, it is complicated. My noble friend the Minister mentioned that the Home Office and the Information Commissioner's Office were consulting with each other to reduce this pre-charge redaction burden. Perhaps he could write to me, or we could have a meeting to work it out. The challenge in all this is that we have a debate in which everybody agrees and then it all slows down again. Perhaps we can keep the momentum going by continuing discussions outside, involving the Police Federation as well. For now, I beg leave to withdraw the amendment.
My Lords, I will speak also to Amendment 140 and the submissions that Clauses 32 to 35 should not stand part. These amendments are designed to clarify the statutory objective of the new information commission; increase its arm’s-length relationship with the Government; allow effective judicial scrutiny of its regulatory function; allow not-for-profit organisations to lodge representative complaints; retain the Office of the Biometrics and Surveillance Camera Commissioner; and empower the Equality and Human Rights Commission to scrutinise the new information commission. The effective supervision and enforcement of data protection and the investigation and detection of offenders are crucial to achieve deterrence, prevent violations, maintain transparency and control options for redress against data misuse.
My Lords, I will speak to Amendments 142, 143 and 150 in my name, and I thank other noble Lords for their support.
We have spent considerable time across the digital Bills—the online safety, digital markets and data Bills—talking about the speed at which industry moves and the corresponding need for a more agile regulatory system. Sadly, we have not really got to the root of what that might look like. In the meantime, we have to make sure that regulators and Governments are asked to fulfil their duties in a timely manner.
Amendment 142 puts a timeframe on the creation of codes under the Act at 18 months. Data protection is a mature area of regulatory oversight, and 18 months is a long time for people to wait for the benefits that accrue to them under legislation. Similarly, Amendment 143 ensures that the transition period from the code being set to it being implemented is no more than 12 months. Together, that creates a minimum of two and a half years. In future legislation on digital matters, I would like to see a very different approach that starts with the outcome and gives companies 12 months to comply, in any way they like, to ensure that outcome. But while we remain in the world of statutory code creation, it must be bound by a timeframe.
I have seen time and again that, after the passage of a Bill, Parliament and civil society move on—including Ministers, key officials and those who work at the regulator—and codes lose their champions. It would be wonderful to imagine that matters progress as intended, but they do not. In the absence of champions, and without ongoing parliamentary scrutiny, codes can languish in the inboxes of people who have many calls on their time. Amendments 142 and 143 simply mirror what the Government agreed to in the OSA—it is a piece of good housekeeping to ensure continuity of attention.
I am conscious that I have spent most of my time highlighting areas where the Bill falls short, so I will take a moment to welcome the reporting provisions that the Government have put forward. Transparency is a critical aspect of effective oversight, and the introduction of an annual report on regulatory action would be a valuable source of information for all stakeholders with an interest in understanding the work of the ICO and its impact.
Amendment 150 proposes that those reporting obligations also include a requirement to provide details of all activities carried out by the Information Commissioner to support, strengthen and uphold the age-appropriate design code. It also proposes that, when meeting its general reporting obligations, the ICO should provide the information separately for children. The ICO published an evaluation of the AADC as a one-off in March 2023 and its code strategy on 3 April this year. I recognise the effort that the commissioner has made towards transparency, and the timing of his report indicates that reporting on children specifically is something that the ICO sees as relevant and useful. However, neither of those is sufficient in terms of the level of detail provided, the reporting cadence or the focus on impact rather than the efforts that the ICO has made.
There are many frustrations for those of us who spend our time advocating for children's privacy and safety. Among them is having to try to extrapolate child-specific data from generalised reporting. When data is not reported separately for children, that is usually to hide inadequacies in the level of protection afforded to them. For example, none of the community guidelines enforcement reports published for Instagram, YouTube, TikTok or Snap provides a breakdown of the violation rate data by age group, even though this would provide valuable information for academics, Governments, legislators and NGOs. Amendment 150 would go some way to addressing this gap by ensuring that the ICO is required to break down its reporting for children.
Having been momentarily positive, I would like to put on the record my concerns about the following extract from the email that accompanied the ICO’s children’s code strategy of 2 April. Having set out the very major changes to companies that the code has ushered in and explained how the Information Commissioner would spend the next few months looking at default settings, geolocation, profiling, targeting children and protecting under-13s, the email goes on to say:
“With the ongoing passage of the bill, our strategy deliberately focusses in the near term on compliance with the current code. However, once we have more clarity on the final version of the bill we will of course look to publicly signal intentions about our work on implementation and children’s privacy into the rest of the year and beyond”.
The use of the phrase “current code”, and the fact that the ICO has decided it is necessary to put its long-term enforcement strategy on hold, contradict government assurances that standards will remain the same.
The email from the ICO arrived in my inbox on the same day as a report from the US Institute of Digital Media and Child Development, which was accompanied by an impact assessment on the UK’s age-appropriate design code. It stated:
“The Institute’s review identifies an unprecedented wave of … changes made across leading social media and digital platforms, including YouTube, TikTok, Snapchat, Instagram, Amazon Marketplace, and Google Search. The changes, aimed at fostering a safer, more secure, and age-appropriate online environment, underscore the crucial role of regulation in improving the digital landscape for children and teens”.
In June, the Digital Futures Commission will be publishing a similar report written by the ex-Deputy Information Commissioner, Steve Wood, which has similarly positive but much more detailed findings. Meanwhile, we hear the steady drumbeat of adoption of the code in South America, Australia and Asia, and in additional US states following California’s lead. Experts in both the US and here in the UK evidence that this is a regulation that works to make digital services safer and better for children.
I therefore have to ask the Minister once again why the Government are downgrading child protection. If he, or those in the Box advising him, are even slightly tempted to say that they are not, I ask that they reread the debates from the last two days in Committee, in which the Government removed the balancing test for automated decision-making and the Secretary of State's powers were changed to have regard to children rather than to mandate child protections. The data impact assessment provisions have also been downgraded, among the other sleights of hand that diminish the AADC.
The ICO has gone on record to say that it has put its medium to long-term enforcement strategy on hold, and the Minister’s letter sent on the last day before recess says that the AADC will be updated to reflect the Bill. I would like nothing more than a proposal from the Government to put the AADC back on a firm footing. I echo the words said earlier by the noble Baroness, Lady Jones, that it is time to start talking and stop writing. I am afraid that, otherwise, I will be tabling amendments on Report that will test the appetite of the House for protecting children online. In the meantime, I hope the Minister will welcome and accept the very modest proposals in this group.
My Lords, as is so often the case on this subject, I support the noble Baroness, Lady Kidron, and the three amendments that I have added my name to: Amendments 142, 143 and 150. I will speak first to Amendments 142 and 143, and highlight a couple of issues that the noble Baroness, Lady Kidron, has already covered.
My Lords, I am grateful to the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, for tabling these amendments and raising important points about the Information Commissioner’s independence and authority to carry out his role efficiently. The amendments from the noble Lord, Lord Clement-Jones, range widely, and I have to say that I have more sympathy with some of them than others.
I start by welcoming some of the things in the Bill—I am very pleased to be able to do this. It is important that we have an independent regulator that is properly accountable to Parliament, and this is vital for a properly functioning data protection regime. We welcome a number of the changes that have been made to the ICO’s role in the Bill. In particular, we think the move to have a board and a chief executive model, with His Majesty appointing the chair of the board, is the right way to go. We also welcome the strengthening of enforcement powers and the obligation to establish stakeholder panels to inform the content of codes of practice. The noble Baroness, Lady Kidron, also highlighted that.
However, we share the concern of the noble Lord, Lord Clement-Jones, about the Secretary of State’s requirement every three years to publish a statement of strategic priorities for the commissioner to consider, respond to and have regard to. We share his view, and that of many stakeholder groups, that this crosses the line into political involvement and exposes the ICO to unwarranted political direction and manipulation. We do not believe that this wording provides sufficient safeguards from that in its current form.
I have listened carefully to the explanation of the noble Lord, Lord Clement-Jones, of Amendment 138. I understand his concern, but we are going in a slightly different direction to him on this. We believe that the reality is that the ICO does not have the resources to investigate every complaint. The commissioner needs to apply a degree of strategic prioritisation in the public interest. I think that the original wording in the Bill, rather than the noble Lord's amendment, achieves that objective more clearly.
Amendment 140, in the name of the noble Lord, Lord Clement-Jones, raises a significant point about businesses being given assured advice to ensure that they follow the procedures correctly, and we welcome that proposal. There is a role for leadership of the ICO in this regard. His proposal also addresses the Government’s concern that data controllers struggle to understand how they should be applying the rules. This is one of the reasons for many of the changes that we have considered up until now. I hope that the Minister will look favourably on this proposal and agree that we need to give more support to businesses in how they follow the procedures.
Finally, I have added my name to the amendment of the noble Baroness, Lady Kidron, which rightly puts a deadline on the production of any new codes of practice, and a deadline on the application of any transitional arrangements which apply in the meantime. We have started using the analogy of the codes losing their champions, and in general terms she is right. Therefore, it is useful to have a deadline, and that is important to ensure delivery. This seems eminently sensible, and I hope the Minister agrees with this too.
Amendment 150 from the noble Baroness, Lady Kidron, also requires the ICO annual report to spell out specifically the steps being taken to roll out the age-appropriate design code and to uphold children's data rights. Going back to the codes losing their champions, I am sure that the Minister got the message from the noble Baronesses, Lady Kidron and Lady Harding, that in this particular case this is not going to happen, and that this code and the drive to deliver it will be with us for some time to come.
The noble Baroness, Lady Kidron, raised concerns about the approach of the ICO, which need to be addressed. We do not want a short-term approach but a longer-term approach, and we want some guarantees that the ICO is going to address some of the bigger issues that are being raised by the age-appropriate design code and other codes. Given the huge interest in the application of children’s data rights in this and other Bills, I am sure that the Information Commissioner will want to focus his report on his achievements in this space. Nevertheless, for the avoidance of doubt, it is useful to have it in the Bill as a specific obligation, and I hope the Minister agrees with the proposal.
We have a patchwork of amendments here. I am strongly in support of some; on others, perhaps the noble Lord and I can debate further outside this Room. In the meantime, I am interested to hear what the Minister has to say.
I thank the noble Lord, Lord Clement-Jones, the noble Baroness, Lady Kidron, and other noble Lords who have tabled and signed amendments in this group. I also observe what a pleasure it is to be on a Committee with Batman and Robin—which I was not expecting to say, and which may be Hansard’s first mention of those two.
The reforms to the Information Commissioner’s Office within the Bill introduce a strategic framework of objectives and duties to provide context and clarity on the commissioner’s overarching objectives. The reforms also put best regulatory practice on to a statutory footing and bring the ICO’s responsibilities into line with those of other regulators.
With regard to Amendment 138, the principal objective upholds data protection in an outcomes-focused manner that highlights the discretion of the Information Commissioner in securing that objective, while reinforcing the primacy of data protection. The requirement to promote trust and confidence in the use of data will encourage innovation across current and emerging technologies.
I turn now to the question of Clause 32 standing part. As part of our further reforms, the Secretary of State can prepare a statement of strategic priorities for data protection, which positions these aims within the Government’s wider policy agenda, thereby giving the commissioner helpful context for their activities. While the commissioner must take the statement into account when carrying out their functions, they are not required to act in accordance with it. This means that the statement will not be used to direct what the commissioner may and may not do when carrying out their functions.
Turning to Amendment 140, we believe that the commissioner should have full discretion to enforce data protection in an independent, flexible, risk-based and proportionate manner. This amendment would tie the hands of the regulator and force them to give binding advice and proactive assurance without necessarily having full knowledge of the facts, undermining their regulatory enforcement role.
In response to the amendments concerning Clauses 33 to 35 standing part, I can say that we are introducing a series of measures to increase accountability, robustness and transparency in the codes of practice process, while safeguarding the Information Commissioner’s role. The requirements for impact assessments and panels of experts mean that the codes will consider the application to, and impact on, all potential use cases. Given that the codes will have the force of law, the Secretary of State must have the ability to give his or her comments. The Information Commissioner is required to consider but not to act on those comments, preserving the commissioner’s independence. It remains for Parliament to give approval for any statutory code produced.
Amendments 142 and 143 would impose a requirement on the ICO to prepare codes, and on the Secretary of State to lay them before Parliament, as quickly as practicable. They would also limit the time that transitional provisions can be in place to a maximum of 12 months. This could mean that drafting processes are truncated or valid concerns overlooked in order to hit a statutory deadline, rather than the codes being considered properly so as to reflect the relevant perspectives.
Given the importance of ensuring that any new codes are robust, comprehensive and considered, we do not consider imposing time limits on the production of codes to be a useful tool.
Finally, Amendment 150—
We had this debate during the passage of the Online Safety Act. In the end, we all agreed—the House, including the Government, came to the view—that two and a half years, which is 18 months plus a transition period, was an almost egregious amount of time considering the rate at which the digital world moves. So, to consider that more than two and a half years might be required seems a little bit strange.
I absolutely recognise the need for speed, and my noble friend Lady Harding made this point very powerfully as well, but what we are trying to do is juggle that need with the need to go through the process properly to design these things well. Let me take it away and think about it more, to make sure that we have the right balancing point. I very much see the need; it is a question of the machinery that produces the right outcome in the right timing.
Before the Minister sits down, I would very much welcome a meeting, as the noble Baroness, Lady Harding, suggested. I do not think it is useful for me to keep standing up and saying, “You are watering down the code”, and for the Minister to stand up and say, “Oh no, we’re not”. We are not in panto here, we are in Parliament, and it would be a fantastic use of all our time to sit down and work it out. I would like to believe that the Government are committed to data protection for children, because they have brought forward important legislation in this area. I would also like to believe that the Government are proud of a piece of legislation that has spread so far and wide—and been so impactful—and that they would not want to undermine it. On that basis, I ask the Minister to accede to the noble Baroness’s request.
I am very happy to try to find a way forward on this. Let me think about how best to take this forward.
My Lords, I thank the Minister for his response and, in particular, for that exchange. There is a bit of a contrast here—the mood of the Committee is probably to go with the grain of these clauses and see whether they can be improved, rather than to throw out the idea of an information commission and revert to the ICO, on the basis that perhaps an information commission is a more logical way of setting up a regulator. I am not sure that I personally agree, but I understand the reservations of the noble Baroness, Lady Jones, and I welcome her support on the question of the Secretary of State’s powers.
We keep being reassured by the Minister, in all sorts of different ways. I am sure that the spirit is willing, but whether it is all in black and white is the big question. Where are the real safeguards? The proposals in this group from the noble Baroness, Lady Kidron, to which she has spoken so well, along with the noble Baroness, Lady Harding, are very modest, to use her own phrase. I hope those discussions will take place because they fit entirely with the architecture of the Bill, which the Government have set out, and it would be a huge reassurance to those who believe that the Bill is watering down data subject rights and is not strengthening children’s rights.
I am less reassured by other aspects of what the Minister had to say, particularly about the Secretary of State’s powers in relation to the codes. As the noble Baroness, Lady Kidron, said, we had a lot of discussion about that in relation to the Ofcom codes, under the Online Safety Bill, and I do not think we got very far on that either. Nevertheless, there is disquiet about whether the Secretary of State should have those powers. The Minister said that the ICO is not required to act in accordance with the advice of the Secretary of State so perhaps the Minister has provided a chink of light. In the meantime, I beg leave to withdraw the amendment.
My Lords, Amendment 146 is in my name and those of the noble Lord, Lord Clement-Jones, and the noble Baronesses, Lady Harding and Lady Jones; I thank them all for their support. Before I set out the amendment that would provide a code of practice for edtech and why it is so urgently required, I thank the noble Baroness, Lady Barran, and officials in the Department for Education for their engagement on this issue. I hope the Minister can approach this issue with the same desire that they have shown to fill the gap that the amendment seeks to address.
A child does not have a choice about whether they go to school. For those who do not fall into the minority who are homeschooled or who, for reasons of health or development, fall outside the education system, it is compulsory. The reason I make this point at the outset is that, if school is compulsory, it must follow that a child should enjoy the same level of privacy and safety at school as they do in any other environment. Yet we have allowed a gap in our data legislation, meaning that a child’s data is unprotected at school, while at the same time we have invested in an unregulated and uncertified edtech market built on promises of learning outcomes that range from the unsubstantiated to the false.
Schools are keen to adopt new technologies and say that they feel pressure to do so. In both cases, they lack the knowledge and time to assess the privacy and safety risks of the technology products that they are being sold. Amendment 146 would enable children and schools to benefit from emerging technologies. It would reduce the burden on schools in ensuring compliance so that they can get on with the job of teaching our children in a safe, developmentally appropriate and rights-respecting environment, and it would deal with companies that fail to provide evidence for their products and routinely exploit the complexity of data protection law to children’s detriment. In sum, the amendment brings forward a code of practice for edtech.
Subsections (1) and (2) would require the ICO to bring forward a data code for edtech and tech used in education settings. In doing so, the commissioner would be required to consider children’s fundamental rights, as set out in the Convention on the Rights of the Child, and their relevance to the digital world, as adopted by the Committee on the Rights of the Child in general comment 25 in 2021. The commissioner would have to consider the fact that children are legally entitled to a higher standard of protection in respect of their personal data than adults. In keeping with other data codes, the amendment also sets out whom the ICO must consult when preparing the code, including children, parents and teachers, as well as edtech companies.
Subsection (3) would require edtech companies to provide schools with transparent information about their data-processing practices and their impact on children. This is of particular importance because the department’s own consultation showed that schools struggle to understand the implications of being a data controller and most often accept the default settings of products and services. Having a code of practice would allow the Information Commissioner not only to set the standards in subsections (1) and (2) but to insist on the way that information is given, in order to support schools in making the right choices for their pupils.
Subsection (4) would allow schools to use edtech providers’ adherence to the code as proof of fulfilling their own data protection duties. Once again, this would alleviate the burden on teachers and school leaders.
Subsection (5) would simply give the commissioner a role in supporting a certification scheme to enable the industry to demonstrate both the compliance of edtech services and products with the UK GDPR and their conformity with the age-appropriate design code of practice and the edtech code of practice. The IEEE Standards Association and For Humanity have published certification standards for the AADC, but these have not yet been approved by the ICO or UKAS. Subsection (5) would act as a catalyst, ensuring that the ICO and the certification partners work together efficiently. Ultimately, schools will respond better to certification than to pure data law.
If the edtech sector were formally in scope of the AADC and it were robustly applied, that would do some, though not all, of what the amendment seeks to do. But in 2018, Her Majesty’s Government, as they were then, made the decision that schools are responsible for children and that the AADC would be confusing. I am not sure that the Government of the day understood the AADC: it requires companies to offer children privacy by design and default. Nothing in the code would have infringed—or will infringe—on a school’s safeguarding duties, but leaving schools out of scope leaves teachers or school data protection officers with vast responsibilities for wilfully leaky products that simply should not fall to them. Many in this House thought that the Government were wrong, and since then we have seen grand abuse of the gap that was created. This is an opportunity to put that error right.
My Lords, I rise once again in my Robin role to support the noble Baroness, Lady Kidron, on this amendment. On 23 November last year, the noble Baroness brought a debate on this very issue of edtech. Rather than repeat all the points that were made in that very useful debate, I point my noble friend the Minister to it.
I would just like to highlight a couple of quick points. First, in supporting this amendment, I am not anti-edtech in any way, shape or form. It is absolutely clear that technology can bring huge benefits to students of all ages, but it is also clear that education is not unique. It is exactly like every other part of society: where technology brings benefit, it also brings substantial risk. We are learning the hard way that it is a mistake to think that any element of society can mitigate the risks of technology without legal guardrails.
We have seen really clearly with the age-appropriate design code that commercial organisations operating under its purview changed the way they protected children’s data as a result. The absence of an equivalent code for the edtech sector means that the sector has not had those same benefits. If we bring edtech into scope, either through this amendment or simply by extending the age-appropriate design code, I would hazard a strong guess that we would start to see very real improvements in the protection of children’s data.
In the debate on 23 November, I asked my noble friend the Minister, the noble Baroness, Lady Barran, why the age-appropriate design code did not include education. I am not an expert in education, by any stretch of the imagination. The answer I received was that it was okay because the Keeping Children Safe in Education framework covered edtech. Since that debate, I have had a chance to read that framework, and I cannot find a section in it that specifically addresses children’s data. There is lots of really important material in it, but there is no clearly signposted section in that regard. So even if all the work fell on schools, that framework on its own, as published on GOV.UK, does not seem to meet the standards of a framework for data protection for children in education. However, as the noble Baroness, Lady Kidron, said, this is not just about schools’ responsibility but also about edtech companies’ responsibility, and it is clear that there is no section on that in the Keeping Children Safe in Education framework either.
The answer that we received last year in this House does not do justice to the real question: in the absence of a specific code—the age-appropriate design code or a dedicated edtech code—how can we be confident that the guardrails we know we need in every sector are really in place in this most precious and important sector, where we teach our children?
My Lords, I am absolutely delighted to be able to support this amendment. Like the noble Baroness, Lady Harding, I am not anti-edtech at all. I did not take part in the debate last year, but listening to the noble Baroness, Lady Kidron, and having read the excellent A Blueprint for Education Data from the 5Rights Foundation and the Digital Futures for Children brief in support of a code of practice for education technology, I submit that it is chilling to hear what is happening with edtech as we speak: data is being extracted and data protection law is not being properly complied with.
I got involved some years ago with the advisory board of the Institute for Ethical AI in Education, which Sir Anthony Seldon set up with Professor Rose Luckin and Priya Lakhani. Our intention was slightly broader—it was designed to create a framework for the use of AI specifically in education. Of course, one of the very important elements was the use of data, and the safe use of data, both by those procuring AI systems and by those developing them and selling them into schools. That was in 2020 and 2021, and we have not moved nearly far enough since that time. Obviously, this debate is data-specific, because we are talking about the data protection Bill, but what is being proposed here would cure some of the issues that are staring us in the face.
As we have been briefed by Digital Futures for Children, and as the noble Baroness, Lady Kidron, emphasised, there is widespread invasion of children’s privacy in data collection. Sometimes there is little evidence to support the claimed learning benefits, while schools and parents lack the technical and legal expertise to understand what data is collected. As has been emphasised throughout the passage of this Bill, children deserve the highest standards of privacy and data protection—especially in education, of course.
From this direction, I wholly support what the noble Baroness, Lady Kidron, is proposing, so well supported by the noble Baroness, Lady Harding. Given that it again appears that the Government gave an undertaking to bring forward a suitable code of practice but have not done so, there is double reason to want to move forward on this during the passage of the Bill. We very much support Amendment 146 on that basis.
My Lords, I have added my name to Amendment 146 in the name of the noble Baroness, Lady Kidron, and I thank all noble Lords who have spoken.
These days, most children learn to swipe an iPad long before they learn to ride a bike. They are accessing the internet at ever younger ages on a multitude of devices. Children are choosing to spend more time online, browsing social media, playing games and using apps. However, we also force children to spend an increasing amount of time online for their education. A growing trend over the last decade or more, this escalated during the pandemic. Screen time at home became lesson time; it was a vital educational lifeline for many in lockdown.
Like other noble Lords, I am not against edtech, but the reality is that the necessary speed of the transition meant that insufficient regard was paid to children’s rights and the data practices of edtech. The noble Baroness, Lady Kidron, as ever, has given us a catalogue of abuses of children’s data which have already taken place in schools, so there is a degree of urgency about this, and Amendment 146 seeks to rectify the situation.
One in five UK internet users are children. Schools are assessing their work online; teachers are using online resources and recording enormous amounts of sensitive data about every pupil. Edtech companies have identified that such a large and captive population is potentially profitable. This amendment reinforces that children are also a vulnerable population and that we must safeguard their data and personal information on this basis. Their rights should not be traded in as the edtech companies chase profits.
The code of practice proposed in this amendment establishes standards for companies to follow, in line with the fundamental rights and freedoms set out in the UN Convention on the Rights of the Child. It asserts that children are entitled to a higher degree of protection than adults in the digital realm. It would oblige the commissioner to prepare a code of practice which ensures this. It underlines that consultation with individuals and organisations who have the best interests of children at heart is vital, so that the enormous edtech companies cannot bamboozle already overstretched teachers and school leaders.
In education, data has always been processed from children in school. It is necessary for the school’s functioning and to monitor the educational development of individual children. Edtech is now becoming a permanent fixture in children’s schooling and education, but it is largely untested, unregulated and unaccountable. Currently, it is impossible to know what data is collected by edtech providers and how they are using it. This blurs the boundaries between the privacy-preserving and commercial parts of services profiting from children’s data.
Why is this important? First, education data can reveal particularly sensitive and protected characteristics about children: their ethnicity, religion, disability or health status. Such data can also be used to create algorithms that profile children and predict or assess their academic ability and performance; it could reinforce prejudice, create siloed populations or entrench low expectations. Secondly, there is a risk that data-profiling children can lead to deterministic outcomes, defining too early what subjects a child is good at, how creative they are and what they are interested in. Safeguards must be put in place in relation to the processing of children’s personal data in schools to protect those fundamental rights. Thirdly, of course, there is money. Data is appreciating in value, resulting in market pressure for data to be collected, processed, shared and reused. Increasingly, the processing of such data from children in schools is facilitated by edtech, an already major and expanding sector with a projected value of £3.4 billion.
The growth of edtech’s use in schools is promoted by the Department for Education’s edtech strategy, which sets out a vision for edtech to be an
“inseparable thread woven throughout the processes of teaching and learning”.
Yet the strategy gives little weight to data protection beyond noting the importance of preventing data breaches. Tech giants have become the biggest companies in the world because they own data on us. Schoolchildren have little choice as to their involvement with these companies in the classroom, so we have a moral duty to ensure that they are protected, not commodified or exploited, when learning. It must be a priority for the Government to keep emerging technologies in education under regular review.
Equally important is that the ICO should invest in expertise specific to the domain of education. By regularly reviewing emerging technologies in education—those already in use and those proposed for use—and their potential risks and impacts, such experts could provide clear and timely guidance for schools to protect individual children and entire cohorts. Amendment 146 would introduce a new code of practice on the processing and use of children’s data by edtech providers. It would also ensure that edtech providers meet their obligations under the law, protect children’s data and empower schools.
I was pleased to hear that the noble Baroness, Lady Kidron, has had constructive discussions with the Education Minister, the noble Baroness, Lady Barran. The way forward on this matter is some sort of joint work between the two departments. The noble Baroness, Lady Kidron, said that she hopes the Minister today will respond with equal positivity; he could start by supporting the principles of this amendment. Beyond that, I hope that he will agree to liaise with the Department for Education and embrace the noble Baroness’s request for more meetings to discuss this issue on a joint basis.
I am grateful, as ever, to the noble Baroness, Lady Kidron, for both Amendment 146 and her continued work in championing the protection of children.
Let me start by saying that the Government strongly agree with the noble Baroness that all providers of edtech services must comply with the law when collecting and making decisions about the use of children’s data throughout the duration of their processing activities. That said, I respectfully submit that this amendment is not necessary, for the reasons I shall set out.
The ICO already has codes and guidance for children and has set out guidance on how the children’s code, data protection and e-privacy legislation apply to edtech providers. Although the Government recognise the value that ICO codes can have in promoting good practice and improving compliance, they do not consider that it would be appropriate to add these provisions to the Bill without further detailed consultation with the ICO and the organisations likely to be affected by them.
The guidance covers broad topics, including choosing a lawful basis for the processing; rules around information society services; targeting children with marketing; profiling children or making automated decisions about them; data sharing; children’s data rights; and exemptions relating to children’s data. Separately, as we have discussed throughout this debate, the age-appropriate design code deals specifically with the provision of online services likely to be accessed by children in the UK; this includes online edtech services. I am pleased to say that the Department for Education has begun discussions with commercial specialists to look at strengthening the contractual clauses relating to the procurement of edtech resources to ensure that they comply with the standards set out in the UK GDPR and the age-appropriate design code.
On the subject of requiring the ICO to develop a report with the edtech sector, with a view to creating a certification scheme and assessing compliance and conformity with data protection, we believe that such an approach should be at the discretion of the independent regulator.
The issues that have been raised in this very good, short debate are deeply important. Edtech is an issue that the Government are considering carefully—especially the Department for Education, given the increasing time spent online for education. I note that the DPA 2018 already contains a power for the Secretary of State to request new codes of practice, which could include one on edtech if the evidence warranted it. I would be happy to return to this in future but consider the amendment unnecessary at this time. For the reasons I have set out, I am not able to accept the amendment and hope that the noble Baroness will withdraw it.
I thank everyone who spoke, particularly for making it absolutely clear that not one of us, including myself, is against edtech. We just want it to be fair and want the rules to be adequate.
I am particularly grateful to the noble Baroness, Lady Jones, for detailing what education data includes. It might feel as though it is just about someone’s exam results or something that might already be public, but it can include things such as how often they go to see the nurse, what their parents’ immigration status is or whether they are late. There is a lot of information quite apart from the personalised education provision to which the noble Baroness referred. In fact, we have a great deal of emerging evidence that such provision has no pedagogical basis. There is also the question of huge investment right across the sector in products about which we know very little. I thank the noble Baroness for that.
As to the Minister’s response, I hope that he will forgive me for being disappointed. I am grateful to him for reminding us that the Secretary of State has that power under the DPA 2018. I would love for her to use that power but, so far, it has not been forthcoming. The evidence we saw from the freedom of information request is that the scheme the department wanted to put in place has been totally retracted—and clearly for resource reasons rather than because it is not needed. I find it quite surprising that the Minister can suggest that all is well here in the UK, as though Germany, Holland, France, et cetera were being hysterical about this issue; each one of them has found it to be egregious.
Finally, the AADC applies only to information society services; there is an exception for education. Where they are joint controllers, the providers outsource the problems to the schools, which have no expertise in this and simply accept the default settings. It is not good enough, I am afraid. I feel bound to say this: I understand the needs of parliamentary business, which puts just a handful of us in this Room to discuss things out of sight, but, if the Government are not willing to protect children’s data at school, when they are in loco parentis to our children, I am really bewildered as to what this Bill is for. Education is widely understood to be a social good, but we are downgrading the data protections for children and rejecting every single positive move that anybody has made in Committee. I beg leave to withdraw my amendment, but I will bring this back on Report.