Grand Committee

My Lords, a key aspect of data protection rests in how it restricts the use of personal data once it has been collected. The public need confidence that their data will be used for the purposes for which they shared it and not further used in ways that breach their legitimate expectations, or they will become suspicious about providing their data at all. The underlying theme that we heard on the previous group was the danger of losing public trust, which very much applies in the area of law enforcement and national security.
However, Schedules 4 and 5 would remove the requirement to consider the legitimate expectations of the individuals whose data is being processed, or the impact that this would have on their rights, for the purposes of national security, crime detection and prevention, safeguarding or answering to a request by a public authority. Data used for the purposes listed in these schedules would not need to undergo either a balancing test under Article 6.1(f) or a compatibility test under Article 6.4 of the UK GDPR. The combined effect of these provisions would be to authorise almost unconditional data sharing for law enforcement and other public security purposes while, at the same time, reducing accountability and traceability over how the police use the information being shared with them.
As with the previous DPDI Bill, Clauses 87 to 89 of this Bill grant the Home Secretary and police powers to view and use people’s personal data through the use of national security certificates and designation notices, which are substantially the same as Clauses 28 to 30 of the previous DPDI Bill. This risks further eroding trust in law enforcement authorities. Accountability for access to data for law enforcement purposes should not be lowered, and data sharing should be underpinned by a robust test to ensure that individuals’ rights and expectations are not disproportionately impacted. It is baffling that the Government are so slavishly following their predecessor and believe that these new and unaccountable powers are necessary.
By opposing that Clause 81 stand part, I seek to retain the requirement for police forces to record the reason they are accessing data from a police database. The public need more, not less, transparency and accountability over how, why and when police staff and officers access and use records about them. Just recently, the Met Police admitted that they investigated more than 100 staff over the inappropriate accessing of information in relation to Sarah Everard. This shows that the police can and do access information inappropriately, and there may well be less prominent cases where police abuse their power by accessing information without fear of the consequences.
Regarding Amendments 126, 128 and 129, Rights and Security International has repeatedly argued that the Bill would violate the UK’s obligations under the European Convention on Human Rights. On Amendment 126, the requirements in the EU law enforcement directive for logging are, principally, to capture in all cases the justification for personal data being examined, copied, amended or disclosed when it is processed for a law enforcement process—the objective is clearly to ensure that data is processed only for a legitimate purpose—and, secondarily, to identify when, how and by whom the data has been accessed or disclosed. This ensures that individual accountability is captured and recorded.
Law enforcement systems in use in the UK typically capture some of the latter information in logs, but very rarely do they capture the former. Nor, I am informed, do many commodity IT solutions on the market capture why data was accessed or amended by default. For this reason, a long period of time was allowed under the law enforcement directive to modify legacy systems installed before May 2016, which, in the UK, included services such as the police national computer and the police national database, along with many others at a force level. This transitional relief extended to 6 May 2023, but UK law enforcement did not, in general, make the required changes. Nor, it seems, did it ensure that all IT systems procured after 6 May 2016 included a strict requirement for LED-aligned logging. By adopting and using commodity and hyperscaler cloud services, it has exacerbated this problem.
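The distinction drawn above between the two kinds of logging required by the directive, individual accountability (who, when and how) and purpose (why), can be sketched as a minimal audit-log record. This is purely illustrative: the field names and values are assumptions for the sake of the example and are not drawn from any real police system or product.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class AccessLogEntry:
    """One audit record per access to a personal-data record."""
    officer_id: str     # who accessed the record (individual accountability)
    timestamp: str      # when the access took place
    operation: str      # how: e.g. "consult", "disclose", "amend", "erase"
    record_ref: str     # which personal-data record was touched
    justification: str  # why: the element commodity systems rarely capture

def log_access(officer_id: str, operation: str,
               record_ref: str, justification: str) -> dict:
    # A compliant system would refuse the access if no justification
    # is supplied, rather than merely recording its absence.
    if not justification.strip():
        raise ValueError("access without a recorded justification is refused")
    entry = AccessLogEntry(
        officer_id=officer_id,
        timestamp=datetime.now(timezone.utc).isoformat(),
        operation=operation,
        record_ref=record_ref,
        justification=justification,
    )
    return asdict(entry)

entry = log_access("PC-1234", "consult", "PNC/2024/000987",
                   "Check subject against warrant list for case CR/55/21")
```

The point of the sketch is that the "why" field must be captured at the moment of access; the speech's complaint is precisely that most deployed systems log only the first four fields.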
In early April 2023, the Data Protection Act 2018 (Transitional Provision) Regulations 2023 were laid before Parliament. These regulations had the effect of unilaterally extending the transitional relief period under the law enforcement directive for the UK from May 2023 to May 2026. The Government now wish to strike the requirement to capture the justification for any access to data completely, on the basis that this would free up to 1.5 million hours a year of valuable police time for our officers so that they can focus on tackling crime on our streets, rather than being bogged down by administration, and that this would save approximately £42.8 million per year in taxpayers’ money.
This is a serious legislative issue on two counts: it removes important evidence that may identify whether a person was acting with malicious intent when accessing data, as well as removing any deterrent effect of them having to do so; and it directly deviates from a core part of the law enforcement directive and will clearly have an impact on UK data adequacy. The application of effective control over access to data is very much a live issue in policing, and changing the logging requirement in this way does nothing to improve police data management. Rather, it excuses and perpetuates bad practice. Nor does it increase public confidence.
Clause 87(7) introduces new Section 78A into the Act. This lays down a number of exemptions and exclusions from Part 3 of that Act when the processing is deemed to be in the interests of national security. These exemptions are wide ranging, and include the ability to suspend or ignore principles 2 through 6 in Part 3, and thus run directly contrary to the provisions and expectations of the EU law enforcement directive. Ignoring those principles in itself also negates many of the controls and clauses across the whole of Part 3. As a result, these provisions will almost certainly lead to the immediate loss of EU law-enforcement adequacy.
I welcome the ministerial letter from the noble Lord, Lord Hanson of Flint, to the noble Lord, Lord Anderson, of 6 November, but was he really saying that all the national security exemption clause does is bring the 2018 Act into conformity with the GDPR? I very much hope that the Minister will set out for the record whether that is really the case and whether it is really necessary to safeguard national security. Although it is, of course, appropriate and necessary for the UK to protect its national security interests, it is imperative that balance remains to protect the rights of a data subject. These proposals do not, as far as we can see, strike that balance.
Clause 88 introduces the ability of law enforcement, competent authorities and intelligence agencies to act as joint controllers in some circumstances. If Clause 88 and associated clauses go forward to become law, they will almost certainly again result in withdrawal of UK law enforcement adequacy and will quite likely impact on the TCA itself.
Amendment 127 is designed to bring attention to the fact that there are systemic issues with UK law enforcement’s new use of hyperscaler cloud service providers to process personal data. These issues stem from the fact that service providers’ standard contracts and terms of service fail to meet the requirements of Part 3 of the UK’s Data Protection Act 2018 and the EU law enforcement directive. UK law enforcement agencies are subject to stringent data protection laws, including Part 3 of the DPA and the GDPR. These laws dictate how personal data, including that of victims, witnesses, suspects and offenders, can be processed. Part 3 specifically addresses data transfers to third countries, with a presumption against such transfers unless strictly necessary. This contrasts with UK GDPR, which allows routine overseas data transfer with appropriate safeguards.
Cloud service providers routinely process data outside the UK and lack the necessary contractual guarantees and legal undertakings required by Part 3 of the DPA. As a result, their use for law enforcement data processing is, on the face of it, not lawful. This non-compliance creates significant financial exposure for the UK, including potential compensation claims from data subjects for distress or loss. The sheer volume of data processed by law enforcement, particularly body-worn video footage, exacerbates the financial risk. If only a small percentage of cases result in claims, the compensation burden could reach hundreds of millions of pounds annually. The Government’s attempts to change the law highlight the issue and suggest that past processing on cloud service providers has not been in conformity with the UK GDPR and the DPA.
The current effect of Section 73(4)(b) of the Data Protection Act is to prevent competent authorities, which may have a legitimate operational need and should possess the internal capability to assess that need, from making transfers to recipients that are not relevant authorities or international organisations, such as a cloud service provider. This amendment is designed to probe what impact removal of this restriction would have and whether it would enable them to make such transfers where justified and necessary. I beg to move.
My Lords, I will speak to Amendment 124. I am sorry that I was not able to speak on this issue at Second Reading. I am grateful to the noble and learned Lord, Lord Thomas of Cwmgiedd, for his support, and I am sorry that he has not been able to stay, due to a prior engagement.
Eagle-eyed Ministers and the Opposition Front Bench will recognise that this was originally tabled as an amendment to the Data Protection and Digital Information (No. 2) Bill. It is still supported by the Police Federation. I am grateful to the former Member of Parliament for Loughborough for originally raising this with me, and I thank the Police Federation for its assistance in briefing us in preparing this draft clause. The Police Federation understands that the Home Secretary is supportive of the objective of this amendment, so I shall listen with great interest to what the Minister has to say.
This is a discrete amendment designed to address an extremely burdensome and potentially unnecessary redaction exercise, in relation to a situation where the police are preparing a case file for submission to the Crown Prosecution Service for a charging decision. Given that this issue was talked about in the prior Bill, I do not intend to go into huge amounts of detail because we rehearsed the arguments there, but I hope very much that with the new Government there might be a willingness to entertain this as a change in the law.
Lords Chamber

My Lords, I agree with my noble friend that we must protect the UK’s democratic integrity. Our Defending Democracy Taskforce safeguards our democratic institutions and processes from threats, including misinformation and disinformation. Sharing best practice and strategic insights with international partners helps industry and Government to protect our democracy from media threats. Under the Online Safety Act, companies must act against illegal content, including the incitement of violence, hate speech and state-backed disinformation, and remove it. Where hateful content or misinformation and disinformation are prohibited in the largest platforms’ terms of service, they must remove it.
My Lords, false information is as likely to be spread through online platforms with smaller numbers of users as those with many users. We have heard about the role of Telegram in spreading disinformation about this summer’s disorder, as well as the terrible suicide forums. I was very pleased to see the Secretary of State’s letter to Ofcom this week on “small but risky” online services. Will the Minister meet me to discuss the issue of platform categorisation, given the amendment I proposed to the then Online Safety Bill, which this House passed in July 2023?
My Lords, of course I am very happy to meet the noble Baroness to discuss this further, and I pay tribute to the work she has done on this issue in the past. On “small but risky” services, as she knows, the Secretary of State has written to Melanie Dawes, the CEO of Ofcom, and a very detailed reply was received today from Ofcom. We are still absorbing everything that it is proposing, but it is clear that it is taking this issue very seriously. That will give us the focus for our discussion when we meet.
Grand Committee

My Lords, it is a pleasure to speak here this afternoon. I apologise to the Committee for not being able to speak at Second Reading. I declare my interest as the founder and trustee of a mental health charity in Leicestershire, the Loughborough Wellbeing Centre.
It will not surprise my noble friend the Minister, I suspect, to know that this is a probing amendment. However, given that we are debating in this part of the Bill the enforcement of consumer protection, the matter that I raise relates directly to the greatest harm that a consumer can suffer: their death.
In June 2022, I asked my noble friend Lord Parkinson the following Oral Question: what plans do
“Her Majesty’s Government … have to address online retailers’ algorithmic recommendations for products that can be used for the purposes of suicide”?
At the time, the most obvious Bill to address this matter was the Online Safety Bill, which, as we know, focused on harmful content in particular. In my follow-up question, I said:
“When a particular well-known suicide manual is searched for on Amazon, the site’s algorithmic recommendations then specifically suggest material that can be used, or easily assembled, into a device intended to take one’s own life. If this is not to be regulated as harmful content under the Online Safety Bill, how can this sort of harm be regulated?”—[Official Report, 27/6/22; col. 434.]
This amendment is particularly close to my heart because, sadly, when I was a Member of Parliament, a constituent bought a manual on Amazon then completed suicide. The amendment would amend Clause 149 by expanding the specified prohibition condition definition by adding a commercial practice that
“targets consumers with marketing material for products intended to be used by that person to take their own life.”
I am grateful to the Mental Health Foundation for its support with this amendment.
Even today, Amazon continues to algorithmically recommend products that can be used to take one’s own life to users viewing suicide manuals online. To be specific, users searching for a suicide manual will be recommended specific materials that are touted as being highly effective and painless ways to take one’s own life. Amazon facilitates users purchasing the key items that they need, from instructions to materials, in a few clicks. I would like to think that this is not intentional.
In the overwhelming majority of cases, such automatic recommendation will be harmless and will help consumers to find products that might interest them. However, in this instance, a usually harmless algorithm is functioning to provide people with material that they may use to end their own lives. This risk is not just theoretical. Amazon is recommending products that there have been concerted public health efforts to address in this country and which are known to have caused deaths. So as not to make them better known, I will not name them.
It is particularly important that Amazon ceases to highlight novel suicide methods, as its recommendation algorithm currently does by recommending products to users. There is clear evidence that, when a particular suicide method becomes better known, the effect is not simply that suicidal people switch from one intended method to the novel one but that suicide occurs in people who would not otherwise have taken their own lives. This probing amendment is intended to draw the Government’s attention to this concerning issue. I have spoken about Amazon today given its position in the market and its known bad practice in this area, but the principle of course goes beyond Amazon. New retailers may well emerge in the future and a principle should be established that this type of behaviour is not acceptable.
While I suspect that my noble friend the Minister is going to tell me that the Bill is not the right place for this amendment, I hope that he will agree that a crackdown on these harmful algorithmic recommendations to protect consumers—it was the word “consumers” that meant that it was not suitable for the Online Safety Bill—is needed, in the spirit of consumer protection sought in the Bill. I hope that, at the very least, he will agree to meet me to discuss this further and to help me to raise it with the relevant department, if it is not his. I beg to move Amendment 110.
My Lords, I have one amendment in this group, Amendment 110A, which will be echoed in subsequent groups as part of a general concern about making sure that trading standards are an effective body in the UK and are able to do what they are supposed to do to look after consumers.
As the Minister will know, because we were part of the same conversation, the CMA is concerned that trading standards may have been reduced to the point where they are not as effective as they ought to be. Looking at some of the local cuts—in Enfield, for instance, four officers have been cut down to one—and listening to various people involved in trading standards, there is a general concern that, as they are set up and funded at the moment, they are not able to perform the role that they should be. Given the importance that enforcers have in the structure that the Government are putting together, I am asking in this amendment that the Government review that effectiveness, take a serious look at the structures that they have created and their capability of performing as they would wish under the Bill and report within a reasonable period.
My Lords, I am grateful to my noble friend the Minister for his response, which I will come back to in a moment.
I thank the noble Lords, Lord Clement-Jones and Lord Bassam, for their support for my amendment. It is small but, I hope, would be highly effective if it were accepted. The noble Lord, Lord Clement-Jones, and I spent a long time debating the Online Safety Act last year. It is clear that online marketplaces are not covered. My noble friend the Minister mentioned user-to-user sites and search engines. They are obviously online marketplaces and highly significant businesses—I have mentioned Amazon but there are others—and I do not think the Department for Business and Trade should be agnostic about harmful materials sold on these sites.
I thank the noble Lords who have spoken on Amendment 110 for the sensitivity that they have shown on this difficult topic. I am grateful to my noble friend for the offer of a meeting to look at the scope of the Bill before Report. I will of course withdraw Amendment 110 at this stage, but I look forward to that meeting and further discussions on this important topic.