Data (Use and Access) Bill [HL] Debate
Viscount Camrose (Conservative - Excepted Hereditary)
Lords Chamber
My Lords, I thank the noble Lord, Lord Clement-Jones, for raising these significant issues. While I share some of the concerns expressed, I find myself unable—at least for the moment—to offer support for the amendments in their current form.
Amendment 17 seeks to remove the powers granted to the Secretary of State to override primary legislation and to modify aspects of UK data protection law via statutory instrument. I agree with the principle underpinning this amendment: that any changes to data protection law must be subject to appropriate scrutiny. It is essential that parliamentary oversight remains robust and meaningful, particularly when it comes to matters as sensitive and far-reaching as data protection.
However, my hesitation lies in the practical implications of the amendment. While I sympathise with the call for greater transparency, I would welcome more detail on how this oversight mechanism might work in practice. Would it involve enhanced scrutiny procedures or a stronger role for relevant parliamentary committees? I fear that, without this clarity, we risk creating uncertainty in an area that requires, above all, precision and confidence.
The Minister’s Amendment 18 inserts specific protections for children’s personal data into the UK GDPR framework. The Government have rightly emphasised the importance of safeguarding children in the digital age. I commend the intention behind the amendment and agree wholeheartedly that children deserve special protections when it comes to the processing of their personal data.
It is worth noting that this is a government amendment to their own Bill. While Governments amending their own legislation is not unprecedented—the previous Government may have indulged in the practice from time to time—it is a practice that can give rise to questions. I will leave my comments there; obviously it is not ideal, but these things happen.
Finally, Amendment 21, also tabled by the noble Lord, Lord Clement-Jones, mirrors Amendment 17 in seeking to curtail the Secretary of State’s powers to amend primary legislation via statutory instrument. My earlier comments on the importance of parliamentary oversight apply here. As with Amendment 17, I am of course supportive of the principle. The delegation of such significant powers to the Executive should not proceed without robust scrutiny. However, I would appreciate greater clarity on how this proposed mechanism would function in practice. As it stands, I fear that the amendment raises too many questions. If these concerns could be addressed, I would be most grateful.
In conclusion, these amendments raise important points about the balance of power between the Executive and Parliament, as well as the protection of vulnerable individuals in the digital sphere. I look forward to hearing more detail and clarity, so that we can move forward with confidence.
My Lords, government Amendment 18 is similar to government Amendment 40 in the previous group, which added an express reference to children meriting specific protection to the new ICO duty. This amendment will give further emphasis to the need for the Secretary of State to consider the fact that children merit specific protection when deciding whether to use powers to amend the list of recognised legitimate interests.
Turning to Amendment 17 from the noble Lord, Lord Clement-Jones, I understand the concerns that have been raised about the Secretary of State’s power to add or vary the list of recognised legitimate interests. This amendment seeks to remove the power from the Bill.
In response to some of the earlier comments, including from the committees, I want to make it clear that we have constrained these powers more tightly than they were in the previous data Bill. Before making any changes, the Secretary of State must consider the rights and freedoms of individuals, paying particular attention to children, who may be less aware of the risks associated with data processing. Furthermore, any addition to the list must meet strict criteria, ensuring that it serves a clear and necessary public interest objective as described in Article 23.1 of the UK GDPR.
The Secretary of State is required to consult the Information Commissioner and other stakeholders before making any changes, and any regulations must then undergo the affirmative resolution procedure, guaranteeing parliamentary scrutiny through debates in both Houses. Retaining this regulation-making power would allow the Government to respond quickly if future public interest activities are identified that should be added to the list of recognised legitimate interests. However, the robust safeguards and limitations in Clause 70 will ensure that these powers are used both sparingly and responsibly.
I turn now to Amendment 21. As was set out in Committee, there is already a relevant power in the current Data Protection Act to provide exceptions. We are relocating the existing exemptions, so the current power, in so far as it relates to the purpose limitation principle, will no longer be relevant. The power in Clause 71 is intended to take its place. In seeking to reassure noble Lords, I want to reiterate that the power cannot be used for purposes other than the public interest objectives listed in Article 23.1 of the UK GDPR. It is vital that the Government can act quickly to ensure that public interest processing is not blocked. If an exemption is misused, the power will also ensure that action can be swiftly taken to protect data subjects by placing extra safeguards or limitations on it.
My Lords, as we reach the end of this important group, I particularly thank my noble friend Lady Harding for her contribution and detailed account of some of the issues being faced, which I found both interesting and valuable. I thought the example about the jazz concert requiring the combination of those different types of data was very illuminating. These proposed changes provide us with the opportunity to carefully balance economic growth with the fundamental right to data privacy, ensuring that the Bill serves all stakeholders fairly.
Amendment 24 introduces a significant consideration regarding the use of the open electoral register for direct marketing purposes. The proposal to include data from the OER, combined with personal data from other sources, to build marketing profiles creates a range of issues that require careful consideration.
Amendment 24 stipulates that transparency obligations must be fulfilled when individuals provide additional data to a data provider, and that this transparency should be reflected both in the privacy policy and via a data notification in a direct mail pack. While there is certainly potential to use the OER to enhance marketing efforts and support economic activity, we have to remain vigilant to the privacy implications. We need to make sure that individuals are informed of how and where their OER data is being processed, especially when it is combined with other data sources to build profiles.
The requirement for transparency is a positive step, but it is essential that these obligations are fully enforced and that individuals are not left in the dark about how their personal information is being used. I hope the Minister will explain a little more about how these transparency obligations will be implemented in practice and whether additional safeguards are proposed.
Amendment 49 introduces a change to Regulation 22, creating an exception for charities to use electronic mail for direct marketing in specific circumstances. This amendment enables charities to send direct marketing emails when the sole purpose is to further one or more of their charitable purposes, provided that certain conditions are met. These conditions include that the charity obtained the recipient’s contact details when the individual expressed interest in, or previously offered support to, the charity. This provision recognises the role of charities in fundraising and that communicating with volunteers, supporters and potential donors is vital to their work.
However, I understand the argument that we must ensure that the use of email marketing does not become intrusive or exploitative. The amendment requires that recipients are clearly informed about their right to refuse future marketing communications and that this option is available both when the data is first collected and with every subsequent communication. This helps strike the right balance between enabling charities to raise funds for their causes and protecting individuals from unwanted marketing.
I welcome the Government’s commitment to ensuring that charities continue to engage with their supporters while respecting individuals’ right to privacy. However, it is essential that these safeguards are robustly enforced to prevent exploitation. Again, I look forward to hearing from the Minister on how the Government plan to ensure that their provisions will be properly implemented and monitored.
Amendment 50 introduces the concept of soft opt-ins for email marketing by charities, allowing them to connect with individuals who have previously expressed interest in their charitable causes. This can help charities maintain and grow their supporter base but, again, we must weigh it against the broader impact it could have on people in receipt of such correspondence. It is crucial that any system put in place respects individuals’ right to privacy and their ability to opt out easily. We must ensure that charities provide a clear, simple and accessible way for individuals to refuse future communications, and that this option is consistently available.
Finally, we should also consider the rules governing the use of personal data by political parties. This is, of course, an area where we must ensure that transparency, accountability and privacy are paramount. Political parties, like any other organisation, must be held to the highest standards in their handling of personal data. I hope the Government can offer some clear guidance on improving and strengthening the rules surrounding data use by political parties to ensure that individuals’ rights are fully respected and protected.
My Lords, I rise to speak to Amendments 26, 31 and 32 tabled in my name and that of my noble friend Lord Markham. I will address the amendments in reverse order.
Amendment 32 would ensure that, where a significant decision is taken by automated decision-making (ADM), the data subject is able to request intervention by a human with sufficient competency and authority. While that is clearly the existing intent of the ADM provisions in the Bill, this amendment brings further clarity. I am concerned that, where data processors update their ADM procedures in the light of this Bill, it should be abundantly clear to them at every stage what the requirements are and that, as currently written, there may be a risk of misunderstanding. Given the significance of decisions that may be made by ADM, we should make sure this does not happen. Data subjects must have recourse to a person who both understands their problem and is able to do something about it. I look forward to hearing the Minister’s views on this.
Amendment 31 would require the Secretary of State to provide guidance on how consent should be obtained for ADM involving special category data. It would also ensure that this guidance was readily available and reviewed frequently. The amendment would provide guidance for data controllers who wish to use ADM, helping them to set clear processes for obtaining consent, thus avoiding complaints and potential litigation.
We all know that litigation can be slow, disruptive and sometimes prohibitively expensive. If we want to encourage the use of ADM so that customers and businesses can save both time and money, we should seek to ensure that the sector does not become a hotbed of litigation. The risk can be mitigated by providing ample guidance for the sector. For relatively minimal effort on the part of the Secretary of State, we may be able to facilitate substantial growth in the use and benefits of ADM. I would be most curious to hear the Minister’s opinions on this matter and, indeed, the opinions of noble Lords more broadly.
Amendment 26 would insert the five principles set out in the AI White Paper published by the previous Government, requiring all data controllers and processors who partake in AI-driven ADM to have due regard for them. In the event that noble Lords are not familiar with these principles, they are: safety, security and robustness; appropriate transparency and explainability; fairness; accountability and governance; and contestability and redress.
These principles for safe AI are based on those originally developed with the OECD and have been the subject of extensive consultation. They have been refined and very positively received by developers, public sector organisations, private sector organisations and civil society. They offer real, and popular, safeguards against the risks of AI while continuing to foster innovation.
There is a requirement. Going back to the issue of principles, which was discussed earlier, one of the existing principles—which I am now trying to locate but cannot—is transparency. I expect that we would make as much of the information public as we can, in order to ensure good decision-making and to assure people as to how the decisions have been reached.
I thank all noble Lords and the Minister for their comments and contributions to what has been a fascinating debate. I will start by commenting on the other amendments in this group before turning to those in my name.
First, on Amendments 28 and 29, I am rather more comfortable with the arrangements for meaningful human intervention set out in the Bill than the noble Lord, Lord Clement-Jones. For me, either a decision has meaningful human intervention or it does not. In the latter case, certain additional rights kick in. To me, that binary model is clear and straightforward, and could only be damaged by introducing some of the more analogue concepts such as “predominantly”, “principally”, “mainly” or “wholly”, so I am perfectly comfortable with that as it is.
However, I recognise that puts a lot of weight on to the precise meaning of “meaningful human involvement”. Amendment 36 in the name of the noble Lord, Lord Clement-Jones, which would require the Secretary of State to produce a definition of “meaningful human involvement” in ADM in collaboration with the ICO, seems to take on some value in those circumstances, so I am certainly more supportive of that one.
As for Amendments 34 and 35 in the names of the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Freeman, I absolutely recognise the value and potential of efficacy; I agree it is a very valuable term. I have more faith in the rollout and use of the ATRS but on a non-statutory basis, believing, as I do, that this would allow it to continue to develop in an agile and adaptive manner. I welcome the Minister’s words on this subject, and for now I remain comfortable that the ATRS is the direction forward for that.
I turn to the amendments in my name. I thank all noble Lords and, indeed, the Minister for their comments and contributions regarding Amendments 31 and 32. I very much take the Minister’s point that definitions of consent feature elsewhere in the Bill. That reduces my concern somewhat.
However, I continue to strongly commend Amendment 26 to the House. I believe it will foster innovation while protecting data rights. It is popular with the public and with private sector stakeholders. It will bring about outcomes that we all want to see in AI safety without stifling this new and exciting technology. In the absence of an AI Bill—and possibly even in the presence of one—it is the only AI-specific legislation that will be around. It is important somehow to get those AI principles in the Bill, at least until an AI Bill comes along. With this in mind, I wish to test the opinion of the House.
My Lords, I will speak very briefly, given the hour, just to reinforce three things that I have said as the wingman to the noble Baroness, Lady Kidron, many times, sadly, in this Chamber in child safety debates. The age-appropriate design code that we worked on together and which she championed a decade ago has driven real change. So we have evidence that setting in place codes of conduct that require technology companies to think in advance about the potential harms of their technologies genuinely drives change. That is point one.
Point two is that we all know that AI is a foundational technology which is already transforming the services that our children use. So we should apply to this foundational technology the same principle that was so hard fought 10 years ago for non-AI digital services. We know that, however well meaning, technology companies’ development stacks are always contended. They always have more good things that they think they can do to improve their products for their consumers, and that will make them money, than they have the resources to deliver. However much money they have, their stacks are always contended. That is the nature of technology businesses. This means that they never get to the safety-by-design issues unless they are required to. It was no different 150 or 200 years ago as electricity was rolling through the factories of the mill towns in the north of England. It required health and safety legislation. AI requires health and safety legislation. You start with codes of conduct and then you move forward, and I really do not think that we can wait.
My Lords, Amendment 41 aims to establish a code of practice for the use of children’s data in the development of AI technologies. In the face of rapidly advancing AI, it is, of course, crucial that we ensure children’s data is handled with the utmost care, prioritising their best interests and fundamental rights. We agree that AI systems that are likely to impact children should be designed to be safe and ethical by default. This code of practice will be instrumental in guiding data controllers to ensure that AI development and deployment reflect the specific needs and vulnerabilities of children.
However, although we support the intent behind the amendment, we have concerns, which echo concerns on amendments in a previous group, about the explicit reference to the UN Convention on the Rights of the Child and general comment 25. I will not rehearse my comments from earlier groups, except to say that it is so important that we do not have these explicit links to international frameworks, important as they are, in UK legislation.
In the light of this, although we firmly support the overall aim of safeguarding children’s data in AI, we believe this can be achieved more effectively by focusing on UK legal principles and ensuring that the code of practice is rooted in our domestic context.
I thank the noble Lord, Lord Clement-Jones, for Amendment 33, and the noble Baroness, Lady Kidron, for Amendment 41, and for their thoughtful comments on AI and automated decision-making throughout this Bill’s passage.
The Government have carefully considered these issues and agree that there is a need for greater guidance. I am pleased to say that we are committing to use our powers under the Data Protection Act to require the ICO to produce a code of practice on AI and solely automated decision-making through secondary legislation. This code will support controllers in complying with their data protection obligations through practical guidance. I reiterate that the Government are committed to this work as an early priority, following the Bill receiving Royal Assent. The secondary legislation will have to be approved by both Houses of Parliament, which means it will be scrutinised by Peers and parliamentarians.
I can also reassure the noble Baroness that the code of practice will include guidance about protecting data subjects, including children. The new ICO duties set out in the Bill will ensure that where children’s interests are relevant to any activity the ICO is carrying out, it should consider the specific protection of children. This includes when preparing codes of practice, such as the one the Government are committing to in this area.
I understand that noble Lords will be keen to discuss the specific contents of the code. The ICO, as the independent data protection regulator, will have views as to the scope of the code and the topics it should cover. We should allow it time to develop those thoughts. The Government are also committed to engaging with noble Lords and other stakeholders after Royal Assent to make sure that we get this right. I hope noble Lords will agree that working closely together to prepare the secondary legislation to request this code is the right approach instead of pre-empting the exact scope.
The noble Lord, Lord Clement-Jones, mentioned edtech. I should add—I am getting into a habit now—that it is discussed in a future group.
I have added my name to this amendment, about which the noble Lord, Lord Clement-Jones, has spoken so eloquently, because of the importance to our economic growth of maintaining data adequacy with the EU. I have two points to add to what he said.
First, as I observed on several occasions in Committee, this is legislation of unbelievable complexity. It is a bad read, unless you want a cure for insomnia. Secondly, it employs the technique of amending and re-amending earlier legislation. Thirdly, this is not the time to go into the detail of the legal problems that arise, some of which we canvassed in Committee, as to whether this legislation is free of holes. I do not think I would be doing any favours either to the position of the United Kingdom or to those who have been patient enough to stay and listen to this part of the debate by going into any of those in any detail, particularly those involving the European Convention on Human Rights and the fundamental charter. That is my first point, on the inherent nature of the legislative structure that we have created. As I said earlier, I very much hope we will never have such legislation again.
Secondly, in my experience, there is a tendency among lawyers steeped in an area or department to feel, “Well, we know it’s all right; we built it. The legislation’s fine”. Therefore, there is an additional and important safeguard that I think we should adopt, which is for a fresh pair of eyes, someone outside the department or outside those who have created the legislation, to look at it again to see whether there are any holes in it. We cannot afford to go into this most important assessment of data adequacy without ensuring that our tackle is in order. I appreciate what the Minister said on the last occasion in Committee—that it is for the EU to pick holes in it—but the only prudent course when dealing with anything of this complexity in a legal dispute or potential dispute is to ensure that your own tackle is in order and not to go into a debate about something without being sure of that, allowing the other side to make all the running. We should be on top of this, and that is why I very much support this amendment.
My Lords, I thank the noble Lord, Lord Clement-Jones—as ever—and the noble and learned Lord, Lord Thomas, for tabling Amendment 37 in their names. It would introduce a new clause that would require the Secretary of State to carry out an impact assessment of this Act and other changes to the UK’s domestic and international frameworks relating to data adequacy before the European Union’s reassessment of data adequacy in June this year.
I completely understand the concerns behind tabling this amendment. In the very worst-case scenario, of a complete loss of data adequacy in the assessment by the EU, the effect on many businesses and industries in this country would be knocking at the door of catastrophic. It cannot be allowed to happen.
However, introducing a requirement to assess the impact of the Bill on the European Union’s data adequacy decision would require us to speculate on EU intentions in a public document, which runs the risk of prompting changes on its part or revealing our hand in ways that we would rather avoid. It is important that we do two things: understand our risk, without necessarily publishing it; and continue to engage at ministerial and official level, as I know we are doing intensively. I think the approach set out in this amendment runs the risk of being counterproductive.
I thank the noble Lord, Lord Clement-Jones, for his amendment, and the noble and learned Lord, Lord Thomas, for his contribution. I agree with them on the value and importance placed on maintaining our data adequacy decisions from the EU this year. That is a priority for the Government, and I reassure those here that we carefully considered all measures in the light of the EU’s review of our adequacy status when designing the Bill.
The Secretary of State wrote to the House of Lords European Affairs Committee on 20 November 2024 on this very point and I would be happy to share this letter with noble Lords if that would be helpful. The letter sets out the importance this Government place on renewal of our EU adequacy decisions and the action we are taking to support this process.
It is important to recognise that the EU undertakes its review of its decisions for the UK in a unilateral, objective and independent way. As the DSIT Secretary of State referenced in his appearance before the Select Committee on 3 December, it is important that we acknowledge the technical nature of the assessments. For that reason, we respect the EU’s discretion about how it manages its adequacy processes. I echo some of the points made by the noble Viscount, Lord Camrose.
That being said, I reassure noble Lords that the UK Government are doing all they can to support a swift renewal of our adequacy status in both technical preparations and active engagement. The Secretary of State met the previous EU Commissioner twice last year to discuss the importance of personal data sharing between the UK and EU. He has also written to the new Commissioner for Justice responsible for the EU’s review and looks forward to meeting Commissioner McGrath soon.
I also reassure noble Lords that DSIT and the Home Office have dedicated teams that have been undertaking preparations ahead of this review, working across government as needed. Those teams are supporting European Commission officials with the technical assessment as required. UK officials have met with the European Commission four times since the introduction of the Bill, with future meetings already in the pipeline.