(2 months ago)
Lords Chamber
It is important that the convention does not introduce new human rights. Instead, it is meant to make sure that, during its development, AI takes into account the existing rules and regulations and shows appropriate respect for the democracy and freedoms that are already enshrined in law and observed in practice. I agree that this can be done in a way that does not mean new red tape.
My Lords, it is important to note the remarks of the Prime Minister, and indeed his Ministers, at the investment conference yesterday. When talking about artificial intelligence, they encouraged entrepreneurs in particular to have as little limitation on the development of AI as possible. Bearing in mind the position of the United States, which has a very free approach, and the European Union, which now has strict regulation, is the Minister confident that this Government will be putting in place the right balance in regulating AI?
The convention has been signed by the US as well as the EU, the UK and various other nations. On the point about red tape, it is very important that, as we think about AI, we do not introduce measures which restrict innovation. At the investment summit yesterday, Eric Schmidt said very clearly that some guidelines are rather important; otherwise, companies do not have certainty and cannot progress. Getting that balance—getting some guidelines without restrictions—will be our clear priority.
(7 months, 1 week ago)
Lords Chamber
One of the principles we set out in our AI White Paper is transparency. That principle—repeated across the OECD and in the EU’s AI Act—will go a long way towards doing what the noble Baroness asks. There are, though, a number of technical difficulties in implementing transparency—not legally, from our side, but rather, the computer science problems associated with processing the very large quantities of data required to generate true transparency.
My Lords, a lot of people are excited by the prospects for AI. Indeed, this country is in the lead in developing such policies and the associated opportunities. As one of those involved in preparing the GDPR in Brussels, I am concerned that the opportunities and excitement associated with the use of AI must be balanced against the protection of individual privacy and the rights of corporate structures and individuals who are worried about the abuses that might occur unless legislators are up to date and moving fast enough to deal with these matters.
My noble friend makes some important points: AI must advance on the back of well-executed data protection. Let me take the opportunity to thank him for his outstanding contributions during the recently completed Committee stage of the Data Protection and Digital Information Bill. We continue to share the goal that he set out.
(7 months, 1 week ago)
Lords Chamber
I start by saying that I very much share the view of the importance of protecting the forthcoming general election—and indeed every election—from online deepfakes, whether generated by AI or any other means. I think it is worth reminding the House that a range of existing criminal offences, such as the foreign interference offence, the false communications offence and offences under the Representation of the People Act, already address the use of deepfakes to malignly influence elections. While this legislation will go some way towards deterrence, I also think it is important to remind the House of the crucial non-legislative measures that we are taking and will continue to take up to the completion of the election.
My Lords, would my noble friend not agree that there is an issue regarding the distortion of what politicians say, both through video and through the written word? Would he give me some indication of what the position is regarding Hansard and the coverage of what is said in this House and in the other place? Are we sufficiently protected if that written record is distorted or abused by others in the media?
Indeed—and let me first thank my noble friend for bringing up this important matter. That sounds to me like something that would be likely to fall under the false communications offence in the Online Safety Act—Section 179—although I would not be able to say for sure. The tests it would need to meet are that the information must be knowingly false and cause non-trivial physical or psychological harm to its likely audience, but that would seem to be the relevant offence.
(8 months ago)
Grand Committee
Once more unto the breach, my Lords—as opposed to “my friends”.
I will also speak to Amendments 112 to 114, 116 and 130. New Article 45B(2) lists conditions that the Secretary of State must consider when deciding whether a third country provides an adequate level of protection for data subjects. It replaces the existing conditions in Article 45(2)(a) to (c) of the UK GDPR, removing important considerations such as the impact of a third country’s laws and practices in relation to national security, defence, public security, criminal law and public authority access to personal data on the level of protection provided to UK data subjects.
Despite this shorter list of conditions to consider, the Secretary of State is none the less required to be satisfied that a third country provides a level of protection that is not materially lower than the UK’s. It is plain that such an assessment cannot be made without considering the impact of these factors on the level of protection for UK data in a third country. It is therefore unclear why the amendment that the Government have made to Article 45 is necessary, beyond a desire for the Government to draw attention away from such contentious and complicated issues.
It may be that, in rewriting Article 45 of the UK GDPR, the Government intend that assimilated case law on international data transfers should no longer be relevant. If so, that would pose a substantial risk to UK data adequacy. Importantly, new Article 45B(2) removes the reference to the need for an independent data protection regulator in the relevant jurisdiction. This, sadly, is consistent with the theme of diminishing the independence of the ICO, which is one of the major concerns in relation to the Bill, and it is also an area where the European Commission has expressed concern. The independence of the regulator is a key part of the EU data adequacy regime and is explicitly referenced in Article 8 of the Charter of Fundamental Rights, which guarantees the right to protection of personal data. Amendment 111 restores the original considerations that the Secretary of State must take into account.
Amendments 112 and 113 would remove the powers proposed for the Secretary of State in Schedules 5 and 6 to assess other countries’ suitability for international transfers of data, and confer them instead on the new information commission. In the specific context of HIV—the provenance of these amendments is in the National AIDS Trust’s suggestions—it is unlikely that the Secretary of State or their departmental officials will have the specialist knowledge to assess whether there is a risk of harm to an individual by transferring data related to their HIV status to a third country. Given that the activities of government departments are political by their nature, the Secretary of State making these decisions related to the suitability of transfer to third countries may not be viewed as objective by individuals whose personal data is transferred. Many people living with HIV feel comfortable reporting breaches of data protection law in relation to their HIV status to the Information Commissioner’s Office due to its position as an independent regulator, so the National AIDS Trust and others recommend that the Bill confers these regulatory powers on the new information commission created by the Bill instead, as this may inspire greater public confidence.
As regards Amendment 114, paragraph 5 of Schedule 5 should contain additional provisions to mandate annual review of the data protection test for each third country to which data is transferred internationally to ensure that the data protection regime in that third country is secure and that people’s personal data, such as their HIV status, will not be shared inappropriately. HIV is criminalised in many countries around the world, and the transfer to these countries of personal data such as an individual’s HIV status could put an individual living with HIV, their partner or their family members at real risk of harm. This is because HIV stigma is incredibly pronounced in many countries, which fosters a real risk of HIV-related violence. Amendment 114 would mandate this annual review.
As regards Amendment 116, new Article 47A(4) to (7) gives the Secretary of State a broad regulation-making power to designate new transfer mechanisms for personal data being sent to a third country in the absence of adequacy regulations. Controllers would be able to rely on these new mechanisms, alongside the existing mechanisms in Article 46 of the UK GDPR, to transfer data abroad. In order to designate new mechanisms, which could be based on mechanisms used in other jurisdictions, the Secretary of State must be satisfied that these are
“capable of securing that the data protection test set out in Article 46 is met”.
The Secretary of State must be satisfied that the transfer mechanism is capable of providing a level of protection for data subjects that is not materially lower than under the UK GDPR and the Data Protection Act. The Government have described this new regulation-making power as a way to future-proof the UK’s GDPR international transfers regime, but they have not been able to point to any transfer mechanisms in other countries that might be suitable to be recognised in UK law, and nor have they set out examples of how new transfer mechanisms might be created.
In addition to not having a clear rationale to take the power, it is not clear how the Secretary of State could be satisfied that a new mechanism is capable of providing the appropriate level of protection for data subjects. This test is meant to be a lower standard than the test for controllers seeking to rely on a transfer mechanism to transfer overseas, which requires them to consider that the mechanism provides the appropriate level of protection. It is not clear to us how the Secretary of State could be satisfied of a mechanism’s capability without having a clear sense of how it would be used by controllers in reality. That is the reason for Amendment 116.
As regards Amendment 130, Ministers have continued all the adequacy decisions that the EU had made in respect of third countries when the UK stopped being subject to EU treaties. The UK also conferred data adequacy on the EEA, but all this was done on a transitional basis. The Bill now seeks to continue those adequacy decisions, but no analysis appears to have been carried out as to whether these jurisdictions confer an adequate level of protection of personal data. This is not consistent with Section 17B(1) of the DPA 2018, which states that the Secretary of State must carry out a review of whether the relevant country that has been granted data adequacy continues to ensure an adequate level of protection, and that these reviews must be carried out at intervals of not more than four years.
In the EU, litigants have twice brought successful challenges against adequacy decisions. Those decisions were deemed unlawful and quashed by the European Court of Justice. It appears that this sort of challenge would not be possible in the UK because the adequacy decisions are being continued by the Bill and therefore through primary legislation. Any challenge to these adequacy decisions could result only in a declaration of incompatibility under the Human Rights Act; the decisions could not be quashed by the UK courts. This is another example of how leaving the EU has diminished the rights of UK citizens compared with their EU counterparts.
As well as tabling those amendments, I support and have signed Amendment 115 in the names of the noble Lords, Lord Bethell and Lord Kirkhope, and I look forward to hearing their arguments in relation to it. In the meantime, I beg to move.
My Lords, I rise with some temerity. This is the first time I have risen to speak in this Committee. I have popped in before and have been following it very carefully. The work going on here is enormously important.
I am speaking to Amendment 115, thanks to the indulgence of my noble friend Lord Bethell, who is the lead name on that amendment but has kindly suggested that I start the discussions. I also thank the noble Lord, Lord Clement-Jones, for his support. Amendment 115 has one clear objective and that is to prevent transfer of UK user data to jurisdictions where data rights cannot be enforced and there is no credible right of redress. The word “credible” is important in this amendment.
I thank my noble friend the Minister for his letter of 11 April, which he sent to us to try to mop up a number of issues. In particular, in one paragraph he referred to the question of adequacy, which may also touch on what the noble Lord, Lord Clement-Jones, has just said. The Secretary of State’s powers are also referred to, but I must ask: how, in a fast-moving or unique situation, can all the factors referred to in this long and comprehensive paragraph be considered?
The mechanisms of government and government departments must be thorough and in place to satisfactorily discharge what are, I think, somewhat grand intentions. I say that from a personal point of view, because I was one of those who drafted the European GDPR—another reason I am interested in discussing these matters today—and I was responsible for the adequacy decisions with third countries. The word “adequacy” matters very much in this group, in the same way that we were unable to use “adequacy” when we dealt with the United States and had to look at “equivalence”. Adequacy can work only if one is working to similar parameters. If one is constitutionally working to different parameters, as is the case in the United States, then the word “equivalence” becomes much more relevant: although administration and regulation cannot be carried out in quite the same way, an equivalent level of protection can be acceptable and can lead to the understanding of adequacy that we are looking for when others are involved.
I have a marvellous note here, which I am sure noble Lords have already talked about. It says that every year we now generate some 181 zettabytes of personal data. I am sure noble Lords are all aware of zettabytes, but I will clarify. One zettabyte is 1,000 exabytes—which perhaps makes it simpler to understand—or, if you like, 1 billion trillion bytes. One’s mind just has to get around this, but this is data on our movements, finances, health and families, from our cameras, phones, doorbells and, I am afraid, even from our refrigerators—though Lady Kirkhope refuses point blank to have any kind of detector on her fridge door that will tell anybody anything about us or what we eat. Increasingly, it is also data from our cars. Our every moment is recorded—information relating to everything from shopping preferences to personal fitness to our anxieties, even, as they are displayed or discussed. It is stored by companies that we entrust with that data and we have a right to expect that such sensitive and private data will be protected. Indeed, one of the core principles of data protection, as we all know, is accountability.
Article 79 of the UK GDPR and Section 167 of our Data Protection Act 2018 provide that UK users must have the right to effective judicial remedy in the event of a data protection breach. Article 79 says that
“each data subject shall have the right to an effective judicial remedy where he or she considers that his or her rights under this Regulation have been infringed as a result of the processing of his or her personal data in non-compliance with this Regulation”.
A number of important points were raised there. Yes, of course I will share—
I am sorry to interrupt my noble friend, but the point I made—this now follows on from other remarks—was that these requirements have been in place for a long time, and we are seeing abuses. Therefore, I was hoping that my noble friend would be able to offer changes in the Bill that would put more emphasis on dealing with these breaches. Otherwise, as has been said, we look as though we are going backwards, not forwards.
(8 months, 3 weeks ago)
Lords Chamber
My Lords, as has been illustrated this morning, we stand on the cusp of a technological revolution. We find ourselves at the crossroads between innovation and responsibility. Artificial intelligence, a marvel of modern science, promises to reshape the world. Yet with great power comes great responsibility, and it is therefore imperative that we approach this with caution. Regulation in the realm of AI is not an adversary to innovation; rather, it is the very framework within which responsible and sustainable innovation must occur. Our goal should not be to stifle the creative spirit but to channel it, ensuring that it serves the common good while safeguarding our societal values and ethical standards.
However, we must not do this in isolation. In the digital domain, where boundaries blur, international collaboration becomes not just beneficial but essential. The challenges and opportunities presented by AI do not recognise national borders, and our responses too must be global in perspective. The quest for balance in regulation must be undertaken with a keen eye on international agreements, ensuring that the UK remains in step with the global community, not at odds with it. In our pursuit of this regulatory framework suitable for the UK, we must consider others. The European Union’s AI Act offers valuable insights and, by examining what works within the EU’s and other approaches, as well as identifying areas for improvement, we can learn from the experiences of our neighbours to forge a path that is distinctly British, yet globally resonant.
Accountability stands as a cornerstone in the responsible deployment of AI technologies. Every algorithm and every application that is released into the world must have a clearly identifiable human or corporate entity behind it. This is where the regulatory approach must differ from that inherent in the General Data Protection Regulation, which I had the pleasure of helping to formulate in Brussels. This accountability is crucial for ethical, legal and social reasons, ensuring that there is always a recourse and a responsible party when AI systems interact with our world.
Yet, as we delve into the mechanics of regulation and oversight, we must also pause to reflect on the quintessentially human aspect of our existence that AI can never replicate: emotion. The depth and complexity of emotions that define our humanity remain beyond the realm of AI and always will. These elements, intrinsic to our being, highlight the irreplaceable value of the human touch. While AI can augment, it can never replace human experience. The challenge before us is to foster an environment where innovation thrives within a framework of ethical and responsible governance. We must be vigilant not to become global enforcers of compliance at the expense of being pioneers of innovation.
The journey we embark on with the regulation of AI is not one that ends with the enactment of laws; that is merely the beginning. The dynamic nature of AI demands that our regulatory frameworks be agile and capable of adapting to rapid advancements and unforeseen challenges. So, as I have suggested on a number of occasions, we need smart legislation—a third tier of legislation beyond the present primary and secondary structures—to keep up with these things.
In the dynamic landscape of AI, the concept of sandboxes stands out as a forward-thinking approach to innovation. Sandboxes, which my noble friend referred to in introducing his Bill, offer a controlled environment where new technologies can be tested and refined without the immediate pressures and risks associated with full-scale deployment.
I emphasise that support for small and medium-sized enterprises in navigating the regulatory landscape is of paramount importance. These entities, often the cradles of innovation, must be equipped with the tools and knowledge to flourish within the bounds of regulation. The personnel in our regulatory authorities must also be of the highest calibre—individuals who not only comprehend the technicalities of AI but appreciate its broader implications for society and the economy.
At this threshold of a new era shaped by AI, we should proceed with caution but also with optimism. Let us never lose sight of the fact that at the heart of all technological advancement lies the indomitable spirit of human actions and emotions, which no machine or electronic device can create alone. I warmly welcome my noble friend Lord Holmes’s Bill, which I will fully support throughout its passage through this House.
(10 months ago)
Lords Chamber
NSOIT is indeed scrutinised by Ministers; it sits within DSIT, and Ministers, as we see, come before this House to explain matters. Because it is a national security team, I dare say that we would have some concerns about a standing report to Parliament on its activities, but I can continue to reassure the House on its role.
My Lords, can my noble friend the Minister explain how this very interesting unit is comprised? Who are the members of the unit and from where do they come?
The unit comprises civil servants who sit within DSIT, and it occasionally makes use of external consulting services. It adjusts its size and membership from within the DSIT team according to the nature of the threat at any given moment.
(11 months, 4 weeks ago)
Lords Chamber
My Lords, at this late stage in any debate much of the field is likely to have been covered, but, as someone deeply involved in the crafting, drafting and evolution of the EU GDPR while an MEP in Brussels, I declare a strong vested interest in this subject. I hope that the Minister will not be too negative about the work that we did—much of it was done by Brits in Europe—on producing the GDPR in the first place.
I raised this issue at the recent UK-EU Parliamentary Partnership Assembly and in bilateral discussions with the European Parliament’s civil liberties committee, on which I served for many years, on its recent visit to London. Let me be candid: while the GDPR stands as a significant achievement, it is not without need for enhancement or improvement. The world has undergone a seismic shift since the GDPR’s inception, particularly in the realm of artificial intelligence. Both the UK and the EU need to get better at developing smart legislation. Smart legislation is not only adaptive and forward-looking; it is also flexible enough to evolve alongside emerging trends and challenges.
The importance of such legislation is highlighted by the rapid advancement in various sectors, and particularly in areas such as artificial intelligence—as so well referred to by my noble friend Lord Holmes of Richmond—and how our data is used. These fields are evolving at a pace that traditional legislative processes struggle to match. Such an approach is vital, not only to foster innovation but to ensure that regulations remain relevant and effective in a swiftly changing world, helping to maintain our competitive edge while upholding our core values and standards.
The aspirations of this Bill, which is aimed at modernising and streamlining the UK’s data protection framework while upholding stringent standards, are indeed laudable. I regret that, when my noble friend Lord Kamall was speaking about cookies, I was temporarily out of the Chamber enjoying a culinary cookie for lunch. While there may be further advantages to be unearthed in the depths of this complex legislation, so far, the biggest benefit I have seen is its commitment to removing cookie pop-ups. Above all, we must tread carefully to ensure international compliance, which has been referred to by a number of noble Lords, and steadfastly adhere to the bedrock GDPR principles of lawfulness, fairness, transparency, purpose limitation, data minimisation, accuracy, storage limitation and citizens’ redress.
On a procedural note, following other noble Lords, the Government’s recent flurry of amendments—I think there were 266 in total, including 38 new clauses and two new schedules, a staggering 240 of which were introduced at the 11th hour—places a key duty on our House to meticulously scrutinise the new legislation line by line. I have heard other speakers refer to my friend, the right honourable Member for Haltemprice and Howden, in the other place, who astutely observed that that House has
“in effect delegated large parts of the work on this important Bill to the House of Lords”.—[Official Report, Commons, 29/11/23; col. 888.]
I have to say that that is wonderful because, for those of us who are always arguing that this is the House that does the work, that is an acknowledgement of its skills and powers. It is a most welcome reference.
I wish to draw the House’s attention briefly to three important terms: adequacy, which noble Lords have heard about, equivalence and approximation. Adequacy in data protection primarily comes from the EU’s legal framework. It describes the standard that non-EU countries must meet to allow free flow of personal data from the EU. The European Commission assesses this adequacy, considering domestic laws and international commitments. The UK currently benefits from the EU’s two data adequacy decisions, which, I remind the House, are unilateral. However, we stand on the cusp of a crucial review in 2024, when the Commission will decide the fate of extending data adequacy for another four years and it has the power to withdraw its decision in the meantime if we threaten the basis for it. This Bill must not increase the risk of that happening.
Equivalence in the realm of data protection signifies that different systems or standards, while not mirror images, offer comparable levels of protection. It is about viewing a non-EU country’s data protection laws through a lens that recognises their parity with GDPR in safeguarding personal data. Past EU adequacy decisions have not demanded a carbon copy of laws; rather, they seek an essentially equivalent regulatory landscape.
Approximation refers to aligning the laws of EU member states with each other. In data protection, it could describe efforts to align national laws with GDPR standards. The imperative of maintaining data adequacy with the EU cannot be overstated; in fact, it has been stated by many noble Lords today. It stands as a top priority for UK business and industry, a linchpin in law enforcement co-operation, and a gateway to other vital databases. The economic stakes are monumental for both sides: EU personal data-enabled services exports to the UK were worth approximately £42 billion in 2018, and exports from the UK to the EU were worth £85 billion.
I commend the Government for listening to concerns that I and others have raised about democratic oversight and the independence of the Information Commissioner’s Office. The amendment to Clause 35, removing the proposal for the Secretary of State to veto ICO codes of practice, was welcome. This move has, I am informed, sent reassuring signals to our friends in Brussels. However, a concern still remains regarding the UK’s new ambition for adequacy partnerships with third countries. The Government’s impact assessment lists the United States, Australia, the Republic of Korea, the Dubai International Financial Centre, Singapore and Colombia, with future agreements with India, Brazil, Kenya and Indonesia listed as priorities.
Some of these nations have data standards that may not align with those of the EU or in fact offer fewer safeguards than our current system. I urge extreme caution in this area. We do not want to be in the situation where we gain a data partnership with Kenya but jeopardise our total data adequacy with the EU. Fundamentally, this Bill should not weaken data protection rights and safeguards. It should ensure transparency in data use and decision-making, uphold requirements for data processors to consider the rights and interests of affected individuals and, importantly, not stray too far from international regulations.
I urge my noble friend the Minister and others to see the importance of adopting a policy of permanent dynamic alignment with the EU GDPR: engaging actively with the EU as a partner, not just implementing new rules blindly. Protecting and strengthening the UK-EU data partnership offers an opportunity for closer co-operation, benefiting businesses, consumers, innovation and law enforcement; and together, we can reach out to others to encourage them to join these truly international standards.
(1 year, 1 month ago)
Lords Chamber
To ask His Majesty’s Government what assessment they have made of existing regulations and practices in relation to artificial intelligence, and what plans they have to monitor and control artificial intelligence (1) in the UK, and (2) in cooperation with international partners.
The AI Regulation White Paper set out our proposed framework for governing AI, including plans to establish a monitoring and evaluation process to track performance. This will complement the central AI risk function which we have established to identify, measure and mitigate risks. We work closely with international partners through the G7, the GPAI and the Council of Europe to understand AI risks, and are leading the way by convening the AI Safety Summit in November.
My Lords, I welcome the Government hosting the AI summit at Bletchley Park, which is an opportunity to define the guard-rails on the use and misuse of AI with international partners. AI is borderless, as we know, so co-operation with others such as the USA, China and the EU is vital. Given the advances in draft legislation on AI by our neighbours in the EU, what plans do the Government have to continue the co-operation and dialogue with these other interests to give our thriving UK AI businesses certainty in their ability to sell and trade into all jurisdictions?
My noble friend is absolutely right to highlight the essential need for interoperability of AI given the way that AI is produced across so many jurisdictions. In addition to the global safety summit next week, we continue our very deep engagement with a huge range of multilateral groups. These include the OECD, the Council of Europe, the GPAI, the UN, various standards development groups, the G20 and the G7, along with a range of bilateral groups, including—just signed this year—the Atlantic declaration with the US and the Hiroshima accord with Japan.
(1 year, 5 months ago)
Lords Chamber
I thank the noble Lord for that question. The starting point for the AI White Paper—and I do not accept its characterisation as tentative—was, first, not to duplicate existing regulators’ work; secondly, not to go after specific technologies, because the technology space is changing so quickly; and, thirdly, to remain agile and adaptive. We are seeing the benefits of being agile and adapting to a very rapidly shifting landscape.
My Lords, I congratulate my noble friend the Minister and the Government on getting involved in international negotiations and discussions in this area. However, is this not an area where we must remember that we have nothing to fear but fear itself, and where, if we are not careful, we will lose out through overregulation that prevents us using AI to the fullest extent for positive, excellent reasons on behalf of the people of this country?
My noble friend is absolutely right that the potential benefits of AI are extremely great, but so too are the risks. One of the functions of our recently announced Foundation Model Taskforce will be to scan the horizon on both sides of this—for the risks, which are considerable, and for the benefits, which are considerable too.