(1 week, 6 days ago)
Grand Committee
My Lords, Amendments 66, 67 and 80 in this group are all tabled in my name. Amendment 66 requires scientific research carried out for commercial purposes to
“be subject to the approval of an independent ethics committee”.
Commercial research is, perhaps counterintuitively, generally subjected to fewer ethical safeguards than research carried out purely for scientific endeavour by educational institutions. Given the current broad definition of scientific research in the Bill—I am sorry to repeat this—which includes research for commercial purposes, and the lower bar for obtaining consent for data reuse should the research be considered scientific, I think it would be fair to require more substantial ethical safeguards on such activities.
We do not want to create a scenario where unscrupulous tech developers use the Bill to harvest significant quantities of personal data under the guise of scientific endeavour to develop their products, without having to obtain consent from data subjects, or even without their knowledge. An independent ethics committee would be an excellent way to monitor scientific research carried out as part of commercial activities, without capping data access for scientific research that aims more purely to expand the horizon of our knowledge and benefit society. Let us be clear: commercial research makes a huge and critically important contribution to scientific research, but it is surely also fair to subject it to the same safeguards and scrutiny required of non-commercial scientific research.
Amendment 67 would ensure that data controllers cannot gain consent for research purposes that cannot be defined at the time of data collection. As the Bill stands, consent will be considered obtained for the purposes of scientific research if, at the time consent is sought, it is not possible to identify fully the purposes for which the personal data is to be processed. I fully understand that there needs to be some scope to take advantage of research opportunities that are not always foreseeable at the start of studies, particularly multi-year longitudinal studies, but which emerge as such studies continue. I am concerned, however, that the current provisions are a little too broad. In other words: is consent not actually being given at the start of the process for, effectively, any future purpose?
Amendment 80 would prevent the data reuse test being automatically passed if the reuse is for scientific purposes. Again, I have tabled this amendment due to my concerns that research which is part of commercial activities could be artificially classed as scientific, and that other clauses in the Bill would therefore allow too broad a scope for data harvesting. I beg to move.
My Lords, it seems very strange indeed that Amendment 66 is in a different group from group 1, which we have already discussed. Of course, I support Amendment 66 from the noble Viscount, Lord Camrose, but in response to my suggestion for a similar ethical threshold, the Minister said she was concerned that scientific research would find this to be too bureaucratic a hurdle. She and many of us here sat through debates on the Online Safety Bill, now an Act. I was also on the Communications Committee when it looked at digital regulations and came forward with one of the original reports on this. The dynamic and impetus which drove us to worry about this was the lack of ethics within the tech companies and social media. Why on earth would we want to unleash some of the most powerful companies in the world on reusing people’s data for scientific purposes if we were not going to have an ethical threshold involved in such an Act? It is important that we consider that extremely seriously.
(7 months, 2 weeks ago)
Lords Chamber
First, let me absolutely endorse the noble Lord’s sentiment: this is a deplorable way to behave that should not be tolerated. From hearing the noble Lord speak of the actions, my assumption is that they would fall foul of the false communications offence under Section 179 of the Online Safety Act. As I say, these actions are absolutely unacceptable.
My Lords, noble Lords will be aware of the threat of AI-generated deepfake election messages flooding the internet during an election campaign. At the moment, only registered campaigners have to put a digital imprint, giving the provenance of the content, on unpaid election material. Does the Minister think that a requirement to put a digital imprint on all unpaid election material should be introduced to counter fake election messages?
The noble Viscount is right to point to the digital imprint regime as one of the tools at our disposal for limiting the use of deepfakes. I think we would hesitate to have a blanket law that all materials of any kind would be required to have a digital imprint on them—but, needless to say, we will take away the idea and consider it further.
(10 months, 3 weeks ago)
Grand Committee
I am happy to look into that as a mechanism, but, as currently set out in the Bill, the logic is that the Secretary of State can approve the guidance.
The Government will continue to work closely with the CMA, as they have throughout the drafting of the Bill, to ensure that the timely publication of guidance is not disrupted by this measure. Published guidance is required for the regime to be active, and the Government are committed to ensuring that this happens as soon as possible. Guidance will be published in good time before the regime goes live, to allow affected stakeholders to prepare. The Government hope that, subject to parliamentary time and receipt of Royal Assent, the regime will be in force for the common commencement date in October this year.
In response to my noble friend Lord Black’s question about guidance and purdah, the essential business of government can continue during purdah. The CMA’s guidance relates to the CMA’s intentions towards the operation of the regime, rather than to a highly political matter. However, the position would need to be confirmed with the propriety and ethics team in the Cabinet Office at the appropriate time, should the situation arise that we were in a pre-election period.
I thank the noble Viscount, Lord Colville, and my noble friend Lady Stowell for their amendments, and I hope that this will go some way towards reassuring them that the Government’s role in the production of guidance is proportionate and appropriate. As I said, I recognise the grave seriousness of the powerful arguments being raised, and I look forward to continuing to speak with them.
I thank noble Lords for their contributions and ask the Minister to listen to the concerns Members have expressed today. The clause gives extraordinary power to the Secretary of State, and I ask the Minister to listen to his noble friends, the noble Baronesses, Lady Stowell and Lady Harding, who called the power dangerous. In particular, the noble Baroness, Lady Harding, said that the power was so dangerous and so large that it must surely be a distraction.
The noble Lord, Lord Black, said that the concern about this power is that it would create delay, which would be a particular worry in the period around an election, both before and after. He called for draft guidance to be approved within 31 days, which is certainly something that could be considered; after all, no one wants ping-pong to go back and forth, do they? They want the CMA’s guidance to be put into action and this process to start as soon as possible.
The noble Baroness, Lady Kidron, said that the asymmetric power between the regulators and the tech companies means that there will be a drumbeat of what she called “participative arrangements”. That is quite a complex thought, but the idea behind it—that the CMA must not be stopped from using its power to deal with some of the most powerful companies in the world—is very important.
The noble Baroness, Lady Stowell, is a former regulator and called for Parliament to have a role in overseeing this. We were reminded by both the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, that we had a discussion on Secretary of State powers in the debate on the Online Safety Act, much of which was about whether a joint digital committee could oversee digital regulation. I suspect that that will be discussed in the next group. We have given enormous powers to Ofcom with the Online Safety Act, we are giving big powers to the CMA and I imagine that we are giving big powers to the ICO in the Data Protection Act, so Parliament should have a powerful standing role in dealing with that.
The Minister called for robust oversight of the CMA and said that it must be accountable before Parliament; Parliament already looks at its reviews and annual reports. I come back to the concern that the Secretary of State still has powers that are far too great over the implementation of this guidance, and that the CMA’s independence will be impinged on. I repeat the concern that I and other noble Lords raised about Clause 114: it stands to reduce the CMA’s independence. I ask the Minister to consider very seriously what we have been saying.
The Minister’s suggestion that he will look at the affirmative resolution for Secretary of State approval of guidance is something that we should certainly push further—at least that is some step towards reducing Secretary of State powers. With that, I beg leave to withdraw my amendment.
(10 months, 4 weeks ago)
Grand Committee
I will revert first to the questions about the word “indispensable”. As I have said, the Government consulted very widely, and one of the findings of the consultation was that, for a variety of stakeholders, the word “indispensable” reduced the clarity of the legislation.
Before my noble friend answers that, can he shed some light on which stakeholders feel that this is unclear?
(1 year ago)
Lords Chamber
I do not think “perverse” is justified. GDPR Article 22 addresses automated individual decision-making, but, as I am sure the noble Lord knows, the DPDI Bill recasts Article 22 as the right to specific safeguards rather than a general prohibition on automated decision-making, so that subjects have to be informed about it and can seek a human review of decisions. It also defines meaningful human involvement.
When I asked the Minister in October why deepfakes could not be banned, he replied that he could not see a pathway to do so, as they could be developed anywhere in the world. Under the Online Safety Act, tech companies all over the world are now required not to disseminate content that is harmful to children. Why can the harms of deepfakes not be similarly proscribed?
I remember the question. It is indeed very important. There are two pieces to preventing deepfakes being presented to British users: one is where they are created and the second is how they are presented to those users. They are created to a great extent overseas, and we can do very little about that. As the noble Viscount said, the Online Safety Act creates a great many barriers to the dissemination and presentation of deepfakes to a British audience.
(1 year, 1 month ago)
Lords Chamber
I do not believe that anyone anywhere is advocating unregulated AI. The voluntary agreement is, of course, a United States agreement secured with the White House. We welcome it, although it would need to be codified to make it more than voluntary; that will be discussed as part of the summit next week.
My Lords, I would like to pick up on the point made by the noble Lord, Lord Clement-Jones, because Professor Russell also said that he would like to ban certain types of AI deepfakes. With elections looming in this country, can the Minister tell the House whether he thinks AI developers should be banned from creating software that allows the impersonation of people, particularly high-profile politicians?