Viscount Colville of Culross: debates involving the Department for Business and Trade


Data (Use and Access) Bill [HL]

Viscount Colville of Culross Excerpts
Finally, at the recent All-Party Parliamentary Group for Writers reception, we heard a moving speech by the author Joanne Harris, who made perhaps the most important point. She said that to a lot of the public, as soon as you utter the words “artificial intelligence”, people still think it is science fiction. It is not science fiction. As Joanne Harris and others have pointed out, it is happening now and happening in a big way. The Government need to deal with these concerns both urgently and effectively.
Viscount Colville of Culross (CB)

My Lords, I have been very impressed by the speeches of my noble friends Lady Kidron and Lord Freyberg, so I will be very brief. I declare an interest as a television producer who produces content. I hope that it has not been scraped up by AI machines, but who knows? I support the amendments in this group.

I know that AI is going to solve many problems in our economy and our society. However, in their chase for the holy grail of promoting AI, I join other noble Lords in asking the Government not to push our creative economy under the bus. It is largely made up of SMEs and single content producers, who do not have the money to pursue powerful AI companies to get paid for the use of their content in training their AI models. It is up to noble Lords to help shape regulations that protect our data and copyright laws and can be fully deployed in the defence of the creative economy.

I too have read the Government’s Copyright and Artificial Intelligence consultation paper, published yesterday. The foreword says:

“The proposals include a mechanism for rights holders to reserve their rights”,


which I, like my noble friend Lady Kidron and others, interpret as meaning that creators’ works can be used by AI developers unless they opt out and require licensing for the use of their work. The Government are following the EU example and going for the opt-out model. I think that the European Union is beginning to realise that it is very difficult to make that work, and it brings an unfairness to content producers. Surely, the presumption should be that AI web crawlers should get agreement before using content. The real problem is that content producers do not even know when their content has been used. Even the AI companies sometimes do not know what content has been used. Surely, the opt-out measure is like having your house raided and then asking the burglar what he has taken.

I call on the Minister to work with us to create an opt-in regime. Creators’ works should be used only when already licensed by the AI companies. The companies say they usually do not use content, only data points. Surely that is like saying to a photographer, “We’ve used 99% of the pixels in a picture but not the whole picture”. If even one pixel is used, the photographer needs to know and be compensated.

The small companies and single content producers of our country are the backbone of our economy, as other noble Lords have said. They are threatened by this technology, in which we have placed so much faith. I ask the Minister to respond favourably to Amendments 204, 205 and 206 to ensure that we have fairness between some of the biggest AI players in the world and the hard-pressed people who create content.

Lord Hampton (CB)

My Lords, I support Amendments 204, 205 and 206 in the names of my noble friends Lady Kidron and Lord Freyberg, and of the noble Lords, Lord Stevenson and Lord Clement-Jones, in what is rapidly becoming the Cross-Bench creative club.

I spent 25 years as a professional photographer in London from the late 1980s. When I started, retouchers would retouch negatives and slides by hand, charging £500 an hour. Photoshop stopped that. Professional film labs such as Joe’s Basement and Metro would work 24 hours a day. Snappy Snaps and similar catered for the amateur market. Digital cameras stopped that. Many companies provided art prints, laminating and sundry items for professional portfolios. PDFs and websites stopped that. Many different forms of photography, particularly travel photography, were taken away when picture libraries cornered the market and drove down commissions to unsustainable levels. There were hundreds if not thousands of professional photographers in the country. The smartphone has virtually stopped that.

All these changes were evolution and the result of a world becoming more digitised, but AI web crawlers are different, illegally scraping images without consent or payment, then potentially killing the trade of the victim by setting up in competition. This is a parasite, but not in the true sense, because a parasite is careful to keep its victims alive.

Moved by
92: Clause 77, page 91, line 5, leave out “the number of data subjects,”
Member’s explanatory statement
This amendment reduces the likelihood of misuse of Clause 77 by AI model developers, who may otherwise seek to claim they do not need to notify data subjects of reuse for scientific purposes under Clause 77 because of the way that personal data is typically collected and processed for AI development, for example by scraping large amounts of personal data from the internet.
Viscount Colville of Culross (CB)

My Lords, I have tabled Amendments 92, 93, 101 and 105, and I thank the noble Lord, Lord Clement-Jones, for adding his name to them. I also support Amendment 137 in the name of my noble friend Lady Kidron.

Clause 77 grants an exemption to the Article 13 and 14 rights of data subjects to be told within a set timeframe that their data will be reused for scientific research, if it would be impossible or involve disproportionate effort to do so. These amendments complement those I proposed to Clause 67. They aim to ensure that “scientific research” is limited in its definition and that the large language AI developers cannot claim both that they are doing scientific research and that the GDPR requirements involve too much effort for them to contact data subjects before reusing their data.

It costs AI developers time and money to identify data subjects, so this exemption is obviously very valuable to them and they will use it if possible. They will claim that processing and notifying data subjects from such a huge collection of data is a disproportionate effort, as it is hard to extract the identity of data subjects from the original AI model.

Up to 5 million data subjects could be involved in reusing data to train a large language model. However, the ICO requires data controllers to inform subjects that their data could be reused even if it involves contacting 5 million data subjects. The criteria set out in proposed new subsection (6) in Clause 77 play straight into the hands of ruthless AI companies that want to take advantage of this exemption.

Amendments 92 and 101 would ensure that the disproportionate effort excuse is not used if the number of data subjects is mentioned as a reason for deploying the excuse. Amendments 93 and 105 would clarify the practices and facts that would not qualify for the disproportionate effort exemption—namely,

“the fact the personal data was not collected from the data subject, or any processing undertaken by the controller that makes the effort involved greater”.

Without this wording, the Bill will mean that the data controller, when wanting to reuse data for training another large language model, could process the personal data on the original model and then reuse it without asking permission from the original subjects. The AI developer could say, “I don’t have the original details of the data subject, as they were deleted when the original model was trained. There was no identification of the original data subjects; only the data weights”. I fear that many companies will use this excuse to get around GDPR notification expectations.

Noble Lords should recognise that these provisions affect only AI developers seeking to reuse data under the scientific research provisions. These will mainly be the very large AI developers, which tend to use scraped data to train their general purpose models. Controllers will still be able to use personal data to train AI systems when they have lawful grounds to do so—they either have the consent of the data subject or there is a legitimate interest—but I want to make it clear that these provisions will not inhibit the legitimate training of AI models.

These amendments would ensure that organisations, especially large language AI developers, are not able to reuse data at scale, in contradiction to the expectations and intentions of data subjects. Failure to get this right will risk setting off a public backlash against the use of personal data for AI use, which would impede this Government’s aims of making this country an AI superpower. I beg to move.

--- Later in debate ---
Baroness Jones of Whitchurch (Lab)

Discussions with the ICO are taking place at the moment about the scope and intention of a number of issues around AI, and this issue would be included in that. However, I cannot say at the moment that that intention is specifically spelled out in the way that the noble Baroness is asking.

Viscount Colville of Culross (CB)

This has been a wide-ranging debate, with important contributions from across the Committee. I take some comfort from the Minister’s declaration that the exemptions will not be used for web crawling, but I want to make sure that they are not used at the expense of the privacy and control of personal data belonging to the people of Britain.

That seems particularly so for Amendment 137 in the name of the noble Baroness, Lady Kidron. I was particularly taken by her pointing out that children’s data privacy had not been taken into account when it came to AI, reinforced by the noble Baroness, Lady Harding, telling us about the importance of the Bill. She said it was paramount to protect children in the digital age and reminded us that this is the biggest breakthrough of our lifetime and that children need protecting from it. I hope very much that there will be some successful meetings, and maybe a government amendment on Report, responding to these passionate and heartfelt demands. I sincerely hope the Minister will meet us all and other noble Lords to discuss these matters of data privacy further. On that basis, I beg leave to withdraw my amendment.

Amendment 92 withdrawn.
--- Later in debate ---
The amendment could lead to the introduction of measures to ensure private sector assessment and monitoring of impacts on work, people and fundamental rights, which would conform to the framework convention. If this does not do what the Government intend as regards adoption in the UK of that framework convention, I very much hope that the Government can give us more information about that at this time. I beg to move.
Viscount Colville of Culross (CB)

My Lords, Amendment 119 is in my name, and I thank the noble Lord, Lord Knight, for adding his name to it. I am pleased to add my name to Amendment 115A in the name of the noble Viscount, Lord Camrose.

Transparency is key to ensuring that the rollout of ADM brings the public and, most importantly, public trust with it. I give the Committee an example of how a lack of transparency can erode that trust. The DWP is using a machine learning model to analyse all applications for a loan, paid as an advance on a benefit to cover bills and other costs while a recipient waits for their first universal credit payment. The DWP’s own analysis of the model found disparities, across all of the protected characteristics analysed, including age, marital status and disability, in who was most likely to be incorrectly referred by the model.

It is difficult to assess whether the model is discriminatory, effective or even lawful. When the DWP rolled it out, it was unable to reassure the Comptroller and Auditor-General that its anti-fraud models treated all customer groups fairly. The rollout continues despite these concerns. The DWP maintains that the analysis does not present

“any immediate concerns of discrimination, unfair treatment or detrimental impact on customers”.

However, because so little information is available about the model, this claim cannot be independently verified to provide the public with confidence. Civil rights organisations, including the Public Law Project, are currently working on a potential claim against the DWP, including in relation to this model, on the basis that they consider it may be unlawful.

The Government’s commitment to rolling out ADM was accompanied by a statement in the other place in November by the AI Minister, Feryal Clark, that the mandatory requirement to use the ATRS was seen as a significant acceleration towards adopting the standard. In response to a Written Question, the Secretary of State confirmed that, as part of the rollout of ADM phase 1 to the 16 largest ministerial departments plus HMRC, there is a deadline for them to publish their first ATRS records by the end of July 2024. Despite the Government’s statement, only eight ATRS reports have been published on the hub. The Public Law Project’s TAG project has discovered at least 74 areas in which ADM is being used, and those are only the ones that it has been able to uncover through freedom of information requests and tip-offs from affected people. There is clearly a shortfall in the implementation and rollout of the ATRS across government departments.

Moved by
59: Clause 67, page 75, line 9, after “processing” insert “solely”.
Member’s explanatory statement
This amendment prevents misuse of the scientific research exceptions for data reuse by ensuring that the only purpose for which the reuse is permissible is for the scientific research—with no additional purposes.
Viscount Colville of Culross (CB)

I have tabled Amendments 59, 62, 63 and 65, and I thank the noble Lord, Lord Clement-Jones, my noble friend Lady Kidron and the noble Viscount, Lord Camrose, for adding their names to them. I am sure that the Committee will agree that these amendments have some pretty heavyweight support. I also support Amendment 64, in the name of the noble Lord, Lord Clement-Jones, which is an alternative to my Amendment 63. Amendments 68 and 69 in this group also warrant attention.

I very much support the Government’s aim in Clause 67 to ensure that valuable research does not get discarded due to a lack of clarity around its use or because of an overly narrow distinction between the original and new purposes of the use of the data. The Government’s position is that this clause clarifies the law by incorporating into the Bill recitals to the original GDPR. However, while the effect is to encourage scientific research and development, it has to be seen in the context of the fast-evolving world of developments in AI and the way that AI developers, given the need for huge amounts of data to train their large language models, are reusing data.

My concern is that the scraping of vast amounts of data by these AI companies is often positioned as scientific research and in some cases is even supported by the production of academic papers. I ask the Minister to understand my concerns and those of many in the data community and beyond. The fact is that the lines between scientific research, as set out in Clause 67, and AI product development are blurred. This might not be the concern of the original recitals, but I beg to suggest to the Minister that, in the new world of AI, there should be concern about the definition presented in the Bill.

Like other noble Lords, I very much hope to make this country a centre of AI development, but I do not want this to happen at the expense of data subjects’ privacy and data protection. It costs at least £1 billion—even more, sometimes—to develop a large language model and, although the cost will soon go down, there is a huge financial incentive to scrape data that pushes the boundaries of what is legitimate. In this climate, it is important that the Bill closes any loopholes that allow AI developers to claim the protections offered by Clause 67. My Amendments 59, 62, 63 and 65 go some way to ensuring that this will not happen.

The definition of scientific research in proposed new paragraph 2, in Clause 67(1)(b), is drawn broadly. My concern is that many commercial developments of digital products, particularly those involving AI, could still claim to be, in the words of the clause, “reasonably … described as scientific”. AI model development usually involves a mix of purposes—not just developing a model’s capabilities but also commercialising services as it develops. The exemption allowed for “purposes of technological development” makes me concerned that this vague area creates a threat whereby AI developers will misuse the provisions of the Bill to reuse personal data for any AI developments, provided that one of their goals is technological advancement.

Amendments 59 and 62, by inserting the word “solely” into proposed new paragraphs 2 and 3 in Clause 67, would disaggregate reuse of data for scientific research purposes from other purposes, ensuring that the only goal of reuse is scientific research.

An example of the threat under the present definition is Meta’s recent reuse of Instagram users’ data to train its new generation of Llama models. When the news got out, it created a huge backlash, with more than half a million people reposting a viral hoax image that claimed to deny Meta the right to reuse their data to train AI. This caused the ICO to say that it was pleased that Meta had paused its data processing in response to users’ concerns, adding:

“It is crucial that the public can trust that their privacy rights will be respected from the outset”.


However, Meta could well claim under this clause that it is creating technological advancement, which would allow it to reuse any data collected from users under the legitimate interest grounds for training the model. The Bill as it stands would not require the company to conduct its research in accordance with any of the features of genuine scientific research. These amendments go some way to rectifying that.

Amendment 63 strengthens the test for what is deemed to be in the scientific interest. At the moment, the public interest test is applied only to public health. I am pleased that NHS researchers will have to meet this threshold, but why should not all researchers doing scientific work have to adhere to it? Why should that test not be applied to all data reuse for scientific research? Deleting the public health exception would make the public interest test apply to all data reuse for scientific purposes.

This is in line with the original intention of the GDPR’s RAS—research, archiving and statistics—purposes, which support public health research in the scientific interest. It is complemented by Amendment 65, which uses the tests for consent already laid out in Clause 68. The inclusion of ethical thresholds will ensure that the reuse of data meets the highest levels of academic rigour and oversight envisaged in the original GDPR. It will demand not just ethical standards in research but also that the research be supervised by an independent research ethics committee that meets UKRI guidance. These requirements will ensure that the high standards of ethics that we expect from scientific research will be applied in evaluating the exemption in Clause 67.

I do not want noble Lords to think that these amendments are thwarting the development of AI. There is plenty of AI research that is clearly scientific. Look at DeepMind’s AlphaFold, which uses AI to analyse the shapes of proteins so that they can be incorporated into future drug treatments, moving pharmaceutical development forward. It is an AI model developed in accordance with the ethical standards expected of modern scientific research.

The Minister will argue that the definition has been taken straight from EU recitals. I therefore ask her to consider very seriously what has been said about this definition by the EU’s premier data body, the European Data Protection Supervisor, in its preliminary opinion on data protection and scientific research. In its executive summary, it states:

“The boundary between private sector research and traditional academic research is blurrier than ever, and it is ever harder to distinguish research with generalisable benefits for society from that which primarily serves private interests. Corporate secrecy, particularly in the tech sector, which controls the most valuable data for understanding the impact of digitisation and specific phenomena like the dissemination of misinformation, is a major barrier to social science research … there have been few guidelines or comprehensive studies on the application of data protection rules to research”.


It suggests that the rules should be interpreted in such a way that permits reuse only for genuine scientific research.

For the purpose of this preliminary opinion by the EDPS, the special data protection regime for scientific research is understood to apply if each of three criteria is met: first, personal data is processed; secondly, relevant sectorial standards of methodology and ethics apply, including the notion of informed consent, accountability and oversight; and, thirdly, the research is carried out with the aim of growing society’s collective knowledge and well-being, as opposed to serving primarily one or several private interests. I hope that noble Lords will recognise that these are features that the amendments before the Committee would incorporate into Clause 67.

In the circumstances, I hope that the Minister, who I know has thought deeply about these issues, will recognise that the EU’s institutions are worried about the definition of scientific research that has been incorporated into the Bill. If they are worried, I suggest that we should be worried. I hope that these amendments will allay those fears and ensure that true scientific research is encouraged by Clause 67 and that it is not abused by AI companies. I beg to move.

--- Later in debate ---
I know that we have gone around this subject in a very wide sense and that we might equally revisit some of these issues on other amendments. But I hope that, for the moment, I have reassured noble Lords on the specific details of their amendments and persuaded them that those strong protections are in place.
Viscount Colville of Culross (CB)

I thank the Minister very much, but is she not concerned by the preliminary opinion from the EDPS, particularly that the boundary between private sector research and traditional academic research is blurrier than ever and that it is ever harder to distinguish research with generalisable benefits for society from that which primarily serves private interests? People in the street would be worried about that, and the Bill ought to respond to that concern.

Baroness Jones of Whitchurch (Lab)

I have not seen that observation, but we will look at it. It goes back to my point that the provisions in this Bill are designed to be future facing as well as for the current day. The strength of those provisions will apply regardless of the technology, which may well include AI. Noble Lords may know that we will bring forward a separate piece of legislation on AI, when we will be able to debate this in more detail.

Viscount Colville of Culross (CB)

My Lords, this has been a very important debate about one of the most controversial areas of this Bill. My amendments are supported across the House and by respected civic institutions such as the Ada Lovelace Institute. I understand that the Minister thinks they will stifle scientific research, particularly by nascent AI companies, but the rights of the data subject must be borne in mind. As it stands, under Clause 67, millions of data subjects could find their information mined by AI companies, to be reused without consent.

The concerns about this definition being too broad were illustrated very well across the Committee. The noble Lord, Lord Clement-Jones, said that it was too broad, warned that AI developers will be open to using data research for any AI purposes, and talked about his amendment on protecting children’s data, which is very important and worthy of consideration. This was supported by my noble friend Lady Kidron, who pointed out that the definition of scientific research could cover everything and warned that Clause 67 is not just housekeeping. She quoted the EDPS and noted that its critical clarification had not been included when the definition of scientific research was transferred into the Bill. The noble Lord, Lord Holmes, asked what in the Bill has changed when you consider how much has changed in AI. I was very pleased to have the support of the noble Viscount, Lord Camrose, who warned against the abuse and misuse of data and against the broad definition in this Bill, which could muddy the waters. He supported the public interest test, which would be fertile ground for helping to define scientific data.

Surely this Bill should walk the line between encouraging the rollout of AI to boost research and development in our science sector and protecting the rights of data subjects. I ask the Minister to meet me and other concerned noble Lords to tighten up Clauses 67 and 68. On that basis, I beg leave to withdraw my amendment.

Amendment 59 withdrawn.
--- Later in debate ---
Viscount Camrose (Con)

My Lords, Amendments 66, 67 and 80 in this group are all tabled in my name. Amendment 66 requires scientific research carried out for commercial purposes to

“be subject to the approval of an independent ethics committee”.

Commercial research is, perhaps counterintuitively, generally subjected to fewer ethical safeguards than research carried out purely for scientific endeavour by educational institutions. Given the current broad definition of scientific research in the Bill—I am sorry to repeat this—which includes research for commercial purposes, and the lower bar for obtaining consent for data reuse should the research be considered scientific, I think it would be fair to require more substantial ethical safeguards on such activities.

We do not want to create a scenario where unscrupulous tech developers use the Bill to harvest significant quantities of personal data under the guise of scientific endeavour to develop their products, without having to obtain consent from data subjects or even without them knowing. An independent ethics committee would be an excellent way to monitor scientific research that would be part of commercial activities, without capping data access for scientific research, which aims more purely to expand the horizon of our knowledge and benefit society. Let us be clear: commercial research makes a huge and critically important contribution to scientific research, but it is also surely fair to subject it to the same safeguards and scrutiny required of non-commercial scientific research.

Amendment 67 would ensure that data controllers cannot gain consent for research purposes that cannot be defined at the time of data collection. As the Bill stands, consent will be considered obtained for the purposes of scientific research if, at the time consent is sought, it is not possible to identify fully the purposes for which the personal data is to be processed. I fully understand that there needs to be some scope to take advantage of research opportunities that are not always foreseeable at the start of studies, particularly multi-year longitudinal studies, but which emerge as such studies continue. I am concerned, however, that the current provisions are a little too broad. In other words: is consent not actually being given at the start of the process for, effectively, any future purpose?

Amendment 80 would prevent the data reuse test being automatically passed if the reuse is for scientific purposes. Again, I have tabled this amendment due to my concerns that research which is part of commercial activities could be artificially classed as scientific, and that other clauses in the Bill would therefore allow too broad a scope for data harvesting. I beg to move.

Viscount Colville of Culross (CB)

My Lords, it seems very strange indeed that Amendment 66 is in a different group from group 1, which we have already discussed. Of course, I support Amendment 66 from the noble Viscount, Lord Camrose, but in response to my suggestion for a similar ethical threshold, the Minister said she was concerned that scientific research would find this to be too bureaucratic a hurdle. She and many of us here sat through debates on the Online Safety Bill, now an Act. I was also on the Communications Committee when it looked at digital regulations and came forward with one of the original reports on this. The dynamic and impetus which drove us to worry about this was the lack of ethics within the tech companies and social media. Why on earth would we want to unleash some of the most powerful companies in the world on reusing people’s data for scientific purposes if we were not going to have an ethical threshold involved in such an Act? It is important that we consider that extremely seriously.

Lord Clement-Jones (LD)

My Lords, I welcome the noble Viscount to the sceptics’ club because he has clearly had a Damascene conversion. It may be that this goes too far. I am slightly concerned, like him, about the bureaucracy involved in this, which slightly gives the game away. It could be seen as a way of legitimising commercial research, whereas we want to make it absolutely certain that such research is for the public benefit, rather than imposing an ethics board on every single aspect of research that has any commercial content.

We keep coming back to this, but we seem to be degrouping all over the place. Even the Government Whips Office seems to have given up trying to give titles for each of the groups; they are just called “degrouped” nowadays, which I think is a sign of deep depression in that office. It does not tell us anything about what the different groups contain, for some reason. Anyway, it is good to see the noble Viscount, Lord Camrose, kicking the tyres on the definition of the research aspect.

Lord Clement-Jones (LD)

My Lords, I almost have a full house in this group, apart from Amendment 35, so I will not read out the numbers of all the amendments. I should just say that I very much support what the noble Viscount, Lord Colville, has put forward in his Amendment 35.

Many noble Lords will have read the ninth report of the Delegated Powers and Regulatory Reform Committee. I am sad to say that it holds exactly the same view about this Bill as it did about the previous Bill’s provisions regarding digital verification services. It said that

“we remain of the view that the power conferred by clause 28 should be subject to parliamentary scrutiny, with the affirmative procedure providing the appropriate level of scrutiny”.

It is against that backdrop that I put forward a number of these amendments. I am concerned that, although the Secretary of State is made responsible for this framework, in reality, they cannot be accountable for delivering effective governance in any meaningful way. I have tried, through these amendments, to introduce at least some form of appropriate governance.

Of course, these digital verification provisions are long-awaited—the Age Verification Providers Association is pleased to see them introduced—but we need much greater clarity. How is the Home Office compliant with Part 2 of the Bill as it is currently written? How will these digital verification services be managed by DSIT? How will they interoperate with the digital identity verification services being offered by DSIT in the UK Government’s One Login programme?

Governance, accountability and effective, independent regulation are also missing. There is no mechanism for monitoring compliance, investigating malicious actors or taking enforcement action regarding these services. The Bill has no mechanism for ongoing monitoring or the investigation of compliance failures. The Government propose to rely on periodic certification being sufficient but I understand that, when pressed, DSIT officials say that they are talking to certification bodies and regulators about how they can do so. This is not really sufficient. I very much share the intention of both this Government and the previous one to create a market in digital verification services, but the many good players in this marketplace believe that high levels of trust in the sector depend on a high level of assurance and focus from the governance point of view. That is missing in this part of the Bill.

Amendment 33 recognises the fact that the Bill has no mechanism for ongoing monitoring or the investigation of compliance failures. As we have seen from the Grenfell public inquiry, a failure of governance caused by not proactively monitoring, checking and challenging compliance has real, harmful consequences. Digital verification services rely on the trustworthiness of the governance model; what is proposed is not trustworthy but creates material risk for UK citizens and parties who rely on the system.

There are perfectly decent examples of regulatory frameworks. PhonepayPlus provides one such example, with a panel of three experts supported by a secretariat; the panel can meet once a quarter to give its opinion. That has been dismissed as being too expensive, but I do not believe that any costings have been produced or that it has been considered how such a cost would weigh against the consequences of a failure in governance of the kind identified in recent public inquiries.

Again, as regards Amendment 36, there is no mechanism in the Bill whereby accountability is clearly established in a meaningful way. Accountability is critical if relying parties and end-users are to have confidence that their interests are safeguarded.

Amendment 38 is linked to Amendment 36. The review under Clause 31 must be meaningful in improving accountability and effective governance. The amendment proposes that the review must include performance, specifically against the five-year strategy and of the compliance, monitoring and investigating mechanisms. We would also like to see the Secretary of State held accountable by the Science and Technology Select Committee for the performance captured in the review.

On Amendment 41, the Bill is silent on how the Secretary of State will determine that there is a compliance failure. It is critical to have some independence and professional rigour included here; the independent appeals process is really crucial.

As regards Amendments 42 and 43, recent public inquiries serve to illustrate the importance of effective governance. Good practice for effective governance would require the involvement of an independent body in the determination of compliance decisions. There does not appear to be an investigatory resource or expertise within DSIT, and the Bill currently fails to include requirements for investigatory processes or appeals. In effect, there is no check on the authority of the Secretary of State in that context, as well as no requirement for the Secretary of State proactively to monitor and challenge stakeholders on compliance.

As regards Amendment 44, there needs to be a process or procedure for that; fairness requires that there should be a due process of investigation, a review of evidence and a right of appeal to an independent body.

I turn to Amendment 45 on effective governance. A decision by the appeals body that a compliance failure is so severe that removal from the register is a proportionate measure must be binding on the Secretary of State; otherwise, there is a risk that investment in compliance and service improvement will be relegated below investment in lobbying. Malicious actors view weaknesses in enforcement as a green light and so adopt behaviours that both put at risk the safety and security of UK citizens and undermine the potential of trustworthy digital verification to drive economic growth.

Amendment 39 would exclude powers in this part being used by government as part of GOV.UK’s One Login.

I come on to something rather different in Amendment 46, which is very much supported by Big Brother Watch, the Digital Poverty Alliance and Age UK. Its theme was raised at Second Reading. A significant proportion of the UK’s population lacks internet access, with this issue disproportionately affecting older adults, children and those from low-income backgrounds. This form of digital exclusion presents challenges in an increasingly digital world, particularly concerning identity verification.

Although digital identity verification can be beneficial, it poses difficulty for individuals who cannot or choose not to engage digitally. Mandating online identity verification can create barriers for digitally excluded groups. For example, the National Audit Office found that only 20% of universal credit applicants could verify their identity online, highlighting concerns for those with limited digital skills. The Lords Communications and Digital Select Committee emphasised the need for accessible, offline alternatives to ensure inclusivity in a connected world. The proponents of this amendment advocate the availability of offline options for essential public and private services, particularly those requiring identity verification. This is crucial as forcing digital engagement can negatively impact the well-being and societal participation of older people.

This is the first time that I have prayed in aid what the Minister said during the passage of the Data Protection and Digital Information Bill; this could be the first of a few such occasions. When we debated the DPDI Bill, she stressed the importance of a legal right to choose between digital and non-digital identity verification methods. I entirely agreed with her at the time. She said that this right is vital for individual liberty, equality and building trust in digital identity systems and that, ultimately, such systems should empower individuals with choices rather than enforce digital compliance. That is a fair summary of what she said at the time.

I turn to Amendment 50. In the context of Clause 45 and the power of public authorities to disclose information, some of which may be the most sensitive information, it is important for the Secretary of State to be able to require the public authority to provide information on what data is being disclosed and where the data is going, as well as why the data is going there. This amendment will ensure that data is being disclosed for the right reasons, to the right places and in the right proportion. I beg to move.

Viscount Colville of Culross (CB)

My Lords, I tabled Amendment 35 because I want to make the DVS trust framework as useful as possible. I support Amendment 33 in the name of the noble Lord, Lord Clement-Jones, and Amendment 37 in the name of the noble Viscount, Lord Camrose.

The framework’s mandate is to define a set of rules and standards designed to establish trust in digital identity products in the UK. It is what I would hope for as a provision in this Bill. As the Minister told us at Second Reading, the establishment of digital ID services with a trust mark will increase faith in the digital market and reduce physical checks—not to mention reducing the time spent on a range of activities, from hiring new workers to moving house. I and many other noble Lords surely welcome the consequent reduction in red tape, which so often impedes the effectiveness of our public services.

Clause 28(3) asks the Secretary of State to consult the Information Commissioner and such persons as they consider appropriate. However, in order to ensure that these digital ID services are used and recognised as widely as possible—and, more importantly, that they can be used by organisations beyond our borders—I suggest Amendment 35, which would put consultation with an international digital standards body in the Bill. This amendment is supported by the Open Data Institute.

I am sure that the Minister will tell me that that amendment is unnecessary as we can leave it to the common sense of Ministers and civil servants in DSIT to consult such a body but, in my view, it is helpful to remind them that Parliament thinks the consultation of an international standards body is important. The international acceptance of DVS is crucial to its success. Just like an email address, somebody’s digital identity should not be tied to a company or a sector. Imagine how frustrating it would be if we could only get Gmail in the UK and Outlook in the EU. Imagine if, in a world of national borders and jurisdictions, you could not send emails between the UK and the EU as a result. Although the DVS will work brilliantly to break down digital identity barriers in the UK, there is a risk that no international standards body might be consulted in the development of the DVS scheme. This amendment would be a reminder to the Secretary of State that there must be collaboration between this country, the EU and other nations, such as Commonwealth countries, that are in the process of developing similar schemes.

Data (Use and Access) Bill [HL]

Viscount Colville of Culross Excerpts
Viscount Colville of Culross (CB)

I, too, thank the Minister for her introduction to this welcome Bill. I feel that most noble Lords have an encyclopaedic knowledge of this subject, having been around the course not just once but several times. As a newcomer to this Bill, I am aware that I have plenty to learn from their experience. I would like to thank the Ada Lovelace Institute, the Public Law Project, Connected by Data and the Open Data Institute, among others, which have helped me get to grips with this complicated Bill.

Data is the oil of the 21st century. It is the commodity which drives our great tech companies and the material on which the large language models of AI are trained. We are seeing an exponential growth in the training and deployment of AI models. As many noble Lords have said, it has never been more important than now to protect personal data from being ruthlessly exploited by these companies, often without the approval of either the data owners or the creators. It is also important that, as we roll out algorithmic use of data, we ensure adequate protections for people’s data. I, too, hope this Bill will soon be followed by another regulating the development of AI.

I would like to draw noble Lords’ attention to a few areas of the Bill which cause me concern. During the debates over the last data protection Bill, I know there were worries over the weakening of data subjects’ protection and the loosening of processing of their data. The Government must be praised for losing many of these clauses, but I am concerned, like some other noble Lords, to ensure adequate safeguards for the new “recognised legitimate interests” power given to data processors. I support the Government’s growth agenda and understand that this power will create less friction for companies when using data for their businesses, but I hope that we will have time later in the passage of the Bill to scrutinise the exemption from the three tests for processing data, particularly the balancing test, which are so important in forcing companies to consider the data rights of individuals. This is especially so when safeguarding children and vulnerable people. The test must not be dropped at the cost of the rights of people whose data is being used.

This concern is reinforced by the ICO stating in its guidance that this test is valuable in ensuring companies do not use data in a way that data subjects would not reasonably expect it to be used. It would be useful in the Explanatory Notes to the Bill to state explicitly that when a data processor uses “recognised legitimate interests”, their assessment includes the consideration of proportionality of the processing activity. Does the Minister agree with this suggestion?

The list of four areas for this exemption has been carefully thought through, and I am glad that the category of democratic engagement has been removed. However, the clause does give future Ministers a Henry VIII power to extend the list. I am worried, and I have heard some noble Lords say that they are as well; the clause’s inclusion in the previous Bill also concerned noble Lords at the time. It could allow future Ministers to succumb to commercial interests and add new categories, which might be to the cost of data subjects. The Minister, when debating this power in the previous data Bill, reminded the House that the Delegated Powers and Regulatory Reform Committee said of these changes:

“The grounds for lawful processing of personal data go to the heart of the data processing legislation and therefore in our view should not be capable of being changed by subordinate legislation”.


The Constitution Committee’s report called for the Secretary of State’s powers in this area to be subject to primary and not secondary legislation. Why do these concerns not apply to Clause 70 in this Bill?

I welcome the Government responding to the scientific community’s demand that they should be able to reuse data for scientific, historic or statistical research. There will be many occasions when data was collected for the study of a specific disease and the researchers want to reuse it years later for further study, but they have been restricted by the narrow distinctions between the original and the new purpose. The Government have incorporated recitals from the original GDPR in the Bill, but the changes in Clause 67 must be read against the developments taking place in AI and the way in which it is being deployed.

I understand that the Government have gone to great efforts to set out a clear definition of scientific research in this clause. One criterion is the

“processing for the purposes of technological development or demonstration … so far as those activities can reasonably be described as scientific”,

and another is the publication of scientific papers from the study. But my fear is that AI companies, in their urgent need to scrape datasets for training large language models, will go beyond the policy intention of this clause. They might posit that their endeavours are scientific and may even be supported by academic papers but, when this is combined with the inclusion of commercial activities in the Bill, it opens the way for data reuse in creating AI data-driven products whose makers claim they are for scientific research. The line between product development and scientific research is blurred because of how little is understood about these emerging technologies. Maybe it would help if the Bill set out what areas of commercial activity should not be considered scientific research. Can the Minister share with the House how the clause will stop attempts by AI developers to claim they are doing scientific research when they are reusing data to increase model efficiency and capabilities, or studying their risks? They might even be producing scientific papers in the process.

I have attended a forum with scientists and policymakers from tech companies using training data for AI, who admitted that it is sometimes difficult to define the meaning of scientific research in this context. This concern is compounded by Clause 77, which provides an exemption from the Article 13 duty under the UK GDPR for researchers and archivists to provide additional information to a data subject when reusing their data for different purposes, if doing so would require disproportionate effort. I understand these provisions are drawn to help the reuse of medical data, but they could also be used by AI developers to say that contacting people for the reuse of datasets from an already trained AI model requires disproportionate effort. I understand there are caveats around this exemption. However, in an era when AI companies are scraping millions of pieces of data to train their models, noble Lords need to bear in mind that it is often difficult for them to get permission from the data subjects before reusing the information for AI purposes.

I am impressed by the safeguards for the exemption for medical research set out in Clause 85. The clause says that medical research should be supervised by a research ethics committee to assess the ethical reuse of the data. Maybe the Government should think about using some kind of independent research committee with standards set by UKRI before commercial researchers are allowed to reuse data.

Like many other noble Lords, I am concerned about the changes to Article 22 of the UK GDPR put forward in Clause 80. I quite understand why the Government want to expand solely automated decision-making in order for decisions to be made quickly and efficiently. However, these changes need to be carefully scrutinised. The clause removes the burden on the data controller to overcome tests before implementing ADM, outside of the use of sensitive information. The new position requires the data subject to proactively ask if they would like a human to be involved in the decision made about them. Surely the original Article 22 was correct in making the processor think hard before making a decision to use ADM, rather than putting the burden on the data subject. That must be the right way round.

There are other examples, which do not include sensitive data, where ADM decisions have been problematic. Noble Lords will know that, during Covid, algorithms were used to predict A-level results which, in many cases, were flawed. None of that information would have been classified as sensitive, yet the decisions made were wrong in too many cases.

Once again, I am concerned about the Henry VIII powers granted to the Secretary of State in new Article 22D(1) and (2). This clause already extends the use of ADM, but it also gives future Secretaries of State the power to change by regulation the definition of “meaningful human involvement”. This potentially allows for an expansion of the use of ADM; they could water down the level of human involvement needed for it to be considered meaningful.

Likewise, I am worried by the potential for regulations to be used to change the definition of a decision having a “significant adverse effect” on a data subject. The risk is that this could be used to exclude decisions from the relevant protection even though they could nevertheless still have a significantly harmful effect on the individual. An example would be if the Secretary of State decided to exclude from the scope of a “significant decision” interim, rather than final, decisions. This could result in the exclusion of a decision taken entirely on the basis of a machine learning predictive tool, without human involvement, to suspend somebody’s universal credit pending an investigation and final decision on whether fraud had actually been committed. Surely some of the anxiety about this potential extension of ADM would be assuaged by increased transparency around how it is used. The Bill is a chance for the Government to give greater transparency to how ADM systems process our information. The result would be to greatly increase public trust.

The Algorithmic Transparency Recording Standard delivers greater understanding of the nature of the tools being used in the public sector. However, of the 55 ADM tools in operation, only nine currently have reports published under the ATRS. In contrast, the Public Law Project’s Tracking Automated Government register has identified at least 55 additional tools, with many others still to be uncovered. I suggest that the Government make it mandatory for public bodies to publish information about the ADM systems that they are using on the ATRS hub.

Just as importantly, this is a chance for people to obtain personal information about how an automated decision is made. The result would be that, if somebody is subject to a decision made or supported by AI or an algorithmic tool, they should be notified at the time of the decision and provided with a personalised explanation of how and why it was reached.

Finally, I will look at the new digital verification services trust framework being set up in Part 2. The Government must be praised for setting up digital IDs, which will be so useful in the online world. My life, and I am sure that of many others, is plagued by the vagaries of getting access to the various websites we need to run our lives, and I include the secondary security on our phones, which so often does not work. The effectiveness of this ID will depend on the trust framework that is created and on who is involved in building it.

At the moment, in Clause 28, the Secretary of State must consult the Information Commissioner and such other persons as the Secretary of State sees appropriate. It seems to me that the DVS will be useful only if it can be used across national boundaries. Interoperability must be crucial in a digital world without frontiers. I suggest that an international standards body should be included in the Bill. The most obvious would be W3C, the World Wide Web Consortium, which is the standards body for web technology. It was founded by Sir Tim Berners-Lee and is already responsible for the development of a range of web standards, from HTML to CSS. More than that, it is used in the beta version of the UK digital identity and attributes trust framework and has played a role in both the EU and the Australian digital identity services frameworks. I know that the Government want the Secretary of State to have flexibility in drawing up this framework, but the inclusion of an international standards body in the Bill would ensure that the Minister has them in the forefront of their mind when drawing up this much-needed framework.

The Bill is a wonderful opportunity for our country to build public trust in data-driven businesses and their development. It is a huge improvement on its predecessor; it goes a long way to ensure that the law has protections for data subjects and sets out how companies can lawfully use and reuse data. It is just as crucial in the era of AI that, during the passage of the Bill through the House, we do not leave the door open for personal data to be ruthlessly exploited by the big tech companies. We would all be damaged if that was allowed to happen.

Generative AI: Intellectual Property Rights

Viscount Colville of Culross Excerpts
Monday 11th November 2024


Lords Chamber
Baroness Jones of Whitchurch (Lab)

The noble Baroness will know that there was an attempt to come to a voluntary agreement on this under the previous Government that would have been a way forward for both sectors. Unfortunately, that voluntary agreement did not work out, so the ball has bounced back into our court. The noble Baroness is absolutely right about journalism: if we do not have a vibrant journalistic bedrock for this society, we do not really have a democratic society; we need to know what is going on in the UK and the world. The noble Baroness is right that we need to protect journalists: we need to ensure that their work is rewarded and paid in the right way. We are working on this. I am sorry that I am beginning to sound a bit like a stuck record, but I assure noble Lords that we are working at pace to try to resolve these issues.

Viscount Colville of Culross Portrait Viscount Colville of Culross (CB)
- Hansard - -

My Lords, many creators sold their IP rights to big publishers before the advent of large language models. Since then, those publishers have been exploiting creators’ work for the training of large language models and the creation of new AI performances, but they have failed to recompense the original creators. Does the Minister think that creators’ performance and moral rights should be updated in the face of the new use by AI of their work?

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

That is exactly what we are trying to achieve. Creatives need to be properly respected and rewarded for their work. We need to make sure that, when scraping and web crawling take place, there is transparency about it and that the originators of the material are properly recognised and rewarded.

Electronic Media: False Information

Viscount Colville of Culross Excerpts
Thursday 12th September 2024

Lords Chamber
Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, of course I am very happy to meet the noble Baroness to discuss this further, and I pay tribute to the work she has done on this issue in the past. On “small but risky” services, as she knows, the Secretary of State has written to Melanie Dawes, the CEO of Ofcom, and a very detailed reply was received today from Ofcom. We are still absorbing everything that it is proposing, but it is clear that it is taking this issue very seriously. That will give us the focus for our discussion when we meet.

Viscount Colville of Culross Portrait Viscount Colville of Culross (CB)
- Hansard - -

My Lords, we have seen the first charge under the Online Safety Act’s false communications offence. To facilitate further prosecutions for false communications, can the Minister support statutory guidance to define further the “non-trivial psychological harm” that disinformation must cause to a likely audience?

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, all this information will be detailed in the Ofcom guidance to be published in due course. That covers not only illegal harms but all the other issues in the category that the noble Viscount mentioned, which will likewise be addressed in the Ofcom codes.

Digital Markets, Competition and Consumers Bill

Viscount Colville of Culross Excerpts
Finally, the amendment also requires the Secretary of State to conduct a review in order to ascertain whether there are any other types of claim that might be appropriate for collective proceedings. No response has been given to that proposal, which I suggest is also eminently reasonable.
Viscount Colville of Culross Portrait Viscount Colville of Culross (CB)
- Hansard - -

I have added my name to the Minister’s Amendment 1 with great pleasure, because the Government agree that the power in Clause 6 is one that the Secretary of State does not need. I have also added my name to Amendment 56, as it aims to curtail an even greater Secretary of State power. In Committee, I tabled a series of amendments to limit the Secretary of State’s powers over various stages of the Part 1 conduct requirement process. At the time, we were told that these powers were needed to ensure that the regime could respond to the fast evolution and unpredictability of digital markets. I am grateful to the Minister for changing his mind on one of these powers in Clause 6 and for tabling the amendment to leave out subsections (2) and (3), which, even with the affirmative procedure, were going to give the Secretary of State unnecessary powers. It is a sensible move, as the criteria for deciding whether a digital activity should be deemed of strategic significance are, as he said, broad and well set out in subsection (1).

My concern was that the powerful tech companies, whose market dominance will be investigated in the Part 1 process, might put pressure on Ministers to amend the four criteria in Clause 6 to dilute the range of company activities under consideration for SMS positions. I am satisfied that this amendment will stop that happening. I hope that the Minister will now listen favourably to other amendments, which will be debated today, to ensure that the conduct requirement process is as swift as possible and that the Secretary of State does not have overmighty powers to intervene in the process.

I am grateful to the noble Lord, Lord Lansley, for tabling Amendment 56 to Clause 114, to which I have added my name. Subsection (4)(a) as it stands gives the Secretary of State too much power over the approval of these guidelines. As was pointed out in Committee, the guidelines are the most important part of the SMS process. They set out the framework for the conduct requirement process and allow implementation of the new powers that the Bill gives the CMA to examine market-dominant activities by big tech companies.

One of the reasons for my fear of the Secretary of State’s powers is that she might be subject to lobbying by tech companies, as the noble Baroness, Lady Stowell, pointed out, either to change the guidelines or to slow down implementation. At the moment, the Secretary of State has the power to delay approval indefinitely and, looking to the future, when the guidelines need to be updated or revised, she or her successor could do the same thing. I am grateful to the Minister and his officials for meeting me twice to talk about this issue. I appreciate his time and attention, but I am disappointed that he and the Bill team felt unable to do anything to fetter the Secretary of State’s powers with a time limit on delay for approval. The Minister feels that a time limit would make the process brittle, and fears that an election or some big political event could cause the process to time out. I ask noble Lords to bear in mind that the amendment deals only with the Secretary of State’s powers of approval of the guidelines, not the entire procedure for drawing them up. If there were an election, ministerial work would stop; however, once the new Government were in place, the time limit could start running again. The Secretary of State could then approve the guidelines within 40 days or send them back to the CMA with reasons.

In my meeting with the Minister, he kindly offered to publish the letters exchanged between the Secretary of State and the CMA as the guidelines were created. This seemed a wonderful offer: it would go far towards ensuring transparency in the process, allay fears of backstage lobbying and assuage Members’ concerns about how the guidelines are drawn up. Unfortunately, the Minister rescinded that offer. I ask him, in the name of the openness and transparency of the Part 1 process, to reinstate it.

Such a move would complement the second part of Amendment 56, whereby, if the Minister did not approve the guidelines—which would surely be the only reason for delay—an open statement of the reasons why they could not be approved would be published. Surely noble Lords agree that transparency in the guidelines process would go far in calming any fears of its being influenced by the big tech companies.

I want very much to see this Bill on the statute book, but the Secretary of State’s powers in Clause 114 are detrimental to the Part 1 process and need to be looked at again. I hope the Minister will accept Amendment 56. If not, I will support the noble Lord, Lord Lansley, should he decide to test the opinion of the House.

Lord Black of Brentwood Portrait Lord Black of Brentwood (Con)
- Hansard - - - Excerpts

My Lords, I declare my interest as deputy chair of the Telegraph Media Group and my other interests as set out in the register. I will focus briefly on three crucial amendments in this group—on proportionality, the appeals standard, and the Secretary of State’s powers—echoing points that have already been made strongly in this debate.

I fully support Amendments 13 and 35 in the name of the noble Lord, Lord Faulks. The amendment made to the Bill in the Commons replacing “appropriate” with “proportionate” will significantly expand the scope for SMS firms to appeal the CMA’s decision to create conduct requirements and initiate pro-competitive interventions.

As we have already heard, the Government have sought to argue that, even absent the “proportionality” wording, in most cases the SMS firms will be able to argue that their ECHR rights are engaged, thereby allowing them to appeal on the basis of proportionality. The question arises: why then introduce the “proportionality” standard for intervention at all, particularly when the CMA has never had the scope to act disproportionately at law?

In this context, it is clear that the main potential impact of the Bill as it now stands is that a court may believe that Parliament was seeking to create a new, heightened standard of judicial review. As the Government have rightly chosen to retain judicial review as the standard of appeals for regulatory decisions in Part 1, they should ensure that this decision is not undermined by giving big tech the scope to launch expensive, lengthy legal cases. All experience suggests that that is exactly what would happen, with firms arguing that the Government had sought to create a new, expansive iteration of judicial review. I fear that, if the amendments from the noble Lord, Lord Faulks, are not adopted, we may find in a few years’ time that we have introduced full merits reviews by the back door, totally undermining the purpose of this Act.

Amendments 43, 44, 46, 51 and 52 in the name of the noble Baroness, Lady Jones, are also concerned with ensuring that we do not allow full merits appeals to undermine the CMA’s ability to regulate fast-moving digital markets. Even though full merits appeals are confined to penalty decisions, financial penalties are, after all, as we have heard, the ultimate incentive to comply with the CMA’s requirements. We know that the Government want this to be a collaborative regime but, without a real prospect of meaningful financial penalties, an SMS firm will have little reason to engage with the CMA. There seems, therefore, little logic in making it easier for SMS firms to delay and frustrate the imposition of penalties.

There is also a danger that full merits appeals of penalty decisions will bleed back into regulatory decisions. The giant tech platforms will undoubtedly seek to argue that a finding of a breach of a conduct requirement, and the CMA’s consideration that an undertaking has failed to comply with a conduct requirement when issuing a penalty, are both fundamentally concerned with the same decision: “the imposition” of a penalty, with the common factor being a finding that a conduct requirement has been breached. The cleanest way to deal with this is to reinstate the merits appeals for all digital markets decisions. That is why, if the noble Baroness, Lady Jones, presses her amendments, I will support them.

Finally, I strongly support Amendment 56 in the name of my noble friend Lord Lansley, which would ensure that the Secretary of State must approve CMA guidance within a 40-day deadline. This would allow the Government to retain oversight of the pro-competition regime’s operations, while also ensuring that the operationalisation of the regime is not unduly delayed. It will also be important in ensuring that updates to the guidance are made promptly; such updates are bound to be necessary to iron out unforeseen snags or to react to rapidly developing digital markets. Absent a deadline for approval, there is a possibility that the regulation of big tech firms will grind to a halt mid-stream. That would be a disaster for a sector in which new technologies and business models are developed almost daily. I strongly support my noble friend and will back him if he presses his amendment to a vote.

With the deadline to comply with the Digital Markets Act in Europe passing only last week, big tech’s machinations in the EU have provided us with a window into our future if we do not make this legislation watertight. As one noble Lord said in Committee—I think it was the noble Lord, Lord Tyrie—we do not need a crystal ball when we can read the book. We have the book, and we do not like what we see in it. We must ensure that firms with an incredibly valuable monopoly to defend and limitless legal budgets with which to do so are not able to evade compliance in our own pro-competition regime.

Digital Markets, Competition and Consumers Bill

Viscount Colville of Culross Excerpts
None of the amendments that I have put forward would in any way dilute the protection for customers that we all want to see but they would protect British businesses, particularly in the vital creative economy, which we all want to see grow and prosper. If the issues that I have raised are not to be remedied here in Committee, I encourage the Government to commit to looking further into this issue as we approach Report. I beg to move.
Viscount Colville of Culross Portrait Viscount Colville of Culross (CB)
- Hansard - -

My Lords, I tabled Amendment 190, and I thank the noble Baroness, Lady Jones of Whitchurch, and the noble Lord, Lord Clement-Jones, for adding their names to it. I also thank Professor Christian Twigg-Flesner from the University of Warwick for his help in creating this amendment.

Clause 259 sets out the obligations of a trader when a consumer is entitled to cancel or bring a subscription contract to an end. They are limited to providing various types of notice and dealing with potential overpayments by the consumer. Many subscription contracts relate to digital content, and these involve the provision of both personal and non-personal data under the contract. On ending the contract for a digital service, there needs to be clarity about what should happen to all the subscriber’s data.

The whole point of this amendment is that it lays duties on a trader, on the cancellation or ending of a subscription contract, to ensure that the consumer gets all their data back, not just that narrowly defined as personal data. At the moment, only personal data is covered under the UK GDPR, and Article 4 defines it very narrowly. “Personal data” means only

“information relating to an identified or identifiable natural person … an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier”.

Under Article 20, which covers the right to data portability, a user who ends a contract is in effect withdrawing consent for the continued processing of their personal data, and the article ensures that the trader cannot use this personal data any more. Article 17 then provides the consumer with the right to have the personal data erased after exercising the Article 20 right to download it. Personal data, as narrowly defined, is therefore well protected under the law at the end of a subscription.

However, the consumer might have a lot of other data that is not within the narrow definition of “personal data”. This is non-personal data. There is no provision under UK consumer law that deals with non-personal data following the end of a contract. This would have been covered by the 2019 EU directive on digital content and digital services, in Article 16, but that came into force only on 1 January 2022, long after the UK had left the EU.

Amendment 190 will deal with the absence of protection for non-personal data in English law. It will give the user control over all their data, both personal and non-personal. Proposed new subsection (7) protects all the consumer’s data created under the contract, covering both personal and non-personal data. Proposed new subsection (8) allows for all this data to be returned to the user within a “reasonable period” after the end of the contract. Proposed new subsection (9) balances these consumer rights by creating exemptions from the trader’s obligation to return the data, especially where it is part of a bigger dataset that cannot easily be separated out. Proposed new subsection (10) is particularly important, because it prevents the trader from continuing to use the consumer’s non-personal data after the end of the contract.

As I have explained, “personal data” is very narrowly defined. This leaves a mass of data created by the consumer during the contract that will need protection once the contract ends. It will be protected if this amendment is adopted. Surely the Minister would want the trader to return all the digital data that the consumer created on the platform, and would want to prevent the trader from continuing to exploit it for financial gain.

To give noble Lords an example of the dangers to consumers if this amendment is not adopted, a consumer might want to end their subscription to their account at Flickr, the photo-sharing platform. At the moment, the clause will ensure that all the photos that identify the user will be regarded as personal data and returned to them. However, it might well not cover all the other photos that do not directly identify them. They could be holiday pictures of beaches in Greece, historic buildings or wildlife that they placed on the Flickr platform during their contract.

Once the contract is finished, Flickr can currently keep all the other photographs that the consumer has taken and refuse to return them. Furthermore, it can use them for financial gain. Likewise, a user’s comments placed against somebody else’s photos can be retained on the site by the trader after the end of the contract. On Flickr, the original author’s name is changed to a randomly chosen two-word alternative. However, the comments can be detailed and the consumer might well want to retrieve them, but they currently will not be able to.

Digital Markets, Competition and Consumers Bill

Viscount Colville of Culross Excerpts
Moved by
76: Clause 114, page 70, line 37, leave out paragraph (b)
Member's explanatory statement
This amendment removes the requirement for the CMA to obtain approval from the Secretary of State before publishing its guidance outlining how it will exercise its functions.
Viscount Colville of Culross Portrait Viscount Colville of Culross (CB)
- Hansard - -

My Lords, I have asked for my Amendment 76 to Clause 114 to be decoupled, because I think it goes to the centre of the operation of Part 1 and I want noble Lords to focus on debating the issues raised by this clause as it stands. I also thank the noble Baroness, Lady Jones of Whitchurch, and the noble Lords, Lord Black and Lord Holmes, for putting their names to this amendment. I am glad that Amendment 77 in the name of the noble Baroness, Lady Stowell, is also in this group; I support its aims. Clause 114 seems to be a small section hidden away on page 70 of the Bill, yet the guidance process that it outlines is fundamental to the operation of the regime set out in Part 1 of the Bill.

This is a high-level Bill, which leaves a lot of fine-tuning and detail to the CMA. It will be the first part of the process to become operational after Royal Assent has been granted. Without these guidelines, the CMA will not be able to start its urgently needed investigations into the activities of large tech companies and their domination of many digital markets.

--- Later in debate ---
Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I am happy to look into that as a mechanism, but, as currently set out in the Bill, the logic is that the Secretary of State can approve the guidance.

The Government will continue to work closely with the CMA, as they have throughout the drafting of the Bill, to ensure that the timely publication of guidance is not disrupted by this measure. Published guidance is required for the regime to be active, and the Government are committed to ensuring that this happens as soon as possible. Guidance will be published in good time before the regime goes live, to allow affected stakeholders to prepare. The Government hope that, subject to parliamentary time and receipt of Royal Assent, the regime will be in force for the common commencement date in October this year.

In response to my noble friend Lord Black’s question about guidance and purdah, the essential business of government can continue during purdah. The CMA’s guidance relates to the CMA’s intentions towards the operation of the regime, rather than to a highly political matter. However, the position would need to be confirmed with the propriety and ethics team in the Cabinet Office at the appropriate time, should the situation arise that we were in a pre-election period.

I thank the noble Viscount, Lord Colville, and my noble friend Lady Stowell for their amendments, and I hope that this will go some way towards reassuring them that the Government’s role in the production of guidance is proportionate and appropriate. As I said, I recognise the grave seriousness of the powerful arguments being raised, and I look forward to continuing to speak with them.

Viscount Colville of Culross Portrait Viscount Colville of Culross (CB)
- Hansard - -

I thank noble Lords for their contributions and ask the Minister to listen to the concerns Members have expressed today. The clause gives extraordinary power to the Secretary of State, and I ask the Minister to listen to his noble friends, the noble Baronesses, Lady Stowell and Lady Harding, who called the power dangerous. In particular, the noble Baroness, Lady Harding, said that it was so dangerous and such a big power that it must be a distraction.

The noble Lord, Lord Black, said that the concern about this power is that it would create delay, which would be a particular worry around the period of an election, both before and after it. He called for draft guidance to be approved within 31 days, which is certainly something that could be considered; after all, no one wants ping-pong to go back and forth, do they? They want the CMA’s guidance to be put into action and this process to start as soon as possible.

The noble Baroness, Lady Kidron, said that the asymmetric power between the regulators and the tech companies means that there will be a drum beat of what she called “participative arrangements”. That is quite a complex thought, but the idea behind it—that the CMA must not be stopped from using its power to deal with some of the most powerful companies in the world—is very important.

The noble Baroness, Lady Stowell, is a former regulator and called for Parliament to have a role in overseeing this. We were reminded by both the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, that we had a discussion on Secretary of State powers in the debate on the Online Safety Act, much of which was about whether a joint digital committee could oversee digital regulation. I suspect that that will be discussed in the next group. We have given enormous powers to Ofcom with the Online Safety Act, we are giving big powers to the CMA and I imagine that we are giving big powers to the ICO in the Data Protection Act, so Parliament should have a powerful standing role in dealing with that.

The Minister called for robust oversight of the CMA and said that it must be accountable to Parliament; already, Parliament looks at its reviews and annual reports. I come back to the concern that the Secretary of State still has powers that are far too great over the implementation of this guidance, and that the CMA’s independence will be impinged on. I repeat what I and other noble Lords have said about Clause 114: it stands to reduce the CMA’s independence. I ask the Minister to consider very seriously what we have been saying.

The Minister’s suggestion that he will look at the affirmative resolution for Secretary of State approval of guidance is something that we should certainly push further—at least that is some step towards reducing Secretary of State powers. With that, I beg leave to withdraw my amendment.

Amendment 76 withdrawn.