Grand Committee

My Lords, the first amendment of the noble Lord, Lord Holmes, Amendment 14, seeks to ensure that products' reliance on software and artificial intelligence is included in the scope of the Bill. Clearly, all our remarks are somewhat irrelevant if the Minister gets up and says, “No, it is not”. However, on the assumption that the Minister is going to say, “Yes, it is”, I draw particular attention, if I may, in supporting all the noble Lord’s amendments, to Amendments 75 to 78, on the issue of labelling. This seems to me to be an opportunity for real joined-up government thinking.
The Minister will be well aware that the Communications and Digital Committee, on which I had the opportunity to serve at the time, produced a very detailed report on the development of large language models—LLMs—and AI. In so doing, we particularly raised concern about the way in which these large language models were being trained by scraping vast quantities of data from a variety of sources, and then creating products over which they were able to claim intellectual property coverage.
Amendment 78 in the name of the noble Lord, Lord Holmes, in respect of the labelling and so on, requires the Secretary of State to lay
“regulations to ensure no product or content … uses an individual’s image, likeness or personality rights without that individual’s express consent”.
Had I been drafting the amendment, I would have gone much further, because it seems to me that a large amount of other data is scraped—for instance, novels written by authors without their permission. I could go on; it is well worth looking at the Select Committee report.
Does the Minister accept that this is a real opportunity for joined-up thinking, when the Government finally decide what their position is in relation to the training of LLMs and people being required to get the permission of all data owners before they can bring their product to market? Does he agree that the labelling of such products, when developed, should include specific reference to their having gained the appropriate permission, paid the appropriate fee or obtained the appropriate licence to make use of the data used in the training of those AI products?
My Lords, I shall speak briefly to Amendment 75, which was very eloquently introduced by the noble Lord, Lord Holmes. My academic background is in the research of communication and how people make decisions based on information that they are given. That touches quite a lot on how people assess the reliability and trustworthiness of data.
Amendment 75, on the labelling of AI-based products, includes a proposal about communicating the data used in the training of the AI. I think it is really important that people who have products that provide information—information on which they might be making decisions, or on which the product might be acting—are able to know the reliability and trustworthiness of that information. The cues that people use for assessing that reliability are such things as the size of the dataset, how recently the data was gathered and the source of that data—because they want to know whether that data, to use the example of the noble Lord, Lord Holmes, is on American cheeses, British cheeses or Italian cheeses, all of which might need a different temperature in your fridge. I urge the Minister to look at this, because over-trust or under-trust in the outputs of data makes such a difference to how people respond to products. I think this is very important.