(1 year ago)
Lords Chamber
My Lords, I declare an interest as chair of the council of Queen Mary University of London, with its major research interests. It is a pleasure to follow the noble Baroness, Lady Jones of Whitchurch, in her new role.
I want to start on a positive note by celebrating the recent Royal Assent of the Online Safety Act and the publication of the first draft code for consultation. I also very much welcome that we now have a dedicated science and technology department in the form of DSIT, although I very much regret the loss of Minister George Freeman yesterday.
Sadly, there are many other less positive aspects to mention. Given the Question on AI regulation today, all I will say is that despite all the hype surrounding the summit, including the PM’s rather bizarre interview with Mr Musk, in reality the Government are giving minimal attention to AI, despite the Secretary of State saying that the world is standing at the inflection point of a technological revolution. Where are we on adjusting ourselves to the kinds of risk that AI represents? Is it not clear that the Science, Innovation and Technology Committee is correct in recommending in its interim report that the Government
“accelerate, not … pause, the establishment of a governance regime for AI, including whatever statutory measures as may be needed”?
That is an excellent recommendation.
I also very much welcome that we are rejoining Horizon, but there was no mention in the Minister’s speech of how we will meet the challenge of getting international research co-operation back to where it was. I am disappointed that the Minister did not give us a progress update on the department’s 10 objectives in its framework for science and technology, and on action on the recommendations of its many reviews, such as the Nurse review. Where are the measurable targets and key outcomes in priority areas that have been called for?
Nor, as we have heard, has there been any mention of progress on Project Gigabit, and no mention either of progress on the new programmes to be undertaken by ARIA. There was no mention of urgent action to mitigate increases to visa fees planned from next year, which the Royal Society has described as “disproportionate” and a “punitive tax on talent”, with top researchers coming to the UK facing costs up to 10 times higher than in other leading science nations. There was no mention of the need for diversity in science and technology. What are we to make of the Secretary of State demanding that UKRI “immediately” close its advisory group on EDI? What progress, too, on life sciences policy? The voluntary and statutory pricing schemes for new medicines currently under consultation are becoming a major impediment to future life sciences investment in the UK.
Additionally, health devices suffer from a lack of development and commercialisation incentives. The UK has a number of existing funding and reimbursement systems, but none is tailored for digital health, which leaves these products without a clear national reimbursement route. What can DSIT do to encourage investment and innovation in this very important field?
On cybersecurity, the G7 recognises that red teaming, or what is called threat-led penetration testing, is now crucial in identifying vulnerabilities in AI systems. Sir Patrick Vallance’s Pro-innovation Regulation of Technologies Review of March this year recommended amending the Computer Misuse Act 1990 to include a statutory public interest defence that would provide stronger legal protections for cybersecurity researchers and professionals carrying out threat intelligence research. Yet there is still no concrete proposal. This is glacial progress.
However, we on these Benches welcome the Digital Markets, Competition and Consumers Bill. New flexible, pro-competition powers, and the ability to act ex ante and on an interim basis, are crucial. We have already seen the impact on our sovereign cloud capacity through concentration in just two or three US hands. Is this the future of AI, given that the large language models now developed by the likes of OpenAI, Microsoft, Anthropic, Google and Meta require massive datasets, vast computing power, advanced semiconductors, and scarce digital and data skills?
As the Lords Communications and Digital Committee has said, and I very much welcome its view, the Bill must not be watered down in a way that allows big tech to challenge the regulators endlessly in court and incentivises big tech firms to take an adversarial approach to the regulators. In fact, it needs strengthening in a number of respects. In particular, big tech must not be able to use countervailing benefits as a major loophole to avoid regulatory action. Content creators need to be properly compensated by the tech platforms. The Bill needs to make clear that platforms profit from content and must pay properly and fairly, on benchmarked terms and with reference to value for end users. Can the Minister, in winding up, confirm at the very least that the Government will not water down the Bill?
We welcome the CMA’s market investigation into cloud services, but it must look broadly at the anti-competitive practices of the service providers, such as vendor lock-in tactics and non-competitive procurement. Competition is important in the provision of broadband services too. Investors in alternative providers to the incumbents need reassurance that their investment is going on to a level playing field and not one tilted in favour of the incumbents. Can the Minister reaffirm the Government’s commitment to infrastructure competition in the UK telecommunications industry?
The Data Protection and Digital Information Bill is another matter. I believe the Government are clouded by the desire to diverge from the EU to get some kind of Brexit dividend. The Bill seems largely designed, contrary to what the Minister said, to dilute the rights of data subjects where it should be strengthening them. For example, there is concern from the National AIDS Trust that permitting intragroup transmission of personal health data
“where that is necessary for internal administrative purposes”
could mean that HIV/AIDS status will be inadequately protected in workplace settings. Even on the Government’s own estimates it will have a minimal positive impact on compliance costs, and in our view it will simply lead to companies doing business in Europe having to comply with two sets of regulation. All this could lead to a lack of EU data adequacy.
The Bill is a dangerous distraction. Far from weakening data rights, as we move into the age of the internet of things and artificial intelligence, the Government should be working to increase public trust in data use and sharing by strengthening those rights. There should be a right to an explanation of automated systems, where AI is only one part of the final decision in certain circumstances—for instance, where policing, justice, health, or personal welfare or finance is concerned. We need new models of personal data controls, which were advocated by the Hall-Pesenti review as long ago as 2017, especially through new data communities and institutions. We need an enhanced ability to exercise our right to data portability. We need a new offence of identity theft and more, not less, regulatory oversight over use of biometrics and biometric technologies.
One of the key concerns we all have as the economy becomes more and more digital is data and digital exclusion. Does DSIT have a strategy in this respect? In particular, as Citizens Advice said,
“consumers faced unprecedented hikes in their monthly mobile and broadband contract prices”
as a result of mid-contract price rises. When will the Government, Ofcom or the CMA ban these?
There are considerable concerns about digital exclusion, for example regarding the switchover of voice services from copper to fibre. It is being carried out before most consumers have been switched on to full fibre infrastructure and puts vulnerable customers at risk.
There are clearly great opportunities to use AI within the creative industries, but there are also challenges, with big questions over authorship and intellectual property. Many artists feel threatened, and this was the root cause of the recent Hollywood writers’ and actors’ strike. What are the IPO and government doing, beyond their consultation on licensing in this area, to secure the necessary copyright and performing right reform to protect artists from synthetic versions?
I very much echo what the noble Baroness, Lady Jones, said about misinformation during elections. We have already seen two deepfakes related to senior Front-Bench Members—shadow spokespeople—in the Commons. It is concerning that those senior politicians appear powerless to stop this.
My noble friends will deal with the Media Bill. The Minister did not talk of the pressing need for skilling and upskilling in this context. A massive skills and upskilling agenda is needed, as well as much greater diversity and inclusion in the AI workforce. We should also be celebrating Maths Week England, which I am sure the Minister will do. I look forward to the three maiden speeches and to the Minister’s winding up.
(1 year ago)
Lords Chamber
This is indeed a serious and complex issue, and yesterday I met the Creative Industries Council to discuss it. Officials continue to meet regularly both with creative rights holders and with innovating labs, looking for common ground with the goal of developing a statement of principles and a code of conduct to which all sides can adhere. I am afraid to say that progress is slow on that; there are disagreements that come down to legal interpretations across multiple jurisdictions. Still, we remain convinced that there is a landing zone for all parties, and we are working towards that.
My Lords, I welcome what the Minister has just said, and he clearly understands this technology, its risks and indeed its opportunities, but is he not rather embarrassed by the fact that the Government seem to be placing a rather higher priority on the regulation of pedicabs in London than on AI regulation?
I am pleased to reassure the noble Lord that I am not embarrassed in the slightest. Perhaps I can come back with a quotation from Yann LeCun, one of the three godfathers of AI, who said in an interview the other week that regulating AI now would be like regulating commercial air travel in 1925. We can more or less theoretically grasp what it might do, but we simply do not have the grounding to regulate properly because we lack the evidence. Our path to the safety of AI is to search for the evidence and, based on the evidence, to regulate accordingly.
(1 year, 1 month ago)
Lords Chamber
I remember the July debate very well. I made a commitment then to meet with concerned Members, which I am happy to repeat. Again, I ask that concerned Members write to me to indicate that they would like to meet. Those who have written to me have met with me.
My Lords, the Minister mentioned that the Online Safety Bill will come into law very shortly. Will he commit to setting up the advisory committee on disinformation and misinformation as soon as possible after this? The current situation clearly demonstrates both the need for it and for it to come to swift conclusions.
I very much share the noble Lord’s analysis of the need for this group to come rapidly into existence. It is, of course, the role of Ofcom to create it. I will undertake to liaise with it to make sure that that is speeded up.
(1 year, 1 month ago)
Lords Chamber
My noble friend is absolutely right to highlight the essential need for interoperability of AI given the way that AI is produced across so many jurisdictions. In addition to the global safety summit next week, we continue our very deep engagement with a huge range of multilateral groups. These include the OECD, the Council of Europe, the GPAI, the UN, various standards development groups, the G20 and the G7, along with a range of bilateral agreements, including the Atlantic declaration with the US and the Hiroshima accord with Japan, both signed just this year.
My Lords, Professor Stuart Russell memorably said:
“There are more regulations on sandwich shops than there are on AI companies”.
After a disappointing White Paper, in the light of the forthcoming summit will the Government put more risk and regulatory meat in their AI sandwich? Is it not high time that we started addressing the AI risks so clearly identified at the G7 meetings this year with clear, effective and proportionate regulation?
I am pleased to say that the Government spend more on AI safety than any other Government of any country. We have assembled the greatest concentration of AI safety expertise anywhere and, based on that input, we feel that nobody has sufficient understanding of the risks or potential of AI at this point to regulate in a way that is not premature. The result of premature regulation is regulation that creates unnecessary friction for businesses, or that fails to protect from emerging dangers of which we are as yet unaware.
(1 year, 2 months ago)
Lords Chamber
My Lords, shall we allow the noble Lord, Lord Wigley, to contribute and then the noble Lord, Lord Clement-Jones?
I apologise to the noble Lord for not having reached that bit. The concern about Newport Wafer Fab was that the ultimate owners of the buyer were Chinese investors; hence, under the NSI Act, that was blocked. I cannot comment any further on that specific case because it is under judicial review.
My Lords, the Government may have finally published a strategy on semiconductors, but is investment in our great south Wales compound semiconductor hub going to be encouraged by his ministerial colleague Paul Scully’s remarks about not wanting to recreate Taiwan in south Wales? Also, as has been referred to, there is the very much delayed decision over the future of Newport Wafer Fab.
What Minister Scully clearly meant was that there is no point attempting to construct an advanced silicon manufactory at the cost of tens of billions of pounds at considerable risk to both investors and the taxpayer when all those who have tried to mimic TSMC have failed at great expense. It is far better to focus on our strengths and on the compound semiconductor strategy that Minister Scully will have spoken about on that occasion. Again, Newport Wafer Fab is under judicial review and I cannot comment further.
(1 year, 2 months ago)
Grand Committee
My Lords, these regulations were laid before the House on 10 July 2023, and they will be made under the powers provided by the Product Security and Telecommunications Infrastructure Act 2022 and the European Union (Withdrawal Agreement) Act 2020. They will mandate that the manufacturers of consumer connectable products made available to customers in the UK are, unless excepted, required to meet minimum security requirements.
In doing so, this instrument will complete the introduction of the UK’s pioneering product security regime, established by Part 1 of the Product Security and Telecommunications Infrastructure Act 2022. Subject to noble Lords’ approval, this regime will afford UK citizens and businesses world-leading protections from the threats of cybercrime, as well as equipping the Government with the tools to ensure the long-term security of a vital component of the broader technology ecosystem.
Acting to secure consumer connectable products has never been more critical than it is now, as we cross the threshold of the fourth industrial revolution. Before our eyes, artificial intelligence is rewriting how we live our lives, how we deliver our priorities and the rules of entire industries. AI models are already an inextricable part of the connectable products we use every day, from the convolutional neural networks that recognise the photos of loved ones on our smartphones, to the recurrent neural networks that allow our smart speakers to respond to our requests. The data collected through consumer devices is often also a vital part of a model’s training set.
These regulations are therefore not just crucial if we are to protect our citizens and economy from the array of threats posed by consumer connectable products today but a vital step if we are to mitigate the risks, and therefore fully realise the benefits, of the AI-enabled economy of tomorrow. With the support of this House and Members of another place, this is precisely what the Government aim to achieve with these regulations.
The key provisions of this instrument are as follows. First, the regulations mandate that manufacturers comply with the security requirements set out in Schedule 1. These requirements were selected, following extensive consultation, because they are applicable across a broad range of devices and are commended by security experts as the most fundamental measures for addressing cyber risks to products and their users. This means that businesses will no longer be able to sell consumer smart products with universal default or easily guessable default passwords to UK customers. These passwords not only expose users to unacceptable risks of cyberattack but can also allow malicious actors to compromise products at scale, equipping them with the computing power to launch significantly disruptive cyberattacks.
Manufacturers will also be required to publish, in a manner that is accessible, clear and transparent, the details of a point of contact for the reporting of security vulnerabilities. It pains me to share that, despite our entrusting the security of our data, finances and even homes to the manufacturers of these products, as of 2022, less than one-third of global manufacturers had a policy for how they can be made aware of vulnerabilities. With your support, the UK aims to change that.
The final security requirement in this instrument will ensure that the minimum length of time for which a product will receive security updates is not just published but published in an accessible, clear and transparent manner. We know that consumers value security and consider it when purchasing products. Equipped with the vital information mandated by this requirement, UK consumers will be able to drive manufacturers to improve the security protections they offer through market forces.
We are confident, based on extensive policy development, consultation and advice from the National Cyber Security Centre, that these security requirements will make a fundamental difference to the security of products, their users and the wider connected technology ecosystem.
We also recognise the importance of cutting red tape or, better still, not introducing it in the first place. For this reason, Regulation 4 allows manufacturers that are already compliant with provisions in international standards equivalent to our security requirements to more readily demonstrate their compliance with our security requirements.
The instrument also sets out a list of products excepted from the scope of the product security regime. First, it excepts select product categories where made available for supply in Northern Ireland. This exception ensures that the regime upholds the UK’s international commitments under the EU withdrawal agreement, while extending the protections and benefits offered by the regime to consumers and businesses across the UK.
In addition, smart charge points, medical devices and smart metering devices are excepted to avoid double regulation and to ensure that these products are secured with the measures most appropriate to the particulars of their functions. This instrument also excepts laptops, desktop computers and tablets without a cellular connection from the regime’s scope. Engagement with industry highlighted that the manufacturers of these products would face unique challenges in complying with this regime, and in many cases where these products are in use they are already subject to suitable cyber protections. It is therefore not clear at this stage that including these products in the regime’s scope would be proportionate.
Finally, the regulations also contain uncontroversial administrative provisions, including provisions relating to statements of compliance. The regime will require that these documents accompany products, serving as an audit trail to enable compliance across the supply chain and to facilitate effective enforcement.
These regulations and the regime of which they are a part represent a victory for UK consumers. They are the first in the world to recognise that the public has a right to expect that the products available for them to purchase are secure. These measures solidify the United Kingdom’s position at the forefront of the global cyber agenda, paving the way for other nations to follow in our footsteps. I commend the regulations to the Committee.
My Lords, I thank the Minister for his introduction, which gave us the context for these regulations and the risks they are designed to mitigate and prevent. I agree with him about the importance of regulating in this area but, sadly—clearly—this is not box office today. We must live with that.
I welcome the regulations as far as they go. The one bright spot is that all regulations under the original Act, with one exception, are subject to the affirmative procedure, thanks to amendments put forward by us and accepted by the Government, which were designed to implement the recommendations of the Delegated Powers and Regulatory Reform Committee. That we are discussing the regulations in this way is testimony to that.
However, the regulations do not go far enough, despite being described by the Minister as a “pioneering product security regime”. As I said at Third Reading of the original Bill, last October, we did not specify enough security requirements for IoT devices in primary legislation. There was a commitment to regulate for only the top three guidelines covered by the 2018 Code of Practice for Consumer IoT Security, namely: first, to prohibit the setting of universal default passwords and the ability to set weak or easily guessable passwords; secondly, to implement a vulnerability disclosure policy, requiring the production and maintenance by manufacturers of regularly publicly available reports of security vulnerabilities; and, thirdly, to keep software updated and ensure the provision of information to the consumer before the contract for sale or supply of a relevant connectable product detailing the minimum length of time for which they will receive software or other relevant updates for that product.
Those are now all in the regulations and I welcome that, but, sadly, many of the other guidelines were never going to be, and are not now, specifically covered in the regulations. Quite apart from the first three, there are a whole range of others: securely store credentials and security-sensitive data; communicate securely; minimise exposed attack surfaces; ensure software integrity; ensure that personal data is protected; make systems resilient to outages; monitor system telemetry data; make it easier for consumers to delete personal data; make the installation and maintenance of devices easy; and validate input data. All those are standards that should be adhered to in relation to these devices. Two of the guidelines that have not been made mandatory—ensure that personal data is protected, and make it easier for consumers to delete personal data—have been highlighted by Which? this very morning, which has produced research demonstrating that:
“Smart home device owners are being asked to provide swathes of data to manufacturers, which could compromise their privacy and potentially result in them handing their personal information to social media and marketing firms, Which? research has found”.
This is part of its press release.
“The consumer champion found companies appear to hoover up far more data than is needed for the product to function. This includes smart speakers and security cameras that share customer data with Meta and TikTok, smart TVs that insist on knowing users’ viewing habits and a smart washing machine that requires people’s date of birth. The research suggests that, despite consumers having already paid up to thousands of pounds for smart products, they are also having to ‘pay’ with their personal data”.
We need to make sure that the Government and the regulator, whether the ICO or others, are on the case in that respect.
Nor did we see any intention to introduce appropriate minimum periods for the provision of security updates and support, taking into account factors including the reasonable expectations of consumers, the type and purpose of the connectable products concerned and any other relevant considerations. During the passage of the Bill, the Government resisted that—unlike the EU, which has imposed a five-year mandatory minimum period in which products must receive security updates. So consumers in Northern Ireland, for instance, are going to be far better off as a result of the TCA and the Windsor agreement.
That has inevitably followed through into these disappointing regulations, but they are even more disappointing than previously anticipated. Online marketplaces are not covered. Why not? My noble friend Lord Fox tabled an amendment on Report that sought to probe whether online marketplaces would be covered, a question that I think we all agree is of great importance. My noble friend quoted a letter from the noble Lord, Lord Parkinson, dated 21 September 2022 stating that
“businesses need to comply with the security requirements of the product security regime in relation to all new consumer connectable products offered to customers in the UK, including those sold through online marketplaces”.
In response, the then Minister, the noble Lord, Lord Kamall, said:
“The Bill will ensure that where online marketplaces manufacture, import or sell products, they bear responsibility for the security of those products. Where this does not happen, I assure noble Lords that they should make no mistake: the regulator will act promptly to address serious risk from insecure products, and work closely with online marketplaces to ensure effective remedy”.
I accepted that assurance. I said:
“As regards the online marketplaces, I am grateful for those assurances, which are accepted and are very much in line with the letter”.—[Official Report, 12/10/22; cols. 794-95.]
That was the assurance that was given and accepted.
The Minister has moved on from talking about periods of assurance for consumers. I mentioned the EU introducing its five-year rule and the Northern Ireland aspect. That is rather useful for the Government to be able to see the impact of putting down a marker on a five-year period, because there is no alternative under the TCA and the Windsor agreement. Will the Government undertake to review how it is working in Northern Ireland? If it is working well and they think it is practical, will they introduce it across the UK?
That is an interesting experimental chamber to have, because we can compare the two regimes, so I am happy to make that commitment, yes.
The assurances about online marketplaces from my noble friends Lord Kamall and Lord Parkinson remain true. Products sold through online marketplaces are subject to the same requirements as all other products. No regulation is perfect and, if relevant parties do not comply, the parent Act empowers the Secretary of State, or those whom the Secretary of State has authorised to carry out enforcement functions, with robust powers to address non-compliance, including monitoring the market, warning consumers of risks and, where appropriate, seizing products and recalling products from customers.
The Government have made it clear that they expect online marketplaces to do more to keep unsafe products off their platforms, and are conducting a review of the product safety framework. The product safety review consultation is open until 24 October. Following this, we will review and analyse stakeholder feedback and publish a government response. Any legislation will be brought forward in line with parliamentary procedures and timetables, which will include proposals to tackle the sale of unsafe products online. Officials will continue—
I apologise to the Minister, but what is the reason for having two separate processes for manufacturers and online distributors? The assurance that I quoted could not have been clearer, and we all thought that these regulations would include not only manufacturers but online distributors. It still baffles me and I am sure it baffles the noble Lord, Lord Bassam, as well. The logic of doing it in two separate tranches entirely escapes me.
The processes we have put here resulted from extensive consultation with the stakeholders, both the manufacturers and the retailers.
So the Minister is saying that the retailers did not like it, did not have the systems required and could not do things quickly enough—despite the fact that some time has elapsed, as the noble Lord, Lord Bassam, mentioned—so they said, “Not now, Josephine”, basically.
No, the consultation took place with a wide range of civil society and other stakeholders. Mechanisms are in place to update, should it not prove to be as proportionate as we believe it is. The Government are also engaging directly with online marketplaces to explore how they can complement the product security regime and further protect consumers.
On the question of how the regime accounts for the possibility of changing international standards, the instrument references specific versions of ETSI EN 303 645 and ISO/IEC 29147. Were the standards to be updated, the versions cited would remain the applicable ones under the conditions in Regulation 2. Noble Lords should rest assured that any action by the Government to update the standards referenced in the regime would require further parliamentary scrutiny.
Turning to computers, we do not have evidence that including such products in the scope of the regime would significantly reduce security risk. There is a mature anti-virus software market that empowers customers to secure their own devices. Alongside this, mainstream operating system vendors already include security features in their services. The result is that they are not subject to the same level of risk as other consumer devices.
On smart meters and data, the smart metering product market is already regulated through the Gas Act 1986, the Electricity Act 1989 and the Smart Energy Code. Smart metering products are subject to tailored cyber requirements that reflect their specific risk profile. This exception ensures that smart meter products are not subject to double regulation without compromising their security.
I have to confess that my familiarity with some of that legislation is a bit limited, but I was attempting to convey that the full extent of the regulation covering those devices is collectively included in those three instruments. I recognise that that is not a wholly satisfactory answer, so I am very happy to write to the noble Lord. That legislation mandates compliance with the code collectively, which is kept up to date and includes robust modern cyber requirements. The UK already has a robust framework for data protection. While I absolutely agree that it is important, it is not the subject of these regulations.
I would like to return to a matter that I addressed earlier and point out that the cyber resilience Act that the noble Lord mentioned will in fact not, as per the current agreed version of the Windsor Framework, come into effect in Northern Ireland. The point remains that we will monitor its impact on the continent. I beg his pardon for not being clear about that.
Turning to the matters raised by the noble Lord, Lord Bassam, we agree that the challenges posed by inadequate consumer connectable product security require urgent action. However, regulating a sector as heterogeneous as connectable technology, with its diversity of devices, use cases, threat profiles and extant regulation, also requires careful consideration. We feel that we have acted as quickly as was appropriate, and in doing so we acted before any other nation.
On the role of distributors in communicating the defined support period to customers, products made available to consumers in the UK, or those made available to businesses but identical to those made available to consumers, are required to be accompanied by a statement of compliance, which will contain information about the minimum security update period for the product. Retailers are in fact required to ensure that the statement of compliance accompanies their product.
In addition, the SI requires manufacturers to publish information about the minimum security update periods, alongside invitations to purchase the product where certain conditions are met. The Government have no immediate plans to make it mandatory for the distributors of these products to publicise the defined support period. However, we encourage distributors to take this action voluntarily. If the manufacturer fails to publish the defined support period, the enforcement authority can issue notices demanding that the manufacturer make the necessary corrections, or demand that importers or distributors stop selling the product. It can also seize products and recall them from end users.
We will of course be monitoring the effectiveness of the product security regime when it comes into effect. If evidence emerges suggesting that further action to ensure the availability of the defined support period at points of purchase would be appropriate to enhance and protect the security of products and their users, the PSTI product security regime empowers Ministers to take such action.
In conclusion, I hope noble Lords will recognise the benefits that this regime will bring to the UK public and its ground-breaking influence on the world stage.
Before the Minister sits down, I wonder whether he could return to his notes on the Cyber Resilience Act. I heard what he said but it may have been a slip of the tongue, because he said that it has not yet come into effect but that we will monitor its impact on the continent. I think—at least, I assume—that he meant we will monitor its impact when it comes into effect in Northern Ireland. It will inevitably come into effect in Northern Ireland, will it not?
Perhaps the Minister could write to me or to us. The fact, as I understand it, is that the Act is a piece of EU legislation that is going to come into effect across the EU under the Windsor agreement and the TCA. Northern Ireland is subject to EU legislation of that kind; it will therefore come into effect in Northern Ireland and we will be able to monitor its impact there. So, it is not just a question of monitoring its impact on the continent. We have a homegrown example of how it will be implemented—a test bed.
(1 year, 4 months ago)
Lords Chamber
My Lords, I remind the House of my relevant interests in the register. We are all indebted to the noble Lord, Lord Ravensdale, for initiating this very timely debate and for inspiring such a thought-provoking and informed set of speeches today. The narrative around AI swirls backwards and forwards in this age of generative AI, to an even greater degree than when our AI Select Committee conducted its inquiry in 2017-18—it is very good to see a number of members of that committee here today. For instance, in March more than 1,000 technologists called for a moratorium on AI development. This month, another 1,000 technologists said that AI is a force for good. As the noble Lord, Lord Giddens, said, we need to separate the hype from the reality to an even greater extent.
Our Prime Minister seems to oscillate between various narratives. One month we have an AI governance White Paper suggesting an almost entirely voluntary approach to regulation, and then shortly thereafter he talks about AI as an existential risk. He wants the UK to be a global hub for AI and a world leader in AI safety, with a summit later this year, which a number of noble Lords discussed.
I will not dwell too much on the definition of AI. The fact is that the EU and OECD definitions are now widely accepted, as is the latter’s classification framework, but I very much liked what the noble and right reverend Lord, Lord Chartres, said about our need to decide whether it is tool, partner or competitor. We heard today of the many opportunities AI presents to transform many aspects of people’s lives for the better, from healthcare—mentioned by the noble Lords, Lord Kakkar and Lord Freyberg, in particular—to scientific research, education, trade, agriculture and meeting many of the sustainable development goals. There may be gains in productivity, as the noble Lord, Lord Londesborough, postulated, or in the detection of crime, as the noble Viscount, Lord Waverley, said.
However, AI also clearly presents major risks, especially reflecting and exacerbating social prejudices and bias, the misuse of personal data and undermining the right to privacy, such as in the use of live facial recognition technology. We have the spreading of misinformation, the so-called hallucinations of large language models and the creation of deepfakes and hyper-realistic sexual abuse imagery, as the NSPCC has highlighted, all potentially exacerbated by new open-source large language models that are coming. We have a Select Committee, as we heard today from the noble Lord, Lord Browne, and the noble and gallant Lord, Lord Houghton, looking at the dilemmas posed by lethal autonomous weapons. As the noble Lord, Lord Anderson, said, we have major threats to national security. The noble Lord, Lord Rees, interestingly mentioned the question of overdependence on artificial intelligence—a rather new but very clearly present risk for the future.
We heard from the noble Baroness, Lady Primarolo, that we must have an approach to AI that augments jobs as far as possible and equips people with the skills they need, whether to use new technology or to create it. We should go further, with a massive skills and upskilling agenda and much greater diversity and inclusion in the AI workforce. We must enable innovators and entrepreneurs to experiment, while taking on concentrations of power, as the noble Baroness, Lady Stowell, and the noble Lords, Lord Rees and Lord Londesborough, emphasised. We must make sure that those concentrations do not stifle competition, limit choice for consumers or hamper progress. We need to tackle the issues of access to semiconductors, computing power and the datasets necessary to develop large language generative AI models, as the noble Lords, Lord Ravensdale, Lord Bilimoria and Lord Watson, mentioned.
However, the key and most pressing challenge is to build public trust, as we heard from so many noble Lords, and ensure that new technology is developed and deployed ethically, so that it respects people’s fundamental rights, including the rights to privacy and non-discrimination, and so that it enhances rather than substitutes for human creativity and endeavour. Explainability is key, as the noble Lord, Lord Holmes, said. I entirely agree with the right reverend Prelate that we need to make sure that we adopt these high-level ethical principles, but I do not believe that is enough. A long gestation period of national AI policy-making has ended up producing a minimal proposal for:
“A pro-innovation approach to AI regulation”,
which, in substance, will amount to toothless exhortation by sectoral regulators to follow ethical principles and a complete failure to regulate AI development where there is no regulator.
Much of the White Paper’s diagnosis of the risks and opportunities of AI is correct. It emphasises the need for public trust and sets out the attendant risks, but the actual governance prescription falls far short and says nothing about how the benefits of AI should be distributed. There is no recognition that different forms of AI are technologies that need a comprehensive cross-sectoral approach to ensure that they are transparent, explainable, accurate and free of bias, whether they are in a regulated or an unregulated sector. Business needs clear central co-ordination and oversight, not a patchwork of regulation. Existing coverage by legal duties is very patchy: bias may be covered by the Equality Act and data issues by our data protection laws but, for example, there is no existing obligation for ethics by design covering transparency, explainability and accountability, and liability for the performance of AI systems is very unclear.
We need to be clear, above all, as organisations such as techUK are, that regulation is not necessarily the enemy of innovation. In fact, it can be the stimulus and the key to gaining and retaining public trust around AI and its adoption, so that we can realise the benefits and minimise the risks. What I believe is needed is risk-based, cross-sectoral regulation, combined with specific regulation in sectors such as financial services, underpinned by common, trustworthy standards of testing, risk and impact assessment, audit and monitoring. We need, as far as possible, to ensure international convergence, as we heard from the noble Lord, Lord Rees, and interoperability of these standards for AI systems, and to move towards common IP treatment of AI products.
We have world-beating AI researchers and developers. We need to support their international contribution, not fool them that they can operate in isolation. If they have any international ambitions, they will have to decide to conform to EU requirements under the forthcoming AI legislation and ensure that they avoid liability in the US by adopting the AI risk management standards being set by the National Institute of Standards and Technology. Can the Minister tell us what the next steps will be, following the White Paper? When will the global summit be held? What is the AI task force designed to do and how? Does he agree that international convergence on standards is necessary and achievable? Does he agree that we need to regulate before the advent of artificial general intelligence, as a number of noble Lords, such as the noble Lords, Lord Fairfax and Lord Watson, and the noble Viscount, Lord Colville, suggested?
As for the creative industries, there are clearly great opportunities in relation to the use of AI. Many sectors already use the technology in a variety of ways to enhance their creativity and make it easier for the public to discover new content, in the manner described by the noble Lord, Lord Watson.
But there are also big questions over authorship and intellectual property, and many artists feel threatened. Responsible AI developers seek to license content which will bring in valuable income. However, as the noble Earl, Lord Devon, said, many of the large language model developers seem to believe that they do not need to seek permission to ingest content. What discussion has the Minister, or other Ministers, had with these large language model firms in relation to their responsibilities for copyright law? Can he also make a clear statement that the UK Government believe that the ingestion of content requires permission from rights holders, and that, should permission be granted, licences should be sought and paid for? Will he also be able to update us on the code of practice process in relation to text and data-mining licensing, following the Government’s decision to shelve changes to the exemption and the consultation that the Intellectual Property Office has been undertaking?
There are many other issues relating to performing rights and the copying of actors’, musicians’, artists’ and other creators’ images, voices, likenesses, styles and attributes. These are at the root of the Hollywood actors’ and screenwriters’ strike, as well as campaigns here from the Writers’ Guild of Great Britain and from Equity. We need to ensure that creators and artists derive the full benefit of technology, such as AI-made performance synthesisation and streaming. I very much hope that the Minister can comment on that as well.
We have only scratched the surface in tackling the AI governance issues in this excellent debate, but I hope that the Minister’s reply can assure us that the Government are moving forward at pace on this and will ensure that a debate of the kind that the noble Lord, Lord Giddens, has called for goes forward.
(1 year, 4 months ago)
Lords Chamber
My Lords, I add my thanks to the Minister for moving these amendments from the Commons. He has shown remarkable consistency with the words of his honourable friend Mr Scully in the Commons—I think it is word for word what he said, so that is excellent. I see other members of the committee here; I am only sorry that the noble and learned Lord, Lord Thomas of Cwmgiedd, is not here to see the final process and see this legislation go forward.
I welcome these amendments, because it means that the legislation will cover the whole United Kingdom, and that the exception power in Clause 5 will operate across the UK. Could the Minister say whether anything is in contemplation under Clause 5 to be excepted in using that power across the UK?
I very much agree with what the noble Lord, Lord Bassam, said about a plan for implementation. This is a much more important Bill than it appears at first sight, and we should really speed it on its way in implementation terms.
My Lords, I rise briefly to support the amendments as set out. In doing so, I declare my technology interests as set out in the register.
This is the most important Bill that no one has ever heard of. It demonstrates what we can do when we combine the potential of these new technologies with the great good fortune of common law that we have in this country. I particularly support the comments made by the noble Lords, Lord Bassam and Lord Clement-Jones, about the Government’s plan for implementation. Although it is obviously critical that we get Royal Assent to this Bill as soon as possible, that is really where the work begins. As my noble friend the Minister knows, the Bill is rightly permissive in nature; it cannot be that, having done all the work through both Houses of Parliament, the Bill is then just left on the shelf. There needs to be an active plan for implementation, communicating to all the sectors and all the organisations, institutions and brilliant businesses in this space to seize the opportunity that comes from electronic trade documents. Does my noble friend the Minister agree—and will he fill out some more detail on what that implementation plan is?
(1 year, 4 months ago)
Lords Chamber
The unit is established within the Department for Science, Innovation and Technology. Its existence and mission, and indeed the legal basis for its activities, are posted on GOV.UK. Because the great majority of its activities are now directed at overseas state actors hostile to our interests, we do not share in a public forum any operational details pertaining to its activity, simply for fear of giving an advantage to our overseas adversaries. However, I recognise the importance and seriousness of the question. To that end, while I cannot in a public forum provide operational details, if the noble Lord or any other noble Lords would like an operational briefing, I would be happy to arrange that.
My Lords, the CDU outsources its surveillance activities to opaque companies such as Logically and Faculty. It does not respond to Freedom of Information Act requests. Its budget is not public. Is it not quite unacceptable that there is no parliamentary oversight by any Select Committee, and is the place for that not the Intelligence and Security Committee?
I am delighted to reassure the noble Lord that it is subject to parliamentary oversight. The DSIT Secretary of State is accountable to Parliament, and indeed to the relevant parliamentary Select Committee.
(1 year, 5 months ago)
Lords Chamber
The debate on this matter in Committee on the Online Safety Bill was well attended and certainly well received. The purpose of the Online Safety Bill is to intervene between the platforms on which the distressing images are published and the users of those platforms. It is concerned, first, with human beings and, secondly, with their experiences online. The appalling instances that the noble Baroness referenced, particularly in the BBC documentary, would themselves be covered by either the Animal Welfare Act or the Communications Act, both of which make those criminal offences without the need for recourse to the Online Safety Bill.
My Lords, these offences are bad enough by themselves, but does the Minister accept that there is a direct connection between animal cruelty and violence towards humans? If so, is this not yet another reason why the Government should use the Online Safety Bill to combat animal cruelty offences and make this a priority offence under the Bill?
I join the whole House in absolutely deploring these behaviours. The concern about adding animal cruelty offences to the Online Safety Bill is that it is a Bill built around the experiences online of human beings. To rearchitect the Bill around actions perpetrated or commissioned on animals runs the risk of diminishing the effectiveness of the Bill.