My Lords, I thank the noble Lord, Lord Foster, for introducing this debate and everyone who contributed. Clearly, several of the amendments that we discussed earlier in the week have been touched on in one form or another in today’s debate. The fact that those amendments were voted through demonstrates the intensity of noble Lords’ passion for and interest in this topic; of course, that is recognised. I acknowledge clearly, because I was asked this question, that I recognise the importance of these issues and I absolutely understand the concerns of the creative industries and, as the noble Lord, Lord Black, mentioned, the media sector.
In some ways, what we have discussed today speaks directly to the question of whether we need a consultation. On 17 December, we published a consultation that seeks to deliver a competitive copyright regime and a package of measures that support our creative industries and the AI sector. I do not want to sound like a broken record, but the proposals aim to deliver three objectives, and I agree with the way the noble Viscount, Lord Camrose, framed them. The three objectives that we have put forward are: transparency about the use of copyrighted works to train AI models and about AI-generated content; greater control for rights holders over their material, so that they can be supported in protecting it and can be remunerated where it is used—again, I say that the aim here is quite the opposite of theft: it is to give more control—and enhanced lawful access to the material used to train world-leading AI models.
I reiterate what I said on Tuesday: this is a genuine consultation, and many people from a range of sectors are engaging to share their views and evidence. The Government continue to believe that it is important that we have the benefit of that public consultation before we act. A central issue that the noble Lord, Lord Foster, set out in his Question is how to make sure that rights holders can easily reserve their rights and control the use of their material. These are the challenges that rights holders face today. Although they may have copyright on their work, they are often unable in practice to control how it is used or to gain remuneration. This is often particularly true for new or solo artists, the very people we need to protect, a point that the noble Lord, Lord Holmes, and others made.
The rights reservation model proposed in the consultation aims to enhance rights holders’ ability to withdraw their content from being used. It would support their ability to license this content for use with AI if they wish to do that. To do this, we will need the right blend of technology and regulation, and the consultation seeks views on how this should be achieved. Importantly—many noble Lords raised this point—this model would have to be simple, effective and accessible for rights holders of all sizes, something that, frankly, is not available in the current position. The Government have been clear that we will not proceed with this model unless we are confident that these criteria will be met.
On transparency, we want to consider how to achieve this broadly, ensuring that rights holders understand how and where their content is used, while also ensuring any measures are not disproportionate for small businesses and individuals.
On our third objective, access, for all the reasons that the noble Viscount, Lord Camrose, gave, we want to ensure that there is a system in place that allows AI developers to access the high-quality material they need to train world-leading models in the UK. We want that access to be without uncertainty and without legal liability slowing down investment and adoption.
These are undoubtedly complex issues, and we need to strike the right balance to ensure that we are able fully to benefit from AI and guarantee the success of our world-leading creative industries. This is why we are asking about all these elements in the consultation.
The question asked by the noble Lord, Lord Foster, raises important issues about the impacts on creators and our assessment of these impacts. This was also something mentioned in the debate earlier in the week. I reassure noble Lords that gathering further economic impact evidence is one of the main reasons for conducting a full consultation, but it is also worth pointing out that, alongside the consultation paper, we published a 22-page summary options assessment setting out our initial analysis of the proposals that we have put forward, so it is not correct that there has been no options impact appraisal. This options assessment received a green rating from the independent Regulatory Policy Committee. It recognises, however, that quantitative evidence is currently limited in this area and highlights areas where the Government hope to receive further data during the course of the consultation.
The options assessment sets out the expected impacts of different options and assesses them against those three objectives in the consultation: control, access and transparency. The assessment does not provide detailed data on economic impact, as publicly available evidence in this area is currently rather limited. It is important that we let the consultation run its course so that we can gather evidence of impacts on the full range of affected parties. We are particularly keen for respondents to the consultation to provide further economic evidence to inform how we achieve our objectives. To answer the question from the noble Lord, Lord Clement-Jones, at least in part—though perhaps not enough to prompt singing in the streets—depending on the evidence we receive through the consultation, we will revise, update and expand the assessment of the options and better determine how we move forward with any potential legislative change. Acting without this would risk imposing legislation that does not have the intended effects.
Alongside our analysis, the Government of course continue to consider a broad range of external studies to assess AI’s economic impact. Modelling the potential economic impact of AI is complicated, and there are several external studies on this. We know that it is complicated, as we have seen just this week with the entry of DeepSeek and how that may change many of the things we think about, but AI adoption has the potential to drive growth across the economy, including, as many noble Lords mentioned, in the creative industries, where more than 38% of creative industry businesses have used AI technologies as of September 2024, with nearly 50% using AI to improve business operations. Earlier this week, I attended the launch of the Institute for the Future of Work’s report into the future of work and well-being, which looks at the impact of AI on work and well-being in all sectors. The Government have considered this external evidence alongside our internal analysis to inform our approach to AI and will continue to do so.
I will now move on to a few other areas. In passing, I agree with the noble Lord, Lord Black, that the question of truth in the age of AI is crucial. We are in an era where this is increasingly difficult; it is the first wave of the AI challenge. It is crucial for everybody in society and, of course, for the media. Technology will play an important part in delivering greater control for rights holders. The Government are clear that any solutions need to be effective, proportionate and accessible to parties of all sizes, and they must be easy to use. Again, I want to reassure noble Lords that we do not intend to go forward with this approach until we are confident that this is the case.
The noble Lord, Lord Foster, asked whether anything is already available. Things are available; they are not good enough yet but coming along very fast. I know from my time as chair of the Natural History Museum, where we looked after vast amounts of data of huge potential value, that we had ways to try to block people getting hold of it. Things are available now but they need to be better; they also need to be simpler and usable by the individual.
The consultation recognises that more detailed work needs to be done, and an important function of the consultation is to help us work through this detail. A number of industry initiatives are already under way to deliver effective standards. As has been mentioned, these standards—international and national—will be crucial. These efforts, combined with careful regulation, will make it possible to deliver workable rights reservation tools, and a reimbursement mechanism that, again, should be easy to operate and should not be available only to the largest players or only through the courts.
As noble Lords have raised it during the passage of the data Bill, I reiterate the central importance of transparency in the way that creative content is used. The use of web crawlers, metadata and watermarks as different forms of technological solutions could have a number of benefits for those who wish to control or license the use of their content with AI and could provide the very basis for a rights reservation tool.
We agree that a key issue to be addressed is the role of some web crawlers that are used to obtain content for AI training. However, it is important to recognise that web crawlers are used for different purposes, the most familiar being indexing online content so that it can be searched with a search engine. Standards on the use of web crawlers may also be important to improve the ability of rightholders to prevent the use of work against their wishes.
I spoke about workability, and several noble Lords made it clear that it must mean workability for the creative sector and creatives, as well as for others. The noble Lord, Lord Foster, asked about the temporary copy issue. We have asked about that in the consultation.
To conclude, I again thank noble Lords for contributing to this debate. They can rest assured that the Government understand the strongly held and legitimate concerns which creators and rightholders have about their content being used. We also agree that transparency is fundamental. However, it would be wrong to commit to specific legislation while the Government’s consultation is ongoing. Indeed, we should and must consider stakeholders’ responses fully and progress our package of objectives together.
We will consider all the points raised by noble Lords today and during the passage of the Bill. We will do this alongside the responses and evidence received as part of the consultation, before bringing further proposals. I end on the specific point raised by the noble Lord, Lord Holmes, on the LAION case, which is under German law. I will ask the IPO to give him a full answer on that.
My Lords, I thank the noble Lord, Lord Bassam of Brighton, for laying this amendment and introducing the debate on it.
As I understand it, a private copying levy is a surcharge on the price of digital content. The idea is that the money raised from the surcharge is either redistributed directly to rights holders to compensate them for any loss suffered because of copies made under the private copying exceptions or contributed straight to other cultural events. I recognise what the noble Lord is seeking to achieve and very much support his intent.
I have two concerns. First—it may be that I have misunderstood it; if so, I would be grateful if the noble Lord would set me straight—it sounds very much like a new tax of some kind is being raised, albeit a very small one. Secondly, those who legitimately pay for digital content end up paying twice. Does this not incentivise more illegal copying?
We all agree how vital it is for those who create products of the mind to be fairly rewarded and incentivised for doing so. We are all concerned by the erosion of copyright or IP caused by both a global internet and increasingly sophisticated AI. Perhaps I could modestly refer the noble Lord to my Amendment 75 on digital watermarking, which I suggest may be a more proportionate means of achieving the same end or at least paving the way towards it. For now, we are unable to support Amendment 57 as drafted.
I thank my noble friend Lord Bassam for his Amendment 57 on the subject of private copying levies. It reinforces a point we discussed earlier about copying being covered by copyright.
The smart fund campaign seeks the introduction of a private copy levy. Such a levy would aim to indirectly compensate copyright owners for the unauthorised private copying of their works—for example, when a person takes a photo of an artwork or makes a copy of a CD—by paying copyright owners when devices capable of making private copies are sold.
Noble Lords may be aware that, in April 2024, the Culture, Media and Sport Committee recommended that the Government introduce a private copying levy similar to that proposed by this amendment. The Government’s response to that recommendation, published on 1 November, committed the Intellectual Property Office to meet with representatives from the creative industries to discuss how to strengthen the evidence base on this issue. That process is under way. I know that a meeting with the smart fund group is planned for next week, and I can confirm that DCMS is included and invited. I know that the IPO would be glad to meet my noble friend, as well as the noble Lord, Lord Freyberg, and the noble Earl, Lord Clancarty, to discuss this further. I also absolutely assure him that Chris Bryant is aware of this important issue and will be following this.
I am sure my noble friend will agree that it is essential that we properly engage and consider the case for intervention before legislating. Therefore, I hope he will be content to withdraw his amendment, to allow the Government the opportunity to properly explore these issues with creative and tech industry stakeholders.
My Lords, I will happily withdraw my amendment. I am delighted to hear of the progress that the Minister has set out. I view his comments as a positive endorsement of the progress made so far.
It is essential that we get more money into the hands of creators, who are an important driving force and part of our economy. It is essential too that we make more funds available for arts generally across the country. This is one way of doing it. The approach was endorsed in a recent Fabian Society publication, Arts For Us All. It identified a number of other potential sources for generating income that could be distributed to the arts and arts organisations.
I commend the Government for taking up the challenge posed by the smart fund and I look forward to playing my part, along with my colleagues on the Cross Benches and others who support this initiative. It could do much to strengthen the funding base for the arts as a cultural sector, which was sadly eroded by the previous Government over the last decade and a half. I beg leave to withdraw my amendment.
I thank the noble Baroness, Lady Kidron, for moving her amendment. The amendments in this group seek to establish a new status for data held in the public interest, and to establish statutory oversight rules for a national data library. I was pleased during Committee to hear confirmation from the noble Baroness, Lady Jones of Whitchurch, that the Government are actively developing their policy on data held in the public interest and developing plans to use our data assets in a trustworthy and ethical way.
We of course agree that we need to get this policy right, and I understand the Government’s desire to continue their policy development. Given that this is an ongoing process, it would be helpful if the Government could give the House an indication of timescales. Can the Minister say when the Government will be in a position to update the House on any plans to introduce a new approach to data held in the public interest? Will the Government bring a statement to this House when plans for a national data library proceed to the next stage?
I suggest that a great deal of public concern about nationally held datasets is a result of uncertainty. The Minister was kind enough to arrange a briefing from his officials yesterday, and this emerged very strongly. There is a great deal of uncertainty about what is being proposed. What are the mechanics? What are the risks? What are the costs? What are the eventual benefits to UK plc? I urge the Minister, as and when he makes such a statement, to bring a maximum of clarity about these fundamental questions, because I suspect that many members of the public will find this deeply reassuring.
Given the stage the Government are at with these plans, we do not think it would be appropriate to legislate at this stage, but we of course reserve the right to revisit this issue in the future.
I am grateful to the noble Baroness, Lady Kidron, and the noble Lord, Lord Tarassenko, for Amendments 58 and 71, one of which we also considered in Committee. I suspect that we are about to enter an area of broad agreement here. This is a very active policy area, and noble Lords are of course asking exactly the right questions of us. They are right to emphasise the need for speed.
I agree that it is essential that we ensure that legal and policy frameworks are fit for purpose for the modern demands and uses of data. This Government have been clear that they want to maximise the societal benefits from public sector data assets. I said in the House very recently that we need to ensure good data collection, high-quality curation and security, interoperability and ways of valuing data that secure appropriate value returns to the public sector.
On Amendment 58, my officials are considering how we approach the increased demand and opportunity of data, not just public sector data but data across our economy. This is so that we can benefit from the productivity and growth gains of improvements to access to data, and harness the opportunities, which are often greater when different datasets are combined. As part of this, we sought public views on this area as part of the industrial strategy consultation last year. We are examining our current approach to data licensing, data valuation and the legal framework that governs data sharing in the public sector.
Given the complexity, we need to do this in a considered manner, but we of course need to move quickly. Crucially, we must not betray the trust of people or the trust of those responsible for managing and safeguarding these precious data assets. From my time as chair of the Natural History Museum, I am aware that museums and galleries are considering approaches to this very carefully. The noble Lord, Lord Lucas, may well be interested to see some of the work going on there on biodiversity datasets, where there are huge collections of great value against which we did in fact put a value.
Of course, this issue cuts across the public sector, involving colleagues from the Geospatial Commission, the NHS, DHSC, the National Archives, the Department for Education, Ordnance Survey and the Met Office, for example. My officials and I are very open to discussing the policy issues with noble Lords. I recently introduced the noble Lord, Lord Tarassenko, to officials from NHSE dealing with the data side of things there and linked him with the national data library to seek his input. As was referred to, yesterday, the noble Baroness, Lady Kidron, the noble Lords, Lord Clement-Jones, Lord Tarassenko and Lord Stevenson, and the noble Viscount, Lord Camrose, all met officials, and we remain open to continuing such in-depth conversations. I hope the noble Baroness appreciates that this is an area of active policy development and a key priority for the Government.
Turning to Amendment 71, also from the noble Baroness, I agree that the national data library represents an enormous opportunity for the United Kingdom to unlock the full value of our public data. I agree that the protection and care of our national data is essential. The scope of the national data library is not yet finalised, so it is not possible to confirm whether a new statutory body or specific statutory functions are the right way to do this. Our approach to the national data library will be guided by the principles of public law and the requirements of the UK’s data protection legislation, including the data protection principles and data subject rights. This will ensure that data sharing is fair and secure and that it preserves privacy. It will also ensure that we have clear mechanisms for both valuation and value capture. We have already sought, and continue to seek, advice from experts on these issues, including work from the Prime Minister’s independent Council for Science and Technology. The noble Lord, Lord Freyberg, also referred to the work that I was involved with previously at the Tony Blair Institute.
The NDL is still in the early stages of development. Establishing it on a statutory footing at this point would be inappropriate, as work on its design is currently under way. We will engage and consult with a broad range of stakeholders on the national data library in due course, including Members of both Houses.
The Government recognise that our data and its underpinning infrastructure is a strategic national asset. Indeed, it is for that reason that we started by designating the data centres as critical national infrastructure. As the subjects of these amendments remain an active area of policy development, I ask the noble Baroness to withdraw her amendment.
I am grateful for a breakout of agreement at this time of night; that is delightful. I agree with everything that the Minister said, but one thing we have not mentioned is the incredible cost of managing the data and the investment required. I support the Government investing to get the value out, as I believe other noble Lords do, and I would just like to put that point on record.
We had a meeting yesterday and thought it was going to be about data assets, but it turned out to be about data communities, which we had debated the week before. Officials said that it was incredibly useful, and it might have been a lot quicker if they had had it earlier. To echo what was said on the amendment of the noble Baroness, Lady Owen, there is considerable interest and expertise here, and I would love to see the Government move faster, possibly with the help of noble Lords. With that, I beg leave to withdraw the amendment.
My Lords, I thank the noble Lord, Lord Holmes, for his amendments on reviews of and consultations on large language models and data centres. First, on Amendment 59, as we have discussed in some detail, the Government are conducting their consultation on copyright and AI. This will consider issues relating to transparency of creative content in both input and output of AI. This would apply not just to large language models but to other forms of AI. Questions on the wider copyright framework are also included in the consultation, including the issue of models trained in other jurisdictions, importation and enforcement provisions.
A review of large language models, as required by this amendment, as well as the consideration of the specific provisions of copyright law, would prejudge the outcome of that consultation. I might even go so far as to say to noble Lords that the consultation and the process around it are, in a sense, the very review that this amendment seeks—or, at the least, that consultation may suggest a range of ways of addressing these important issues that might be more effective than a further review. I also remind noble Lords about the AI Safety Institute, which, of course, has a duty to look at some of the safety issues around these models.
I reassure noble Lords that we welcome those suggestions and will carefully consider which parts of the copyright framework would benefit from amendment. I reiterate that the proposals the Government have put forward on copyright and AI training will not affect the wider application of copyright law. If a model were to output a creator’s work without their permission, rights holders would be able to take action, as they are at present.
On Amendment 60, as the Prime Minister laid out as part of the AI opportunities action plan, this Government intend to secure more data centre capacity and ensure that it is delivered as sustainably as possible. Noble Lords will also have noted the investment targeted towards data centres that followed the investment summit. The Government are committed to ensuring that any negative impact of data centres is, where possible, minimised and that sustainability is considered. The noble Lord may well be aware of the creation of the AI energy council, which will be led by the Secretaries of State for DSIT and DESNZ. That will consider current energy requirements and, of course, future energy needs, including options such as SMRs. The Government recognise the aim of this amendment, but we do not feel this Bill is the place to address this issue. The accompanying notes to the Bill will detail its environmental impacts.
Amendment 66 calls for a consultation on data centre power usage. The UK has committed to decarbonising the electricity system by 2030, subject to security of supply, and data centres will increasingly be powered by renewable energy sources. The first data centre site has been identified as Culham. Why is it there? It is because the UK Atomic Energy Authority has a very large power supply, with some 100 megawatts of electricity available. That will need to increase to something closer to 500 megawatts. How we select other data centre sites will depend on where there is power and where sites can appropriately be located. Noble Lords can expect them to be distributed around the UK. The sector operates under a climate change agreement, to encourage greater uptake of energy-efficiency measures among operators.
Data centres themselves, of course, play a major part in powering the high-tech solutions to environmental challenges, whether that is new tech that increases the efficiency of energy use across towns and cities or development and application of innovative materials and new technologies that take carbon out of the atmosphere. The energy efficiency of data centres themselves is improving with new technologies and will continue to do so. Perhaps that was one of the features of the announcement of DeepSeek—exactly how that might advance rather rapidly. Closed-loop cooling, energy-efficient hardware, heat reuse and hot/cold aisle containment are already having an effect on the energy consumption and output of data centres.
The Government continue to monitor the data centre industry and are aware of the environmental impacts of data centres. I hope that, in the light of the points I raised, the noble Lord will be content not to press his amendments.
I thank everyone who took part in this short debate, in particular the Minister for that full, clear and helpful answer. In a spirit of throwing roses at this stage of the evening, I congratulate him and the Government on the quick identification and implementation of Culham as the first site for one of these centres. It makes complete sense—as he says, the power already exists there. I urge the Government to move with such speed for the remaining five of the first six sites. It makes complete sense to move at speed to identify these resources and the wider benefits they can bring to the communities where they will be located. For now, I am content to withdraw the amendment.
Amendment 67, tabled by the noble Lord, Lord Lucas, would require terms relating to personal attributes to be defined consistently across government data. The Government believe that public sector data should continue to be collected based on user needs for data and any applicable legislation, but I fully recognise the need for standards and consistency in data required for research and evaluation. Harmonisation creates more meaningful statistics that allow users to better understand a topic. It is also an important part of the code of practice for statistics; the code recommends using harmonised standards unless there is a good reason not to.
As I set out in last week’s debate, the Government believe that data accuracy is essential to deliver services that meet citizens’ needs and ensure accurate evaluation and research as a result. I will set out to the noble Lord some work that is ongoing in this space. The Office for Statistics Regulation published guidance on collecting and reporting data about sex and gender identity in February 2024, and the Government Statistical Service published a work plan for updated harmonised standards and guidance on sex and gender identity in December 2024, which will take into account the need for accurate metadata. The Sullivan review explores these issues in detail and should be published shortly; it will be taken into account as the work progresses. In addition, the Government Digital Service has started work on developing data standards on key entities and their attributes to ensure that the way data is organised, stored and shared is consistent between public authorities.
This work has been commenced via the domain expert group on the “person” entity, which has representation from organisations including the Home Office, HMRC, the Office for National Statistics, NHS England, the Department for Education, the Ministry of Justice, the Local Government Association and the Police Digital Service. The group has been established as a pilot under the Data Standards Authority to help ensure consistency across organisations.
As I said last week, it is the Government’s belief that these matters are crucial and need to be considered carefully, but are more appropriately considered holistically outside this Bill. The intention of this Bill is not to define or remark on the specific definitions of sex or gender, or other aspects of data definition. It is, of course, to make sure that the data that is collected can be made available, and I reiterate my point that the data needs to be based in truth, consistent and clear. Work is going on to make these new regulations and approaches absolutely clear. As such, I urge the noble Lord to consider withdrawing his amendment.
My Lords, I am very grateful to the Minister for that explanation. I am particularly glad to know that the Sullivan review will be published soon—I look forward very much to reading that—and I am pleased by the direction the Government are moving in. None the less, we only get a Bill every now and again. I do think we need to give the Government the powers that this amendment offers. I would hate noble Lords opposite to feel that they had stayed here this late to no purpose, so I beg leave to test the opinion of the House.
I thank the noble Baroness, Lady Kidron, for her amendments. The reliability of computer-based evidence, needless to say, has come into powerful public focus following the Post Office Horizon scandal and the postmasters’ subsequent fight for justice. As the noble Baroness has said previously and indeed tonight, this goes far beyond the Horizon scandal. We accept that there is an issue with the way in which the presumption that computer evidence is reliable is applied in legal proceedings.
The Government accepted in Committee that this is an issue. While we have concerns about the way that the noble Baroness’s amendment is drafted, we hope the Minister will take the opportunity today to set out clearly the work that the Government are doing in this area. In particular, we welcome the Government’s recently opened call for evidence, and we hope Ministers will work quickly to address this issue.
Amendment 68 from the noble Baroness, Lady Kidron, aims to prevent future miscarriages of justice, such as the appalling Horizon scandal. I thank the noble Baroness and, of course, the noble Lord, Lord Arbuthnot, for the commitment to ensuring that this important issue is debated. The Government absolutely recognise that the law in this area needs to be reviewed. Noble Lords will of course be aware that any changes to the legal position would have significant ramifications for the whole justice system and are well beyond the scope of this Bill.
I am glad to be able to update the noble Baroness on this topic since Committee. On 21 January the Ministry of Justice launched a call for evidence on this subject. That will close on 15 April, and next steps will be set out immediately afterwards. That will ensure that any changes to the law are informed by expert evidence. I take the point that there is a lot of evidence already available, but input is also needed to address the concerns of the Serious Fraud Office and the Crown Prosecution Service, and I am sure they will consider the important issues raised in this amendment.
I hope the noble Baroness appreciates the steps that the Ministry of Justice has taken on this issue. The MoJ will certainly be willing to meet any noble Lords that wish to do so. As such, I hope she feels content to withdraw the amendment.
The Minister did not quite address my point that the consultation is not broad enough in scope, but I will accept the offer of a meeting. Although the noble Lord, Lord Arbuthnot, spoke very briefly, he is my partner in crime on this issue; indeed, he is a great campaigner for the postmasters and has done very much. So I say to the Minister: yes, I will have the meeting, but could it happen this time? With that, I beg leave to withdraw the amendment.
My Lords, I move Amendment 73 standing in my name, which would require the Secretary of State to undertake a risk assessment of the data privacy risks associated with genomics and DNA companies that are headquartered in countries which the Government determine to be systemic competitors and hostile actors. The UK is a world leader in genomics research, and this is a growing sector that makes an important contribution. The opportunities in genomics are enormous and we should take the steps needed to protect the UK’s leading role here.
I was pleased to hear from the noble Baroness, Lady Jones of Whitchurch, in Committee that:
“the Government have continued the important work of the UK Biological Security Strategy of 2023, including by conducting a full risk assessment and providing updated guidance to reduce the risks from the misuse of sensitive data”.
The Minister also gave the undertaking that the Government would
“brief the Joint Committee on the National Security Strategy on the findings of the risk assessment in the new year”.—[Official Report, 18/12/24; col. GC 124.]
I would be very grateful if the Minister could confirm whether the Joint Committee has been briefed and, if not, when that will happen.
I look forward to continuing to engage with Ministers on the issue of data security in the face of growing threats from international competitors and hostile actors.
I thank the noble Viscount, Lord Camrose, for giving me an opportunity to speak for 45 minutes on genomics, which I know everyone will be very grateful for. I shall resist that temptation and thank him for the amendment on security in genomic data.
As he is aware, the UK is a world leader in genomics, and its various datasets and studies have contributed to health globally. I also note that the UK Biological Security Strategy of 2023 has been endorsed by this Government and a variety of measures are under active consideration. I recognise the noble Viscount’s desire for quick movement on the issue and agree with him that this is of great importance. I reassure him that my officials are working at speed across government on this very issue. I would be very happy to brief him and other noble Lords present today on the findings of the risk assessment in due course. We have not yet engaged with the Joint Committee on the National Security Strategy but will do so shortly, as per standard practice.
I hope that the noble Viscount will appreciate that this work is live and will grant a little patience on this issue. I look forward to engaging with him soon on this but, in the meantime, I would be grateful if he would withdraw his amendment.
I thank the Minister for his clear response and for taking pity on the House and not giving us the full benefit of his knowledge of genomics. Meanwhile, I recognise that we have to move with deliberateness here and not rush into the wrong solution. I gratefully accept his offer of further briefings and beg leave to withdraw my amendment.
It is indeed getting late. I thank the noble Lord, Lord Clement-Jones, for moving his amendment, and I really will be brief.
We do not oppose the government amendment in the name of the noble Lord, Lord Vallance. I think the Minister should be able to address the concerns raised by the noble Lord, Lord Clement-Jones, given that the noble Lord’s amendment merely seeks clarification on the retrospective application of the provisions of the Bill within a month of the coming into force of the Act. It seems that the Government could make this change unnecessary by clarifying the position today. I hope the Minister will be able to address this in his remarks.
I will speak first to Amendment 76. I reassure noble Lords that the Government do not believe that this amendment has a material policy effect. Instead, it simply corrects the drafting of the Bill and ensures that an interpretation provision in Clause 66 commences on Royal Assent.
Amendment 74, in the name of the noble Lord, Lord Clement-Jones, would require the Secretary of State to publish a statement setting out whether any provisions in the Bill apply to controllers and processors retrospectively. Generally, provisions in Bills apply from the date of commencement unless there are strong policy or legal reasons for applying them retrospectively. The provisions in this Bill follow that general rule. For instance, data controllers will only be able to rely on the new lawful ground of recognised legitimate interests introduced by Clause 70 in respect of new processing activities in relation to personal data that take place after the date of commencement.
I recognise that noble Lords might have questions as to whether any of the Bill’s clauses can apply to personal data that is already held. That is the natural intent in some areas and, where appropriate, commencement regulations will provide further clarity. The Government intend to publish their plans for commencement on GOV.UK in due course and the ICO will also be updating its regulatory guidance in several key areas to help organisations prepare. We recognise that there can be complex lifecycles around the use of personal data and we will aim to ensure that how and when any new provisions can be relied on is made clear as part of the implementation process.
I hope that explanation goes some way to reassuring the noble Lord and that he will agree to withdraw his amendment.
My Lords, I thank the Minister. There is clearly no easy answer. I think we were part-expecting a rather binary answer, but clearly there is not one, so we look forward to the guidance.
But that is a bit worrying for those who have to tackle these issues. I am thinking of the data protection officers who are going to grapple with the Bill in its new form and I suspect that that is going to be quite a task. In the meantime, I withdraw the amendment.
My Lords, I thank my noble friend Lord Holmes of Richmond for moving this amendment. I am sure we can all agree that the ICO should encourage and accommodate innovation. As I noted during the first day on Report, in a world where trade and business are ever more reliant on cross-border data transfers, data adequacy becomes ever more important.
In Committee, the noble Baroness, Lady Jones of Whitchurch, was able to give the House the reassurance that this Bill was designed with EU adequacy in mind. We were pleased to hear that the Government’s course of action is not expected to put this at risk. I also suggest that this Bill represents even less of a departure from GDPR than did its predecessor, the DPDI Bill.
We welcome the Government’s assurances, but we look to them to address the issues raised by my noble friend Lord Holmes. I think we can all agree that he has engaged constructively and thoughtfully on this Bill throughout.
I thank the noble Lord, Lord Holmes, for his Amendment 38 relating to the ICO’s innovation duty. I agree with his comments about the quality of our regulators.
I reiterate the statements made throughout the Bill debates that the Government are committed to the ongoing independence of the ICO as a regulator and have designed the proposals in the Bill with retaining EU adequacy in mind. The commissioner’s status as an independent supervisory authority for data protection is assured. The Information Commissioner has discretion over the application of his new duties. It will be for him to set out and justify his activities in relation to those duties to Parliament.
To answer the specific point, as well as that raised by the noble Lord, Lord Clement-Jones, considerations of innovation will not come at the expense of the commissioner’s primary objective to secure an appropriate level of protection for personal data. I hope that reassures the noble Lord.
I thank all noble Lords who have taken part in this short debate and thank the Minister for his response. I believe my wording would assist the ICO in its mission, but I have listened to what the Minister has said and, for the time being, I beg leave to withdraw the amendment.
My Lords, I thank the noble Baroness, Lady Kidron, for moving her amendment. Before I begin, let me declare my interest as a recently appointed director of Lumi, an edtech provider—but for graduates, not for schools.
AI has the potential to revolutionise educational tools, helping teachers spend less time on marking and more time on face-to-face teaching with children, creating more innovative teaching tools and exercises and facilitating more detailed feedback for students. AI presents a real opportunity to improve education outcomes for children, opening more opportunities throughout their lives. There are deeply compelling promises in edtech.
However—there is always a however when we talk about edtech—creating and using AI education tools will require the collection and processing of children’s personal data. This potentially includes special category data—for instance, medical information pertaining to special educational needs such as dyslexia. Therefore, care must be taken in regulating how this data is collected, stored, processed and used. Without this, AI poses a major safeguarding risk. We share the concerns of the noble Baroness, Lady Kidron, and wholeheartedly support the spirit of her amendment.
We agree that it is prudent to require the ICO to make a code of practice on children’s data and education, and I particularly welcome a requirement on the ICO to consult with and involve parents. Parents know their children best, needless to say, and have their best interests at heart; their input will be critical in building trust in AI-assisted educational tools and facilitating their rollout and benefits for children throughout the UK.
However, as I said earlier on Report—and I shall not repeat the arguments now—we have concerns about the incorporation of international law into our law, and specifically, in this instance, the UN Convention on the Rights of the Child. We cannot therefore support the amendment as drafted. That said, we hope very much that the Government will listen carefully to the arguments raised here and take steps to introduce appropriate safeguards for children and young people in our data legislation regime. I suspect that most parents will greatly welcome more reassurance about the use of their children’s data.
I thank the noble Baroness, Lady Kidron, for raising this important topic today, and thank noble Lords for the impassioned speeches that we have heard. As my noble friend Lady Jones mentioned in Committee, the ICO has been auditing the practices of several edtech service providers and is due to publish its findings later this year. I am pleased to be able to give the noble Baroness, Lady Kidron, a firm commitment today that the Government will use powers under the Data Protection Act 2018 to require the ICO to publish a new code of practice addressing edtech issues.
The noble Baronesses, Lady Kidron and Lady Harding, both raised important points about the specificity, and I will try to address some of those. I am grateful to the noble Baroness for her suggestions about what the code should include. We agree that the starting point for the new code should be that children merit special protection in relation to their personal data because they may be less aware of the risks and their rights in relation to its processing. We agree that the code should include guidance for schools on how to comply with their controller duties in respect of edtech services, and guidance for edtech services on fulfilling their duties under the data protection framework—whether as processors, controllers or joint controllers. We also agree that the code should provide practical guidance for organisations on how to comply with their “data protection by design and by default” duties. This would help to ensure that appropriate technical and organisational measures are implemented in the development and operation of processing activities undertaken by edtech services.
The noble Baroness suggested that the new code should include requirements for the ICO to develop the code in consultation with children, parents, educators, children’s rights advocates, devolved Governments and industry. The commissioner must already consult trade associations, data subjects and persons who appear to the commissioner to represent the interests of data subjects before preparing a code, but these are very helpful suggestions. The development of any new code will also follow the new procedures introduced by Clause 92 of this Bill. The commissioner would be required to convene an expert panel to inform the development of the code and publish the draft code. Organisations and individuals affected by the code would be represented on the panel, and the commissioner would be required to consider its recommendations before publishing the code.
Beyond this, we do not want to pre-determine the outcome of the ICO’s audits by setting out the scope of the code on the face of the Bill now. The audits might uncover new areas where guidance is needed. Ensuring a clear scope for a code, grounded in evidence, will be important. We believe that allowing the ICO to complete its audits, so that the findings can inform the breadth and focus of the code, is appropriate.
The ICO will also need to carefully consider how its codes interrelate. For example, the noble Baroness suggested that the edtech code should cover edtech services that are used independently by children at home and the use of profiling to make predictions about a child’s attainment. Such processing activities may also fall within the scope of the age-appropriate design code and the proposed AI code, respectively. We need to give the ICO the flexibility to prepare guidance for organisations in a way that avoids duplication. Fully understanding the problems uncovered by the ICO audits will be essential to getting the scope and content of each code right and reducing the risk of unintended consequences.
To complement any recommendations that come from the ICO and its audits, the Department for Education will continue to work with educators and parents to help them to make informed choices about the products and services that they choose to support teaching and learning. The noble Baroness’s suggestion that there should be a certification scheme for approved edtech service providers is an interesting one that we will discuss with colleagues in the Department for Education. However, there might be other solutions that could help schools to make safe procurement decisions, and it would not be appropriate to use the ICO code to mandate a specific approach.
The point about schools and the use of work by children is clearly important; our measures are intended to increase the protections for children, not to reduce them. The Government will continue to work closely with noble Lords, the Department for Education, the ICO and the devolved regions as we develop the necessary regulations following the conclusion of the ICO audit. I hope that the noble Baroness is pleased with this commitment and as such feels content to withdraw her amendment.
May I ask for a commitment from the Dispatch Box that, when the order is complete and some of those conversations are being discussed, we can have a meeting with the ICO, the DfE and noble Lords who have fought for this since 2018?
I am very happy to give that commitment. That would be an important and useful meeting.
I thank the Minister and the Government. As I have just said, we have been fighting for this since 2018, so that is quite something. I forgot to say in my opening remarks that edtech does not, of course, have an absolute definition. However, in my mind—it is important for me to say this to the House—it includes management, safety and tech that is used for educational purposes. All those are in schools, and we have evidence of problems with all of them. I was absolutely delighted to hear the Government’s commitments, and I look forward to working with the ICO and the department. With that, I beg leave to withdraw.
I thank the noble Baroness, Lady Kidron, for moving this incredibly important group and all those speakers who have made the arguments so clearly and powerfully. I pay tribute to noble Baroness’s work on copyright and AI, which is so important for our arts and culture sector. As noble Lords have rightly said, our cultural industries make an enormous contribution to our country, not just in cultural terms but in economic ones, and we must ensure that our laws do not put that future at risk.
In the build-up to this debate I engaged with great pleasure with the noble Baroness, Lady Kidron, and on these Benches we are sympathetic to her arguments. Her Amendment 61 would require the Government to make regulations in this area. We accept the Government’s assurance that this is something they will seek to address, and I note the Minister’s confirmation that their consultation will form the basis of the Government’s approach to this issue. Given the importance of getting this right, our view is that, with the Government’s consultation in mid-flight, we have to allow it to do its work. Whatever view we take of the design and the timing of the consultation, it offers for now a way forward that will evidence some of the serious concerns expressed here. That said, we will take a great interest in the progress and outcomes of the consultation and will come back to this in future should the Government’s approach prove unsatisfactory.
Amendment 75 in my name also seeks to address the challenge that the growth in AI poses to our cultural industries. One of the key challenges in copyright and AI is enforceability. Copyright can be enforced only when we know it has been infringed. The size and the international distribution of AI training models render it extremely challenging to answer two fundamental questions today: first, was a given piece of content used in a training model and secondly, if so, in what jurisdiction did that use take place? If we cannot answer these questions, enforcement can become extremely hard, so a necessary, if not sufficient, part of the solution will be a digital watermark—a means of putting some red dye in the water where copyrighted material is used to train AIs. It could also potentially provide an automated means for content creators to opt out, with a vastly more manageable administrative burden.
I thank the Minister for his constructive engagement on digital watermarking and look to him to give the House an assurance that the Government will bring forward a plan to develop a technological standard for a machine-readable digital watermark. I hope that, if and when he does so, he is able to indicate both a timeline and an intention to engage internationally. Subject to receiving such reassurances when he rises, I shall not move my amendment.
I congratulate the noble Baroness, Lady Kidron, on her excellent speech. I know that she feels very strongly about this topic and the creative industries, as do I, but I also recognise what she said about junior Ministers. I have heard the many noble Lords who have spoken, and I hope they will forgive me if I do not mention everyone by name.
It is vital that we get this right. We need to give creators better, easier and practical control over their rights, allow appropriate access to training material by AI firms and, most importantly, ensure there is real transparency in the system, something that is currently lacking. We need to do this so that we can guarantee the continued success of our creative industries and fully benefit from what AI will bring.
I want to make it clear, as others have, that these two sectors are not mutually exclusive; it is not a case of picking sides. Many in the creative industries are themselves users or developers of AI technology. We want to ensure that the benefits of this powerful new technology are shared, which was a point made by the noble Baroness, Lady Stowell, and her committee.
It is obvious that these are complex issues. We know that the current situation is unsatisfactory in practice for the creative industries and the AI sector. That is why we have launched a detailed consultation on what package of measures can be developed to benefit both the creative industries and the AI sector. This is a genuine consultation. Many people from a range of sectors are engaging with us to share their views and evidence. It is important, and indeed essential, that we fully consider all responses provided in the consultation before we act. Not to do so would be a disservice to all those who are providing important input and would narrow our chance to get the right solution.
I agree wholeheartedly with the noble Baroness and many other noble Lords, including the noble Lord, Lord Freyberg, on the importance of transparency about the creative content used to train AI. Transparency, about both inputs and outputs, is a key objective in the Government’s consultation on copyright and AI. This very ability to provide transparency is at the centre of what is required. The consultation also contains two other vital objectives alongside transparency: practical and clear control and reward for rights holders over the use of their work. This is quite the opposite of the notion of giving away their hard work or theft. It is about increasing their control and ensuring access to data for AI training.
The Government certainly agree with the spirit of the amendments on transparency and web crawlers and the aims they are trying to achieve—that creators should have more clarity over which web crawlers can access their works and be able to block them if they wish, and that they should be able to know what has been used and by whom and have mechanisms to be appropriately reimbursed. However, it would be premature to commit to very specific solutions at this stage of the consideration of the consultation.
We want to consider these issues more broadly than the amendments before us, which do not take into account the fact that web crawling is not the only way AI models are trained. We also want to ensure that any future measures are not disproportionate for small businesses and individuals. There is a risk that legislating in this way will not be flexible enough to keep pace with rapid developments in the AI sector or new web standards. A key purpose of our consultation is to ensure that we have the full benefit of views on how to approach these issues, so that any legislation will be future-proof and able to deliver concrete and sustainable benefits for the creators. The preferred option in the consultation is one proposal; this is a consultation to try to find the right answer and all the proposals will be considered on their merits.
The Government are also committed to ensuring that rights holders have real control over how their works are used. At the moment, many feel powerless over the use of their works by AI models. Our consultation considers technological and other means that can help to ensure that creators’ wishes are respected in practice. We want to work with industry to develop simple and reliable ways to do this that meet agreed standards, in reference to the point made by the noble Viscount, Lord Camrose.
Technical standards are an important part of this. There are technical standards that will be required to prevent web crawlers accessing certain datasets. Standards will be needed for control at the metadata level and for watermarking. I agree with the noble Viscount, Lord Camrose, that standards on the use of watermarks or metadata could have a number of benefits for those who wish to control or license the use of their content with AI. Standards on the use of web crawlers may also improve the ability of rights holders to prevent the use of their works against their wishes. We will actively support the development of new standards and the application of existing ones. We see this as a key part of what is needed. We do not intend to implement changes in this area until we are confident that they will work in practice and are easy to use.
I also want to stress that our data mining proposals relate only to content that has been lawfully made available, so they will not apply to pirated copies. Existing copyright law will continue to apply to the outputs of AI models, as it does today. People will not be able to use AI as a cover for copyright piracy. With improved transparency and control over inputs, we expect that the likelihood of models generating infringing output will be greatly reduced.
I thank the noble Lord, Lord Clement-Jones, for Amendment 46. It would require a review of the impact of transferring all data protection-related cases to the relevant tribunals. Currently there is a mixture of jurisdictions for tribunals and courts for data protection cases, depending on the nature of the proceedings. This is on the basis that certain claims are deemed appropriate for tribunal, while others are appropriate for courts, where stricter rules of evidence and procedure apply—for example, in dealing with claims by data subjects against controllers for compensation due to breaches of data protection legislation. As such, the current system already provides clear and appropriate administrative and judicial redress routes for data subjects seeking to exercise their rights.
Tribunals are in many cases the appropriate venue for data protection proceedings, including appeals by controllers against enforcement action or applications by data subjects for an order that the ICO should progress a complaint. Claims by individuals against businesses or other organisations for damages arising from breach of data protection law fall under the jurisdiction of courts rather than tribunals. This is appropriate, given the likely disparity between the resources of the respective parties, because courts apply stricter rules of evidence and procedures than tribunals. While court proceedings can, of course, be more costly, successful parties can usually recover their costs, which would not always be the case in tribunals.
I hope that the noble Lord agrees that there is a rationale for these different routes and that a review to consider transfer of jurisdictions to tribunals is therefore not necessary at this time.
My Lords, I thank the Minister for that dusty reply. I wonder whether he has been briefed about particular legal cases, such as Killock or Delo, where the judiciary themselves were confused about the nature of the different jurisdictions of tribunal and court. The Minister and, indeed, the noble Viscount, Lord Camrose, seemed to make speeches on the basis that all is wonderful and the jurisdiction of the courts and tribunals is so clearly defined that we do not need a review. That is not the case and, if the Minister were better briefed about the obiter, if not the judgments, in Delo and Killock, he might appreciate that there is considerable confusion about jurisdiction, as several judges have commented.
I am very disappointed by the Minister’s reply. I think that there will be several judges jumping up and down, considering that he has not really looked at the evidence. The Minister always says that he is very evidence-based. I very much hope that he will take another look at this—or, if he does not, that the MoJ will—as there is considerably greater merit in the amendment than he accords. However, I shall not press this to a vote and I beg leave to withdraw the amendment.
I thank my noble friend Lord Holmes for tabling the amendments in this group. I, too, believe these amendments would improve the Bill. The nature of computing and data processing has fundamentally changed since the Computer Misuse Act 1990. Third parties hold and process immense quantities of data, and the means of accessing and interacting with that data have become unrecognisably more sophisticated. Updating the definition of unauthorised computer access through Amendment 48 is a sensible reform, as the new definition takes into account that data controllers and processors now hold substantial quantities of personal data. These entities are responsible for the security of the data they hold, so the provisions they make on access become legally relevant, and this amendment reflects that.
When updating an offence, it is equally necessary to consider the legal defences, as my noble friend has rightly done in Amendment 47 by protecting individuals accessing information to detect or prevent a crime or whose actions are in the public interest. We on these Benches feel these amendments are wholly sensible. I urge the Minister to listen to the persuasive argument that my noble friend Lord Holmes has made and consider how we can deliver these improvements to our data legislation.
I am grateful to the noble Lord, Lord Holmes, for raising this topic through Amendments 47 and 48. I am very aware of this issue and understand the strength of feeling about reforming the Computer Misuse Act, as we have heard from the noble Lord, Lord Arbuthnot, and the noble Earl, Lord Erroll.
As the noble Lord, Lord Clement-Jones, rightly pointed out, when I was the Government Chief Scientific Adviser I conducted a review of pro-innovation regulation of technologies and made recommendations on the issues these amendments raise. These recommendations were accepted by the previous Government.
The Government are actively taking forward these recommendations as part of the Act’s ongoing review. These issues are, of course, complex and require careful consideration. The introduction of these specific amendments could unintentionally pose more risk to the UK’s cybersecurity, not least by inadvertently creating a loophole for cybercriminals to exploit to defend themselves against a prosecution.
Our engagement with stakeholders has revealed differing views, even among industry. While some industry partners highlight the noble Lord’s view that the Computer Misuse Act may prevent legitimate public interest activity, others have concerns about the unintended consequences. Law enforcement has considerable concerns that allowing unauthorised access to systems under the pretext of identifying vulnerabilities could be exploited by cybercriminals. Without robust safeguards and oversight, this amendment could significantly hinder investigations and place a burden on law enforcement partners to establish whether a person’s actions were in the public interest.
Further work is required to consider the safeguards that would need to accompany any introduction of statutory defences. The Government will continue to work with the cybersecurity industry, the National Cyber Security Centre and law enforcement agencies on this issue. The Home Office will provide an update in due course, once the proposals have been finalised—or, in the words of the noble Lord, Lord Clement-Jones, they will pop out of the bowels of the Home Office in due course. With these reassurances in mind, I hope the noble Lord will feel able to withdraw his amendments.
My Lords, I thank everybody who has taken part in this short debate. I was really hoping that we would not hear the phrase “the bowels of the Home Office” twice, but we did—now we have heard it three times. Perhaps it could be the title of somebody’s autobiography. I do not know whose, but I claim the IP rights even though the noble Lord, Lord Clement-Jones, said it first.
I am grateful for the Minister’s response. It would probably have been better to have some sense of timeline; much of what he said was very much what we heard in Committee. We are all amenable to having a course of action, but it needs more objectives attached to it as to when we are likely to see some consequences, action and changes. As every day goes by, as the Minister is well aware, risks that could be checked go unchecked, people who could be made safe are less safe, and economic growth, the Government’s priority, which could be enabled, is prevented.
For now, I will withdraw my amendment, but I am minded to see what is possible between now and Third Reading, because the time is now; otherwise, “in due course” will be even longer than the official statement “later in the summer”. I beg leave to withdraw.
My Lords, I thank my noble friend Lord Lucas for introducing this group. Amendments 48A and 50A, in his name, would ensure that regulated professionals, including financial services firms, are able to comply with current and future regulatory requirements. The example my noble friend has given—the FCA’s expectation that firms communicate effectively with consumers—is a good one. Clearly, we must avoid a circumstance where regulators expect businesses to take action that is not possible due to limiting legislation governing data use and access. My noble friend has made a forceful case and I hope the Government will be able to give the House appropriate assurance that businesses will not be put in this position as a result of this legislation.
Amendment 48B, in the name of the noble Lord, Lord Clement-Jones, seeks to ban cookie paywalls. I opposed a similar amendment when we debated it in Committee as it actually seeks to curtail choice. Currently, users have the options to pay money and stay private, share personal data and read for free, or walk away. Faced with these options, for instance, I have sadly chosen to forgo my regular evening reading of the Daily Mail’s excellent sports pages, but I see no reason why that newspaper, or anyone else, should be compelled to provide anything for free. In fact, it has been very persuasively argued by Jaron Lanier, Shoshana Zuboff and many others that it is the fact that so much of the internet is apparently, but not actually, free that has caused a great deal of damage, rather than having an open charging model. This approach finally reveals the exact cash value of individuals’ data that websites are harvesting and offers users choice. We do not agree with attempts to remove that choice.
My Lords, I will start with Amendments 48A and 50A in the name of the noble Lord, Lord Lucas. The Government are aware that some financial services firms have raised concerns that the direct marketing rules in the privacy and electronic communications regulations prevent them supporting consumers in some instances. I appreciate the importance of the support that financial services firms provide to their customers to help them make informed decisions on matters such as their financial investments. The Government and the FCA are working closely together to improve the support available to consumers.
In December, the FCA launched an initial consultation on a new type of support for consumers with their investments and pensions called “targeted support”. Through this consultation, the FCA will seek feedback on any interactions between the proposals and the direct marketing rules. As my noble friend Lady Jones explained in the debate in Grand Committee, firms can already provide service or regulatory communication messages to their customers without permission, provided these messages are neutral in tone, factual and do not include promotional content. Promotional content can be sent if a consumer consents to receiving direct marketing. Messages which are not directed to a particular individual, such as online adverts shown to everyone who views a website, are also not prevented by the rules. I hope this explanation and the fact that there is ongoing work provide some reassurance to the noble Lord, Lord Lucas, that the Government are actively looking into this issue, and that, as such, he is content to withdraw his amendment.
Amendment 48B from the noble Lord, Lord Clement-Jones, is aimed at banning cookie paywalls. These generally work by giving web users the option to pay for a cookie-free browsing experience. Many websites are funded by advertising, and some publishers think that people should pay for a viewing experience without personalised advertising. As he rightly pointed out, the ICO released updated guidance on how organisations can deploy “consent or pay” models while still ensuring that consent is “freely given”. The guidance is detailed and outlines important factors that organisations should consider in order to operate legally. We encourage businesses to read this guidance and respond accordingly.
I note the important points that the noble Lord makes, and the counterpoints made by the noble Viscount, Lord Camrose. The Government will continue to engage with businesses, the ICO and users on these models, and on the guidance, but we do not think there is currently a case for taking action to ban the practice. I therefore hope the noble Lord will not press his amendment.
My Lords, I am grateful to the Minister for that explanation. I will, for the moment, be content to know that the Government are continuing to discuss this. There is a real problem here that will need to be dealt with, but if the Government are engaged they will inevitably find themselves having to deal with it. There are some occasions where regulatory messages need to make options clear: “You need to do this or something else will happen and you’ll really disadvantage yourself”. The regulator will expect that, particularly where things such as pensions are concerned, but it is clearly a marketing message. It will be difficult to resolve, but I am happy to trust the Government to have a go at it and not to insist on the particular formulation of these amendments. I beg leave to withdraw my amendment.
I thank the noble Baroness, Lady Kidron, for introducing this group, and the noble Lord, Lord Clement-Jones, and the noble Earl, Lord Erroll, for their comments and contributions—particularly the salutary words of the noble Earl, Lord Erroll, on the role of the Executive here, which were very enlightening.
I agree with the noble Baroness, Lady Kidron, that Parliament should have the opportunity to scrutinise this secondary legislation. Online safety research is essential: as our lives become more and more digital, we must assess how it impacts us as people, and especially children, who are particularly vulnerable to online harms. This cannot be achieved unless researchers are able to access the unadulterated raw data. Therefore, I am sure that noble Lords—and our colleagues in the other place—would wish to scrutinise the legislation creating this access to ensure it is fit for purpose. This is why I support the spirit of Amendment 51.
Following on from this point, facilitating online harms research by making access requests enforceable under a pre-existing online safety regime, as per Amendment 52, certainly seems to me like a sensible measure. It would enable this vital research, as would Amendment 54, which removes the need to create a bespoke enforcement system for online safety research access.
Amendment 53 would also enable independent research into how online risks and harms impact different groups. This information would be extremely valuable to a broad range of stakeholders, including social media platforms, data controllers, schools, parents and parliamentarians. It would help us all identify groups who are at heightened risk of online harm, what type of harm they are at risk of, which measures have reduced this risk, which have exacerbated it and what we can all do to reduce this danger.
There are many people undertaking online safety research across the globe and we should look to help these researchers access data for the purposes of safety research, even if their location is outside the UK. Of course, adequate safeguards would need to be in place, which may be dictated to some extent by the location of the researcher. However, online safety research is a benefit for all of us and Amendment 55 would keep barriers to this research to a minimum.
I am sure we would all like to think that all data holders and processors would wish to assist with prevention of online harms. However, where commercial and moral imperatives compete, we sadly cannot always count on the latter winning out. Therefore, Amendment 56 is a sensible addition that would prevent contractual exclusion of research access on online safety grounds, ensuring that online safety risks cannot be hidden or obscured.
I thank the noble Baroness, Lady Kidron, for the amendments on researchers’ access to data for online safety research, an incredibly important topic. It is clear from Committee that the Government’s proposals in this clause are broadly welcomed. They will ensure that researchers can access the vital data they need to undertake an analysis of online safety risks to UK users, informing future online safety interventions and keeping people safe online.
Amendment 51 would compel the Secretary of State to make regulations for a researcher access framework, and to do so within 12 months. While I am sympathetic to the spirit of the noble Baroness’s amendment, a fixed 12-month timescale and requirement to make regulations may risk compressing the time and options available to develop the most effective and appropriate solution, as my noble friend Lady Jones outlined in Committee. Getting this right is clearly important. While we are committed to introducing a framework as quickly as possible, we do not want to compromise its quality. We need adequate time to ensure that the framework is fit for purpose, appropriately safeguarded and future-proofed for a fast-evolving technological environment.
As required by the Online Safety Act, Ofcom is currently preparing a report into the ways in which researchers can access data and the barriers that they face, as well as exploring how additional access might be achieved. This report will be published in July of this year. We are also committed to conducting a thorough consultation on the issue prior to any enforceable requirements coming into force. The Government intend to consult on the framework as soon as practicable after the publication of Ofcom’s report this summer.
Sufficient time is required for a thorough consultation with the wide range of interested stakeholders in this area, including the research community, civil society and industry. I know that the noble Baroness raised a concern in Committee that the Government would rely on Ofcom’s report to set the framework for the regime, but I can assure her that a robust evidence-gathering process is already under way. The framework will be informed by collaboration with key stakeholders and formal consultation, as well as being guided by evidence from Ofcom’s report on the matter. Once all interested parties have had their say and the consultation is completed, the Government expect to make regulations to install the framework. It is right that the Government commit to a full consultation process and do not seek to prejudge the outcomes of that process by including a mandatory requirement for regulations now.
Amendment 53 would seek to expand the list of examples of the types of provision that the regulations might make. Clause 123 gives non-exhaustive examples of what may be included in future regulations; it certainly does not limit those regulations to the examples given. Given the central importance of protecting children and vulnerable users online, a key aim of any future regulations would be to support researchers to conduct research into the different ways that various groups of people experience online safety, without the need for this amendment. Indeed, a significant driving force for establishing this framework in the first place is to improve the quality of research that can be carried out to understand the risks to users online, particularly those faced by children. I acknowledge the point that the noble Baroness made about people of all ages. We would be keen to discuss this further with her as we consult on specific requirements as part of developing regulations.
I will touch on the point about legal privilege. We believe that routinely copying a lawyer on to all emails and documents is not likely to attract legal privilege. Legal privilege protects communications between legal advisers and their clients that are created for the purpose of giving or receiving legal advice, or for the sole or dominant purpose of litigation. It would not be satisfactory just to copy everyone on everything.
We are confident that we can draft regulations that will make it entirely clear that the legal right to data for research purposes cannot be avoided by tech companies seeking to rely on contractual provisions that purport to prevent the sharing of such data. Therefore, there is no need for a specific requirement in the Bill to override terms of service.
I thank the Minister for his very full answer. My legal adviser on my right—the noble and learned Lord, Lord Thomas of Cwmgiedd—let me know that I was in a good place here. I particularly welcome the Minister’s invitation to discuss Ofcom’s review and the consultation. Perhaps he would not mind if I brought some of my researcher friends with me to that meeting. With that, I beg leave to withdraw the amendment.
Lords Chamber
My Lords, government Amendment 18—
I call on the noble Lord, Lord Clement-Jones, to speak to Amendment 17.
Amendment 17
My Lords, I thank the noble Lord, Lord Clement-Jones, for raising these significant issues. While I share some of the concerns expressed, I find myself unable—at least for the moment—to offer support for the amendments in their current form.
Amendment 17 seeks to remove the powers granted to the Secretary of State to override primary legislation and to modify aspects of UK data protection law via statutory instrument. I agree with the principle underpinning this amendment: that any changes to data protection law must be subject to appropriate scrutiny. It is essential that parliamentary oversight remains robust and meaningful, particularly when it comes to matters as sensitive and far-reaching as data protection.
However, my hesitation lies in the practical implications of the amendment. While I sympathise with the call for greater transparency, I would welcome more detail on how this oversight mechanism might work in practice. Would it involve enhanced scrutiny procedures or a stronger role for relevant parliamentary committees? I fear that, without this clarity, we risk creating uncertainty in an area that requires, above all, precision and confidence.
The Minister’s Amendment 18 inserts specific protections for children’s personal data into the UK GDPR framework. The Government have rightly emphasised the importance of safeguarding children in the digital age. I commend the intention behind the amendment and agree wholeheartedly that children deserve special protections when it comes to the processing of their personal data.
It is worth noting that this is a government amendment to their own Bill. While Governments amending their own legislation is not unprecedented—the previous Government may have indulged in the practice from time to time—it is a practice that can give rise to questions. I will leave my comments there; obviously it is not ideal, but these things happen.
Finally, Amendment 21, also tabled by the noble Lord, Lord Clement-Jones, mirrors Amendment 17 in seeking to curtail the Secretary of State’s powers to amend primary legislation via statutory instrument. My earlier comments on the importance of parliamentary oversight apply here. As with Amendment 17, I am of course supportive of the principle. The delegation of such significant powers to the Executive should not proceed without robust scrutiny. However, I would appreciate greater clarity on how this proposed mechanism would function in practice. As it stands, I fear that the amendment raises too many questions. If these concerns could be addressed, I would be most grateful.
In conclusion, these amendments raise important points about the balance of power between the Executive and Parliament, as well as the protection of vulnerable individuals in the digital sphere. I look forward to hearing more detail and clarity, so that we can move forward with confidence.
My Lords, government Amendment 18 is similar to government Amendment 40 in the previous group, which added an express reference to children meriting specific protection to the new ICO duty. This amendment will give further emphasis to the need for the Secretary of State to consider the fact that children merit specific protection when deciding whether to use powers to amend the list of recognised legitimate interests.
Turning to Amendment 17 from the noble Lord, Lord Clement-Jones, I understand the concerns that have been raised about the Secretary of State’s power to add or vary the list of recognised legitimate interests. This amendment seeks to remove the power from the Bill.
In response to some of the earlier comments, including from the committees, I want to make it clear that we have constrained these powers more tightly than they were in the previous data Bill. Before making any changes, the Secretary of State must consider the rights and freedoms of individuals, paying particular attention to children, who may be less aware of the risks associated with data processing. Furthermore, any addition to the list must meet strict criteria, ensuring that it serves a clear and necessary public interest objective as described in Article 23.1 of the UK GDPR.
The Secretary of State is required to consult the Information Commissioner and other stakeholders before making any changes, and any regulations must then undergo the affirmative resolution procedure, guaranteeing parliamentary scrutiny through debates in both Houses. Retaining this regulation-making power would allow the Government to respond quickly if future public interest activities are identified that should be added to the list of recognised legitimate interests. However, the robust safeguards and limitations in Clause 70 will ensure that these powers are used both sparingly and responsibly.
I turn now to Amendment 21. As was set out in Committee, there is already a relevant power in the current Data Protection Act to provide exceptions. We are relocating the existing exemptions, so the current power, so far as it relates to the purpose limitation principle, will no longer be relevant. The power in Clause 71 is intended to take its place. In seeking to reassure noble Lords, I want to reiterate that the power cannot be used for purposes other than the public interest objectives listed in Article 23.1 of the UK GDPR. It is vital that the Government can act quickly to ensure that public interest processing is not blocked. If an exemption is misused, the power will also ensure that action can be swiftly taken to protect data subjects by placing extra safeguards or limitations on it.
My Lords, I thank the Minister for that considered reply. It went into more detail than the letter he sent to the two committees, so I am grateful for that, and it illuminated the situation somewhat. But at the end of the day, the Minister is obviously intent on retaining the regulation-making power.
I thank the noble Viscount, Lord Camrose, for his support—sort of—in principle. I am not quite sure where that fitted; it was post-ministerial language. I think he needs to throw off the shackles of ministerial life and live a little. These habits die hard but in due course, he will come to realise that there are benefits in supporting amendments that do not give too much ministerial power.
Turning to one point of principle—I am not going to press either amendment—it is a worrying trend that both the previous Government and this Government seem intent on simply steamrollering through powers for Secretaries of State in the face of pretty considered comment by House of Lords committees. This trend has been noted, first for skeletal Bills and secondly for Bills that, despite being skeletal, include a lot of regulation-making power for Secretaries of State, and Henry VIII powers. So I just issue a warning that we will keep returning to this theme and we will keep supporting and respecting committees of this House, which spend a great deal of time scrutinising secondary legislation and warning of overweening executive power. In the meantime, I beg leave to withdraw Amendment 17.
My Lords, I now turn to government Amendment 49. I thank the noble Lord, Lord Clement-Jones, and other noble Lords for raising the concerns of the charity sector during earlier debates. The Government have also heard from charities and trade associations directly.
This amendment will permit charities to send marketing material—for example, promoting campaigns or fundraising activities—to people who have previously expressed an interest in their charitable purposes, without seeking express consent. Charities will have to provide individuals with a simple means of opting out of receiving direct marketing when their contact details are collected and with every subsequent message sent. The current soft opt-in rule for marketing products and services has similar requirements.
Turning to Amendment 24, I am grateful to the noble Baroness, Lady Harding, for our discussions on this matter. As was said in the debate in Grand Committee, the Government are committed to upholding the principles of transparency. I will try to outline some of that.
I understand that this amendment is about data brokers buying data from the open electoral register and combining it with data they have collected from other sources to build profiles on individuals with the intention of selling them for marketing. Despite what was said in the last debate on this, I am not convinced that all individuals registering on the open electoral register would reasonably expect this kind of profiling or invisible processing using their personal data. If individuals are unaware of the processing, this undermines their ability to exercise their other rights, such as to object to the processing. That point was well made by the noble Lord, Lord Davies.
With regard to the open electoral register, the Government absolutely agree that there are potential benefits to society through its use—indeed, economic growth has been mentioned. Notification is not necessary in all cases. There is, for example, an exemption if notifying the data subject would involve a disproportionate effort and the data was not collected directly from them. The impact on the data subject must be considered when assessing whether the effort is disproportionate. If notification is proportionate, the controller must notify.
The ICO considers that the use and sale of open electoral register data alone is unlikely to require notification. As was set out in Committee, the Government believe that controllers should continue to assess on a case-by-case basis whether cases meet the conditions for the existing disproportionate effort exemption. Moreover, I hope I can reassure the noble Baroness that in the event that the data subject already has the information—from another controller, for example—another exemption from notification applies.
The Government therefore do not see a case for a new exemption for this activity, but as requested by the noble Baroness, Lady Harding, I would be happy to facilitate further engagement between the industry and the ICO to improve a common understanding of how available exemptions are to be applied on a case-by-case basis. I understand that the ICO will use the Bill as an opportunity to take stock of how its guidance can address particular issues that organisations face.
Amendment 50, tabled by the noble Lord, Lord Clement-Jones, seeks to achieve a very similar thing to the government amendment and we studied it when designing our amendment. The key difference is that the government amendment defines which organisations can rely on the new measure and for what purposes, drawing on definitions of “charity” and “charitable purpose” in relevant charities legislation.
I trust that the noble Lord will be content with this government amendment and will feel able not to press his own.
Before the Minister sits down, can I follow up and ask a question about invisible processing? I wonder whether he considers that a better way of addressing potential concerns about invisible processing is improving the privacy notices when people originally sign up for the open electoral register. That would mean making it clear how your data could be used when you say you are happy to be on the open electoral register, rather than creating extra work and potentially confusing communication with people after that. Can the Minister confirm that that would be in scope of potential options and further discussions with the ICO?
The further discussions with the ICO are exactly to try to get to these points about the right way to do it. It is important that people know what they are signing up for, and it is equally important that they are aware that they can withdraw at any point. Those points obviously need to be discussed with the industry to make sure that everyone is clear about the rules.
I thank noble Lords for having humoured me in the detail of this debate. I am very pleased to hear that response from the Minister and look forward to ongoing discussions with the ICO and the companies involved. As such, I beg leave to withdraw my amendment.
My Lords, I will very briefly speak to Amendment 30 in my name. Curiously, it was in the name of the noble Viscount, Lord Camrose, in Committee, but somehow it has jumped.
On the whole, I have always advocated for age-appropriate solutions. The amendment refers to preventing children consenting to special category data being used in automated decision-making, simply because there are some things that children should not be able to consent to.
I am not sure that this exact amendment is the answer. I hope that the previous conversation that we had before the dinner break will produce some thought about this issue—about how automatic decision-making affects children specifically—and we can deal with it in a slightly different way.
While I am on my feet, I want to say that I was very struck by the words of my noble friend Lady Freeman, particularly about efficacy. I have seen so many things that have purported to work in clinical conditions that have failed to work in the complexity of real life, and I want to associate myself with her words and, indeed, the amendments in her name and that of the noble Lord, Lord Clement-Jones.
I start with Amendment 26, tabled by the noble Viscount, Lord Camrose. As he said in Committee, a principles-based approach ensures that our rules remain fit in the face of fast-evolving technologies by avoiding being overly prescriptive. The data protection framework achieves this by requiring organisations to apply data protection principles when personal data is processed, regardless of the technology used.
I agree with the principles that are present for AI, which are useful in the context in which they were put together, but introducing separate principles for AI could cause confusion around how data protection principles are interpreted when using other technologies. I note the comment that there is a significant overlap between the principles, and the comment from the noble Viscount that there are situations in which one would catch things and another would not. I am unable to see what those particular examples are, and I hope that the noble Viscount will agree with the Government’s rationale for seeking to protect the framework’s technology-neutral set of principles, rather than having two separate sets.
Amendment 28 from the noble Lord, Lord Clement-Jones, would extend the existing safeguards for decisions based on solely automated processing to decisions based on predominantly automated processing. These safeguards protect people when there is no meaningful human involvement in the decision-making. The introduction of predominantly automated decision-making, which already includes meaningful human involvement—and I shall say a bit more about that in a minute—could create uncertainty over when the safeguards are required. This may deter controllers from using automated systems that have significant benefits for individuals and society at large. However, the Government agree with the noble Viscount on strengthening the protections for individuals, which is why we have introduced a definition for solely automated decision-making as one which lacks “meaningful human involvement”.
I thank noble Lords for Amendments 29 and 36 and the important points raised in Committee on the definition of “meaningful human involvement”. This terminology, introduced in the Bill, goes beyond the current UK GDPR wording to prevent cursory human involvement being used to rubber stamp decisions as not being solely automated. The point at which human involvement becomes meaningful is context specific, which is why we have not sought to be prescriptive in the Bill. The ICO sets out in its guidance its interpretation that meaningful human involvement must be active: someone must review the decision and have the discretion to alter it before the decision is applied. The Government’s introduction of “meaningful” into primary legislation does not change this definition, and we are supportive of the ICO’s guidance in this space.
As such, the Government agree on the importance of the ICO continuing to provide its views on the interpretation of terms used in the legislation. Our reforms do not remove the ICO’s ability to do this, or to advise Parliament or the Government if it considers that the law needs clarification. The Government also acknowledge that there may be a need to provide further legal certainty in future. That is why there are a number of regulation-making powers in Article 22D, including the power to describe meaningful human involvement or to add additional safeguards. These could be used, for example, to impose a timeline on controllers to provide human intervention upon the request of the data subject, if evidence suggested that this was not happening in a timely manner following implementation of these reforms. Any regulations must follow consultation with the ICO.
Amendment 30 from the noble Baroness, Lady Kidron, would prevent law enforcement agencies seeking the consent of a young person to the processing of their special category or sensitive personal data when using automated decision-making. I thank her for this amendment and agree about the importance of protecting the sensitive personal data of children and young adults. We believe that automated decision-making will continue to be rarely deployed in the context of law enforcement decision-making as a whole.
Likewise, consent is rarely used as a lawful basis for processing by law enforcement agencies, which are far more likely to process personal data for the performance of a task, such as questioning a suspect or gathering evidence, as part of a law enforcement process. Where consent is needed—for example, when asking a victim for fingerprints or something else—noble Lords will be aware that Clause 69 clearly defines consent under the law enforcement regime as
“freely given, specific, informed and unambiguous”
and
“as easy … to withdraw … as to give”.
So the tight restrictions on its use will be crystal clear to law enforcement agencies. In summary, I believe the taking of an automated decision based on a young person’s sensitive personal data, processed with their consent, to be an extremely rare scenario. Even when it happens, the safeguards that apply to all sensitive processing will still apply.
I thank the noble Viscount, Lord Camrose, for Amendments 31 and 32. Amendment 31 would require the Secretary of State to publish guidance specifying how law enforcement agencies should go about obtaining the consent of the data subject to process their data. To reiterate a point made by my noble friend Lady Jones in Committee, Clause 69 already provides a definition of “consent” and sets out the conditions for its use; they apply to all processing under the law enforcement regime, not just automated decision-making, so the Government believe this amendment is unnecessary.
Amendment 32 would require the person reviewing an automated decision to have sufficient competence and authority to amend the decision if required. In Committee, the noble Viscount also expressed the view that a person should be “suitably qualified”. Of course, I agree with him on that. However, as my noble friend Lady Jones said in Committee, the Information Commissioner’s Office has already issued guidance which makes it clear that the individual who reconsiders an automated decision must have the “authority and competence” to change it. Consequently, the Government do not feel that it is necessary to add further restrictions in the Bill as to the type of person who can carry out such a review.
The noble Baroness, Lady Freeman, raised extremely important points about the performance of automated decision-making. The Government already provide a range of products, but A Blueprint for Modern Digital Government, laid this morning, makes it clear that part of the new digital centre’s role will be to offer specialist assurance support, including, importantly in relation to this debate,
“a service to rigorously test models and products before release”.
That function will be in place and available to departments.
On Amendments 34 and 35, my noble friend Lady Jones previously advised the noble Lord, Lord Clement-Jones, that the Government would publish new Algorithmic Transparency Recording Standard (ATRS) records imminently. I am pleased to say that 14 new records were published on 17 December, with more to follow. I accept that these are not yet in the state in which we would wish them to be. Where these amendments seek to ensure that the efficacy of such systems is evaluated, A Blueprint for Modern Digital Government, as I have said, makes it clear that part of the digital centre’s role will be to offer such support, including this service. I hope that this provides reassurance.
My Lords, before the Minister sits down, I was given considerable assurance between Committee and Report that a code of practice, drawn up with the ICO, would be quite detailed in how it set out the requirements for those engaging in automated decision-making. The Minister seems to have given some kind of assurance that it is possible that the ICO will come forward with the appropriate provisions, but he has not really given any detail as to what that might consist of and whether that might meet some of the considerations that have been raised in Committee and on Report, not least Amendments 34 and 35, which have just been discussed as if the ATRS was going to cover all of that. Of course, any code would no doubt cover both the public and private sectors. What more can the Minister say about the kind of code that would be expected? We seem to be in somewhat of a limbo in this respect.
I apologise; I meant to deal with this at the end. I think I am dealing with the code in the next group.
Before the Minister sits down, he said that there will be evaluations of the efficacy of these systems but he did not mention whether those will have to be made public. Can he give me any assurance on that?
There is a requirement. Going back to the issue of principles, which was discussed earlier on, one of the existing principles—which I am now trying to locate and cannot—is transparency. I expect that we would make as much of the information public as we can in order to ensure good decision-making and assure people as to how the decisions have been reached.
I thank all noble Lords and the Minister for their comments and contributions to what has been a fascinating debate. I will start by commenting on the other amendments in this group before turning to those in my name.
First, on Amendments 28 and 29, I am rather more comfortable with the arrangements for meaningful human intervention set out in the Bill than the noble Lord, Lord Clement-Jones. For me, either a decision has meaningful human intervention or it does not. In the latter case, certain additional rights kick in. To me, that binary model is clear and straightforward, and could only be damaged by introducing some of the more analogue concepts such as “predominantly”, “principally”, “mainly” or “wholly”, so I am perfectly comfortable with that as it is.
However, I recognise that puts a lot of weight on to the precise meaning of “meaningful human involvement”. Amendment 36 in the name of the noble Lord, Lord Clement-Jones, which would require the Secretary of State to produce a definition of “meaningful human involvement” in ADM in collaboration with the ICO, seems to take on some value in those circumstances, so I am certainly more supportive of that one.
As for Amendments 34 and 35 in the names of the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Freeman, I absolutely recognise the value and potential of efficacy; I agree it is a very valuable term. I have more faith in the rollout and use of the ATRS but on a non-statutory basis, believing, as I do, that this would allow it to continue to develop in an agile and adaptive manner. I welcome the Minister’s words on this subject, and for now I remain comfortable that the ATRS is the direction forward for that.
I turn to the amendments in my name. I thank all noble Lords and, indeed, the Minister for their comments and contributions regarding Amendments 31 and 32. I very much take the Minister’s point that definitions of consent feature elsewhere in the Bill. That reduces my concern somewhat.
However, I continue to strongly commend Amendment 26 to the House. I believe it will foster innovation while protecting data rights. It is popular with the public and with private sector stakeholders. It will bring about outcomes that we all want to see in AI safety without stifling this new and exciting technology. In the absence of an AI Bill—and possibly even in the presence of one—it is the only AI-specific legislation that will be around. It is important somehow to get those AI principles in the Bill, at least until an AI Bill comes along. With this in mind, I wish to test the opinion of the House.
My Lords, Amendment 41 aims to establish a code of practice for the use of children’s data in the development of AI technologies. In the face of rapidly advancing AI, it is, of course, crucial that we ensure children’s data is handled with the utmost care, prioritising their best interests and fundamental rights. We agree that AI systems that are likely to impact children should be designed to be safe and ethical by default. This code of practice will be instrumental in guiding data controllers to ensure that AI development and deployment reflect the specific needs and vulnerabilities of children.
However, although we support the intent behind the amendment, we have concerns, which echo concerns on amendments in a previous group, about the explicit reference to the UN Convention on the Rights of the Child and general comment 25. I will not rehearse my comments from earlier groups, except to say that it is so important that we do not have these explicit links to international frameworks, important as they are, in UK legislation.
In the light of this, although we firmly support the overall aim of safeguarding children’s data in AI, we believe this can be achieved more effectively by focusing on UK legal principles and ensuring that the code of practice is rooted in our domestic context.
I thank the noble Lord, Lord Clement-Jones, for Amendment 33, and the noble Baroness, Lady Kidron, for Amendment 41, and for their thoughtful comments on AI and automated decision-making throughout this Bill’s passage.
The Government have carefully considered these issues and agree that there is a need for greater guidance. I am pleased to say that we are committing to use our powers under the Data Protection Act to require the ICO to produce a code of practice on AI and solely automated decision-making through secondary legislation. This code will support controllers in complying with their data protection obligations through practical guidance. I reiterate that the Government are committed to this work as an early priority, following the Bill receiving Royal Assent. The secondary legislation will have to be approved by both Houses of Parliament, which means it will be scrutinised by Peers and parliamentarians.
I can also reassure the noble Baroness that the code of practice will include guidance about protecting data subjects, including children. The new ICO duties set out in the Bill will ensure that where children’s interests are relevant to any activity the ICO is carrying out, it should consider the specific protection of children. This includes when preparing codes of practice, such as the one the Government are committing to in this area.
I understand that noble Lords will be keen to discuss the specific contents of the code. The ICO, as the independent data protection regulator, will have views as to the scope of the code and the topics it should cover. We should allow it time to develop those thoughts. The Government are also committed to engaging with noble Lords and other stakeholders after Royal Assent to make sure that we get this right. I hope noble Lords will agree that working closely together to prepare the secondary legislation to request this code is the right approach instead of pre-empting the exact scope.
The noble Lord, Lord Clement-Jones, mentioned edtech. I should add—I am getting into a habit now—that it is discussed in a future group.
Before the Minister sits down, I welcome his words, which are absolutely what we want to hear. I understand that the ICO is an independent regulator, but it is often the case that the scope and some of Parliament’s concerns are delivered to it from this House—or, indeed, from the other place. I wonder whether we could find an opportunity to make sure that the ICO hears Parliament’s wish on the scope of the children’s code, at least. I am sure the noble Lord, Lord Clement-Jones, will say similar on his own behalf.
It will be clear to the ICO from the amendments that have been tabled and my comments that there is an expectation that it should take into account the discussion we have had on this Bill.
My Lords, I thank the Minister for his very considered response. In the same way as the noble Baroness, Lady Kidron, I take it that, effectively, the Minister is pledging to engage directly with us and others about the nature and contents of the code, and that the ICO will also engage on that. As the Minister knows, the definition of terms such as meaningful human engagement is something that we will wish to discuss and consider in the course of that engagement. I hope that the AI edtech code will also be part of that.
I thank the Minister. I know he has had to think about this quite carefully during the Bill’s passage. Currently, Clause 80 is probably the weakest link in the Bill, and this amendment would go some considerable way towards repairing it. My final question is not to the Minister, but to the Opposition: what on earth have they got against the UN? In the meantime, I beg leave to withdraw my amendment.
My Lords, I thank the noble Lord, Lord Clement-Jones—as ever—and the noble and learned Lord, Lord Thomas, for tabling Amendment 37 in their names. It would introduce a new clause that would require the Secretary of State to carry out an impact assessment of this Act and other changes to the UK’s domestic and international frameworks relating to data adequacy before the European Union’s reassessment of data adequacy in June this year.
I completely understand the concerns behind tabling this amendment. In the very worst-case scenario, of a complete loss of data adequacy in the assessment by the EU, the effect on many businesses and industries in this country would be knocking at the door of catastrophic. It cannot be allowed to happen.
However, introducing a requirement to assess the impact of the Bill on the European Union data adequacy decision requires us to speculate on EU intentions in a public document, which runs the risk of prompting changes on its part or revealing our hand to it in ways that we would rather not do. It is important that we do two things: understand our risk, without necessarily publishing it publicly; and continue to engage at ministerial and official level, as I know we are doing intensively. I think the approach set out in this amendment runs the risk of being counterproductive.
I thank the noble Lord, Lord Clement-Jones, for his amendment, and the noble and learned Lord, Lord Thomas, for his contribution. I agree with them on the value and importance placed on maintaining our data adequacy decisions from the EU this year. That is a priority for the Government, and I reassure those here that we carefully considered all measures in the light of the EU’s review of our adequacy status when designing the Bill.
The Secretary of State wrote to the House of Lords European Affairs Committee on 20 November 2024 on this very point and I would be happy to share this letter with noble Lords if that would be helpful. The letter sets out the importance this Government place on renewal of our EU adequacy decisions and the action we are taking to support this process.
It is important to recognise that the EU undertakes its review of its decisions for the UK in a unilateral, objective and independent way. As the DSIT Secretary of State referenced in his appearance before the Select Committee on 3 December, it is important that we acknowledge the technical nature of the assessments. For that reason, we respect the EU’s discretion about how it manages its adequacy processes. I echo some of the points made by the noble Viscount, Lord Camrose.
That being said, I reassure noble Lords that the UK Government are doing all they can to support a swift renewal of our adequacy status in both technical preparations and active engagement. The Secretary of State met the previous EU Commissioner twice last year to discuss the importance of personal data sharing between the UK and EU. He has also written to the new Commissioner for Justice responsible for the EU’s review and looks forward to meeting Commissioner McGrath soon.
I also reassure noble Lords that DSIT and the Home Office have dedicated teams that have been undertaking preparations ahead of this review, working across government as needed. Those teams are supporting European Commission officials with the technical assessment as required. UK officials have met with the European Commission four times since the introduction of the Bill, with future meetings already in the pipeline.
My Lords, the noble and learned Lord, Lord Thomas, whose intervention I very much appreciated, particularly at this time of the evening, talked about a fresh pair of eyes. What kind of reassurance can the Minister give on that?
It is worth remembering that the ultimate decision is with the EU Commission and we are quite keen to have its eyes on it now, which is why we are engaging with it very carefully. It is looking at it as we are going through it—we are talking to it and we have dedicated teams of people brought together specifically to do this. There are several people from outside the direct construct of the Bill who are looking at this to make sure that we have adequacy and are having very direct conversations with the EU to ensure that that process is proceeding as we would wish it to.
I thank the Minister for his response. It would be very reassuring if it was our own fresh pair of eyes rather than across the North Sea. That is all I can say as far as that is concerned. I appreciate what he said—that the Government are taking this seriously. It is a continuing concern precisely because the chair of the European Affairs Committee wrote to the Government. It is a continuing issue for those of us observing the passage of the Bill and we will continue to keep our eyes on it as we go forward. I very much hope that June 2025 passes without incident and that the Minister’s predictions are correct. In the meantime, I beg leave to withdraw the amendment.
(2 weeks, 1 day ago)
Lords Chamber
I start by thanking the noble Baroness, Lady Kidron, for introducing this group. I will speak particularly to the amendment in my name but before I do so, I want to say how much I agree with the noble Baroness and with the noble Lord, Lord Clement-Jones, that it is a matter of regret that we are not simultaneously looking at an AI Bill. I worry that this Bill has to take a lot of the weight that an AI Bill would otherwise take, but we will come to that in a great deal more detail in later groups.
I will address the two amendments in this group in reverse order. Amendment 5 in my name and that of my noble friend Lord Markham would remove Clause 13, which makes provision for the Secretary of State or the Treasury to give financial assistance to decision-makers and enforcers—that is, in essence, to act as a financial backstop. While I appreciate the necessity of guaranteeing the stability of enforcers who are public authorities and therefore branches of state, I am concerned that this has been extended to decision-makers. The Bill does not make the identity of a decision-maker clear. Therefore, I wonder who exactly we are protecting here. Unless those individuals or bodies or organisations can be clearly defined, how can we know whether we should extend financial assistance to them?
I raised these concerns in Committee and the Minister assured us at that time that smart data schemes should be self-financing through fees and levies as set out in Clauses 11 and 12 and that this provision is therefore a back-up plan. If that is indeed the case and we are assured of the self-funding nature of smart data schemes, then what exactly makes this necessary? Why must the statutory spending authority act as a backstop if we do not believe there is a risk it will be needed? If we do think there is such a risk, can the Minister elaborate on what it is?
I turn now to the amendment tabled by the noble Baroness, Lady Kidron, which would require data traders to supply customers with information that has been used by AI to build a profile on them. While transparency and explainability are hugely important, I worry that the mechanism proposed here will be too burdensome. The burden would grow linearly with the scale of the models used. Collating and supplying this information would, I fear, increase the cost of doing business for traders. Given AI’s potential to be an immense asset to business, helping generate billions of pounds for the UK economy—and, by the way, I rather approve of the boosterish tone and think we should strive for a great deal more growth in the economy—we should not seek to make its use more administratively burdensome for business. Furthermore, since the information is AI-generated, it is going to be a guess or an assumption or an inference. Therefore, should we require companies to disclose not just the input data but the intermediate and final outputs? Speaking as a consumer, I am not sure that I personally would welcome this. I look forward to hearing the Minister’s responses.
I thank the noble Baroness, Lady Kidron, and the noble Viscount, Lord Camrose, for their proposed amendments and continued interest in Part 1 of this Bill. I hope I can reassure the noble Baroness that the definition of customer data is purposefully broad. It encompasses information relating to a customer or a trader and the Government consider that this would indeed include inferred data. The specific data to be disclosed under a smart data scheme will be determined in the context of that scheme and I reassure the noble Baroness that there will be appropriate consultation before a smart data scheme is introduced.
I turn to Amendment 5. Clause 13 provides statutory authority for the Secretary of State or the Treasury to give financial assistance to decision-makers, enforcers and others for the purpose of meeting any expense in the exercise of their functions in the smart data schemes. Existing and trusted bodies such as sector regulators are likely to lead the delivery of new schemes. These bodies will act as decision-makers and enforcers. It is intended that smart data schemes will be self-financing through the fees and levies provided for by Clauses 11 and 12. However, because of the nature of the bodies that are involved, it is deemed appropriate for there to be a statutory spending authority as a backstop provision if that is necessary. Any commitment of resources will, of course, be subject to the usual estimates process and to existing public sector spending controls and transparency requirements.
I hope that with this brief explanation of the types of bodies involved, and the other explanations, the noble Baroness will be content to withdraw Amendment 1 and that noble Lords will not press Amendment 5.
I thank the Minister for his reassurance, particularly that we will have an opportunity for a consultation on exactly how the smart data scheme works. I look forward to such agreement throughout the afternoon. With that, I beg leave to withdraw my amendment.
I thank my noble friend Lord Lucas for introducing this group and for bringing these important and sometimes very difficult matters to the attention of the House. I will address the amendments slightly out of order, if I may.
For digital verification services to work, the information they have access to and use to verify documents must be accurate; this is, needless to say, critical to the success of the entire scheme. Therefore, it is highly sensible for Amendment 8 to require public authorities, when they disclose information via the information gateway, to ensure that it is accurate and reliable and that they can prove it. By the same measure, Amendment 6, which requires the Secretary of State to assess whether the public authorities listed are collecting accurate information, is equally sensible. These amendments as a pair will ensure the reliability of DVS services and encourage the industry to flourish.
I would like to consider the nature of accurate information, especially regarding an individual’s biological sex. It is possible for an individual to change their recorded sex on their driving licence or passport, for example, without going through the process of obtaining a gender recognition certificate. Indeed, a person can change the sex on their birth certificate if they obtain a GRC, but many would argue that changing some words on a document does not change the reality of a person’s genome, physical presentation and, in some cases, medical needs, meaning that the information recorded does not accurately relate to their sex. I urge the Minister to consider how best to navigate this situation, and to acknowledge that it is crucially important, as we have heard so persuasively from the noble Earl, Lord Erroll, and my noble friends Lord Arbuthnot and Lord Lucas, that a person’s sex is recorded accurately to facilitate a fully functioning DVS system.
The DVS trust framework has the potential to rapidly transform the way identities and information are verified. It should standardise digital verification services, ensure reliability and build trust in the concept of a digital verification service. It could seriously improve existing, cumbersome methods of verifying information, saving companies, employers, employees, landlords and tenants time and money. Personally, I have high hopes of its potential to revolutionise the practices of recruitment. I certainly do not know many people who would say no to less admin. If noble Lords are minded to test the opinion of the House, we will certainly support them with respect to Amendments 6 and 8.
With the greatest respect to the noble Lord, Lord Clement-Jones, I think it is a mistake to regard this as part of some culture war struggle. As I understand it, this is about accuracy of data and the importance, for medical and other reasons, of maintaining accurate data.
The benefits of DVS cannot come at the expense of data privacy and data minimisation. Parliament is well-practised at balancing multiple competing concepts and doing so with due regard to public opinion. Therefore, Amendment 7 is indeed a sensible idea.
Finally, Amendment 9 would require the Secretary of State to review whether an offence of false use of identity documents created or verified by a DVS provider is needed. This is certainly worth consideration. I have no doubt that the Secretary of State will require DVS providers to take care that their services are not being used with criminal intent, and I am quite sure that DVS service providers do not want to facilitate crimes. However, the history of technology is surely one of high-minded purposes corrupted by cynical practices. Therefore, it seems prudent for the Secretary of State to conduct a review into whether creating this offence is necessary and, if it is, the best way that it can be laid out in law. I look forward to hearing the Minister’s comments on this and other matters.
I thank the noble Lords, Lord Clement-Jones, Lord Lucas and Lord Arbuthnot, for their amendments and interest in the important area of digital verification services. I thank the noble Viscount, Lord Camrose, for his support for this being such an important thing to make life easier for people.
I will go in reverse order and start with Amendment 9. I thank the noble Lord, Lord Clement-Jones, for reconsidering his stance since Committee on the outright creation of these offences. Amendment 9 would create an obligation for the Secretary of State to review the need for digital identity theft offences. We believe this would be unnecessary, as existing legislation—for example, the Fraud Act 2006, the Computer Misuse Act 1990 and the Data Protection Act 2018—already addresses the behaviour targeted by this amendment.
However, we note the concerns raised and confirm that the Government are taking steps to tackle the issue. First, the Action Fraud service, which allows individuals to report fraud enabled by identity theft, is being upgraded with improved reporting tools, increased intelligence flows to police forces and better support services for victims. Secondly, the Home Office is reviewing the training offered to police officers who have to respond to fraud incidents, and identifying the improvements needed.
I am sorry to interrupt the Minister. He is equating digital identity theft to fraud, and that is not always the case. Is that the advice that he has received?
The advice is that digital identity theft would be captured by those Acts. Therefore, there is no need for a specific offence. However, as I said, the Government are taking steps to tackle this and will support the Action Fraud service as a way to deal with it, even though I agree that not everything falls under that classification as fraud.
I am sorry to interrupt the Minister again, but could he therefore confirm that, by reiterating his previous view that the Secretary of State should not have to bring the framework to Parliament, he disagrees with both the Delegated Powers and Regulatory Reform Committee and the Constitution Committee, both of which made the same point on this occasion and on the previous Bill—that Parliament should look at the trust framework?
For the reasons that I have given, I think that the trust framework is a technical document and one best dealt with in this technical form. It is built on other assurance processes, with the United Kingdom Accreditation Service overseeing the conformity accreditation bodies that will test the digital verification services. In this case, our view is that it does not need to come under parliamentary scrutiny.
On Amendments 6 and 8 from the noble Lord, Lord Lucas, I am absolutely behind the notion that the validity of the data is critical. We have to get this right. Of course, the Bill itself takes the data from other sources, and those sources have authority to get the information correct, but it is important, for a digital service in particular, that this is dealt with very carefully and that we have good assurance processes.
On the specific point about gender identity, the Bill does not create or prescribe new ways in which to determine that, but work is ongoing to try to ensure that there is consistency and accuracy. The Central Digital and Data Office has started to progress work on developing data standards on key entities and their attributes to ensure that the way data is organised, stored and shared is consistent between public authorities. Work has also commenced via the domain expert group on the person entity, which has representatives from the Home Office, HMRC, the Office for National Statistics—importantly—NHS England, the Department for Education, the Ministry of Justice, the Local Government Association and the Police Digital Service. The group has been established as a pilot under the Data Standards Authority to help to ensure consistency across organisations, and specific pieces of work are going on relating to gender in that area.
The measures in Part 2 are intended to help secure the reliability of the process through which citizens can verify their identity digitally. They do not intervene in how government departments record and store identity data. In clarifying this important distinction, and with reference to the further information I will set out, I cannot support the amendments.
I would be grateful if the Minister could confirm whether he accepts that, on some occasions, passports and driving licences inaccurately reflect the sex of their holders.
I can be absolutely clear that we must have a single version of the truth on this. There needs to be a way to verify it consistently and there need to be rules. That is why the ongoing work is so important. I know from my background in scientific research that, to know what you are dealing with, data is the most important thing to get. Making sure that we have a system to get this clear will be part of what we are doing.
Amendment 6 would require the Secretary of State to assess which public authorities can reliably verify related facts about a person in the preparation of the trust framework. This exercise is out of scope of the trust framework, as the Good Practice Guide 45—a standard signposted in the trust framework—already provides guidance for assessing the reliability of authoritative information across a wide range of use cases covered by the trust framework. Furthermore, the public authorities mentioned are already subject to data protection legislation which requires personal data processed to be accurate and, where relevant, kept up to date.
Amendment 8 would require any information shared by public authorities to be clearly defined, accompanied by metadata and accurate. The Government already support and prioritise the accuracy of the data they store, and I indicated the ongoing work to make sure that this continues to be looked at and improved. This amendment could duplicate or potentially conflict with existing protections under data protection legislation and/or other legal obligations. I reassure noble Lords that the Government believe that ensuring the data they process is accurate is essential to deliver services that meet citizens’ needs and ensure accurate evaluation and research. The Central Digital and Data Office has already started work on developing data standards on key entities and their attributes to ensure that the way data is organised, stored and shared is consistent.
It is our belief that these matters are more appropriately considered together holistically, rather than by a piecemeal approach through diverse legislation such as this data Bill. As such, I would be grateful if noble Lords would consider withdrawing their amendments.
My Lords, I am very grateful to all noble Lords who have spoken on this. I actually rather liked the amendments of the noble Lord, Lord Clement-Jones—if I am allowed to reach across to him—but I think he is wrong to describe Amendments 6 and 8 as “culture war”. They are very much about AI and the fundamentals of digital. Self-ID is an attractive thought; I would very much like to self-identify as a life Peer at the moment.
My Lords, Amendments 10 and 12 seek to amend Clauses 56 and 58, which form part of the national underground asset register provisions. These two minor, technical amendments address a duplicate reference to “the undertaker’s employees” and replace it with the correct reference to “the contractor’s employees”. I reassure noble Lords that the amendments do not have a material policy effect and are intended to correct the drafting. I beg to move.
My Lords, I thank the Minister for these two technical amendments. I take this opportunity to thank him also for responding to correspondence about LinesearchbeforeUdig and its wish to meet government and work with existing services to deliver what it describes as the safe digging elements of the NUAR. The Minister has confirmed that the heavy lifting on this—not heavy digging—will be carried out by the noble Baroness, Lady Jones, on her return, which I am sure she will look forward to. As I understand it, officials will meet LinesearchbeforeUdig this week, and they will look at the survey carried out by the service. We have made some progress since Committee, and I am grateful to the Minister for that.
Like the noble Lord, Lord Clement-Jones, I am not going to try to better the excellent speech made by the noble Viscount, Lord Colville.
We debated at length in Committee the definition of scientific research, as it will dictate the breadth of the consent exemption for data reuse. If it is too broad, it could allow data companies—I am thinking specifically of AI programs—to justify data scraping without obtaining consent, should they successfully argue that it constitutes scientific research. However, should we create too narrow a definition, we could stifle commercial research and innovation. This would be disastrous for economic growth and the UK science and technology sector, which is one of our most dynamic sectors and has the potential to become one of the most profitable. We should be looking to support and grow it, not hinder it. Finding the happy medium here is no small feat, but the amendment tabled by the noble Viscount, Lord Colville of Culross, goes a long way towards achieving this by threading the needle.
By requiring the research to be in the public interest to qualify for the consent exemption for data reuse, we will prevent companies cloaking purely commercial activities for their own ends in the guise of scientific research, while allowing commercial research which will benefit the general public.
This particularly chimes with my time as Health Minister, when we tried to ensure that we could bring the public with us on the use of their health data. We did a lot of focus groups on all of this, and we found that we could have very widespread—70%-plus—public support if we could demonstrate that there really was a medical research benefit from all of this. This amendment is very much in keeping with that. As I say, it threads the needle. That is why we will be strongly supporting the amendment tabled by the noble Viscount, Lord Colville, and we hope he is minded to put the matter to a Division.
I am grateful to the noble Viscount, Lord Colville, for his amendment and his engagement on this matter. I fully agree with the importance of ensuring that the term “scientific research” is not abused. Clause 67 will help avoid the misuse of the term by introducing a test of whether the research could reasonably be described as scientific. By explicitly requiring a reasonableness test, which is a well-known part of law, the provision is narrowing not broadening the current position.
I will speak first to government Amendment 40, tabled in my name, concerning the ICO’s duty relating to children’s personal data. Before that, though, I thank the noble Lords, Lord Stevenson and Lord Russell, the noble Baroness, Lady Harding, and in particular the noble Baroness, Lady Kidron, for such considered debates on this incredibly important issue, both in today’s discussion in the House and in the meetings we have had together. Everyone here wants this to be effective and recognises that we must protect children.
The Government are firmly committed to maintaining high standards of protection for children, which is why they decided not to proceed with measures in the previous Data Protection and Digital Information Bill that would have reduced requirements for data protection impact assessments, prior consultation with the ICO and the designation of data protection officers. The ICO guidance is clear that organisations must complete an impact assessment in relation to any processing activity that uses children’s or other vulnerable people’s data for marketing purposes, profiling or other automated decision-making, or for offering online services directly to children.
The Government also expect organisations which provide online services likely to be accessed by children to continue to follow the standards on age-appropriate design set out in the children’s code. The noble Baroness, Lady Kidron, worked tirelessly to include those provisions in the Data Protection Act 2018 and the code continues to provide essential guidance for relevant online services on how to comply with the data protection principles in respect of children’s data. In addition to these existing provisions, Clause 90 already includes a requirement for the ICO to consider the rights and interests of children when carrying out its functions.
I appreciate the point that the noble Baroness made in Committee about the omission of the first 10 words of recital 38 from these provisions. As such, I am very happy to rectify this through government Amendment 40. The changes we are making to Clause 90 will require the Information Commissioner, when carrying out its regulatory functions, to consider, where relevant, the fact that children merit special protection with regard to their personal data. I hope noble Lords will support this government amendment.
Turning to Amendment 15 from the noble Baroness, Lady Kidron, which excludes children’s data from Clause 68, I reassure her that neither the protections for adults nor those for children are being lowered. Clause 68 faithfully transposes the existing concept of giving consent to processing for an area of scientific research from the current recital. This must be freely given and be fully revocable at any point. While the research purpose initially identified may become more specific as the research progresses, this clause does not permit researchers to use the data for research that lies outside the original consent. As has been highlighted by the noble Viscount, Lord Camrose, excluding children from Clause 68 could have a detrimental effect on health research in children and could unfairly disadvantage them. This is already an area of research that is difficult and underrepresented.
I know that the noble Baroness, Lady Kidron, cares deeply about this, but the fact is that if we start to make research in children more difficult—for example, if research on children with a particular type of cancer found something in those children that was relevant to another cancer, this exclusion would preclude the use of that data—that cannot be right for children. It is a risk to exempt children from this part of the Bill.
Amendment 16 would prevent data controllers from processing children’s data under the new recognised legitimate interests lawful ground. However, one of the main reasons this ground was introduced was to encourage organisations to process personal data speedily when there is a pressing need to do so for important purposes. This could be where there is a need to report a safeguarding concern or to prevent a crime being committed against a child. Excluding children’s data from the scope of the provision could therefore delay action being taken to protect some children—a point also made in the debate.
Amendment 20 aims to prohibit further processing of children’s personal data when it was collected under the consent lawful basis. The Government believe an individual’s consent should not be undermined, whether they are an adult or a child. This is why the Bill sets out that personal data should be used only for the purpose a person has consented to, apart from situations that are in the public interest and authorised by law or to comply with the UK GDPR principles. Safeguarding children or vulnerable individuals is one of these situations. There may be cases where a child’s data is processed under consent by a social media company and information provided by the child raises serious safeguarding concerns. The social media company must be able to further process the child’s data to make safeguarding referrals when necessary. It is also important to note that these public interest exceptions apply only when the controller cannot reasonably be expected to obtain consent.
I know the noble Baroness, Lady Kidron, hoped that the Government might also introduce amendments to require data controllers to apply a higher standard of protection to children’s data than to adults’. The Government have considered Amendment 22 carefully, but requiring all data controllers to identify whether any of the personal data they hold relates to children, and to apply a higher standard to it, would place disproportionate burdens on small businesses and other organisations that currently have no way of differentiating age groups.
Although we cannot pursue this amendment as drafted, my understanding of the very helpful conversations that I have had with the noble Baroness, Lady Kidron, is that she intended for this amendment to be aimed at online services directed at or likely to be accessed by children, not to every public body, business or third sector organisation that might process children’s data from time to time.
I reassure noble Lords that the Government are open to exploring a more targeted approach that focuses on those services that the noble Baroness is most concerned about. The age-appropriate design code already applies to such services and we are very open to exploring what further measures could be beneficial to strengthen protection for children’s data. This point was eloquently raised by the noble Baronesses, Lady Harding and Lady Kidron, and the noble Lord, Lord Stevenson, and is a conversation that we would like to continue. Combined with the steps we are taking in relation to the new ICO duty, which will influence the support and guidance it provides for organisations, we believe this could drive better rates of compliance. I would be very pleased to work with all noble Lords who have spoken on this to try to get this into the right place.
I turn to Amendment 27, tabled by the noble Baroness, Lady Kidron. I agree with her on the importance of protecting children’s rights and interests when undertaking solely automated decision-making. However, we think this amendment, as currently drafted, would cause operational confusion as to when solely automated decision-making can be carried out. Compliance with the reformed Article 22 and the wider data protection legislation will ensure high standards of protection for adults and children alike, and that is what we should pursue.
I now turn to Amendment 39, which would replace the ICO’s children’s duty, and for which I again thank the noble Baroness, Lady Kidron, and the noble Lord, Lord Russell. As a public body, the ICO must adhere to the UK’s commitment to the UN Convention on the Rights of the Child, and we respectfully submit that it is unnecessary to add further wording of this nature to the ICO’s duty. We believe that government Amendment 40, coupled with the ICO’s principal objective to secure an appropriate level of protection, takes account of the fact that the needs of children might not always look the same.
Finally, to address Amendment 45, the Government believe that the Bill already delivers on this aim. While the new annual regulatory action report in Clause 101 will not break down the activity that relates to children, it does cover all the ICO’s regulatory activity, including that taken to uphold the rights of children. This will deliver greater transparency and accountability on the ICO’s actions. Furthermore, Clause 90 requires the ICO to set out in its annual report how it has complied with its statutory duties. This includes the new duty relating to children.
To conclude, I hope that the amendment we tabled today and the responses I have set out reassure noble Lords of our commitment to protect children’s data. I ask noble Lords to support the amendment tabled in my name, and hope that the noble Baroness, Lady Kidron, feels content to withdraw her own.
Before the Minister sits down, I have some things to say about his words. I did not hear: “agree to bring forward a government amendment at Third Reading”. Those are the magic words that would help us get out of this situation. I have tried to suggest several times that the Government bring forward their own amendment at Third Reading, drafted in a manner that would satisfy the whole House, with the words of the noble Viscount, Lord Camrose, incorporated and the things that are fundamental.
I very much admire the Minister and enjoy seeing him in his place but I say to him that we have been round this a few times now and a lot of those amendments, while rather nerdy in their obsession, are based on lived experience of trying to hold the regulator and the companies to account for the law that we have already passed. I am seeking those magic words before the Minister sits down.
I have likewise enjoyed working with the noble Baroness. As has been said several times, we are all working towards the same thing, which is to protect children. The age-appropriate design code has been a success in that regard. That is why we are open to exploring what further measures can be put in place in relation to the ICO duty, which can help influence and support the guidance to get that into the right place. That is what I would be more than happy to work on with the noble Baroness and others to make sure that we get it right.
I am presuming a little here that the Minister’s lack of experience in the procedures of the House is holding him back, but I know he is getting some advice from his left. The key thing is that we will not be able to discuss this again in this House unless he agrees that he will bring forward an amendment. We do not have to specify today what that amendment will be. It might not be satisfactory, and we might have to vote against it anyway. But the key is that he has to say this now, and the clerk has to nod in agreement that he has covered the ground properly.
We have done this before on a number of other Bills, so we know the rules. If the Minister can do that, we can have the conversations he is talking about. We have just heard the noble Baroness, Lady Kidron, explain in a very graceful way that this will be from a blank sheet of paper so that we can build something that will command the consensus of the House. We did it on the Online Safety Bill; we can do it here. Please will he say those words?
I am advised that I should say that I am happy for the amendment to be brought forward, but not as a government amendment. We are happy to hear an amendment from the noble Baroness at Third Reading.
Let us be quite clear about this. It does not have to be a government amendment, but the Government Minister has to agree that it can be brought forward.
(2 weeks, 6 days ago)
Lords Chamber
My Lords, I also welcome this plan, perhaps with rather less baggage than the Conservative Benches. The Prime Minister and the Secretary of State invoked Babbage, Lovelace, Turing, the pioneering age of steam and even the white heat of the technological revolution, but at its core there is an important set of proposals with great potential. However, it is a wish list rather than a plan at present.
I particularly welcome the language in the plan around regulation, particularly where it refers to regulation assisting innovation, which is a change of tone. However, the plan and Statement raise many questions. In particular, how will the Government ensure that AI development mitigates risks beyond just safety to ensure responsible AI development and adoption, especially given the fact that a great deal of UK development will involve open-source applications?
On the question of the introduction of AI into the public sector, the Government are enormously enthusiastic. But, given their public sector digital transformation agenda, why are the Government watering down citizens’ rights in automated decision-making in the Data (Use and Access) Bill?
We welcome the recognition of the need to get the economic benefits for the UK from public sector data which may be used to develop AI models. What can the Minister tell us at this stage about what the national data library will look like? It is not clear that the Government yet know whether it will involve primary or secondary legislation or whatever. The plan and response also talk about “sovereign compute”, but what about sovereign cloud capability? The police cannot even find a supplier that guarantees its records will be stored in the UK.
While the focus on UK training is welcome, we must go beyond high-level skills. Not only are the tech companies calling out for technical skills, but AI is also shaping workplaces, services and lives. Will the Digital Inclusion Action Committee, chaired by the noble Baroness, Lady Armstrong, have a role in advising on this? Do the changes to funding and delivery expected for skills boot camps contribute to all of this?
On the question of energy requirements for the new data centres, will the new AI energy council be tasked with ensuring that they will have their own renewable energy sources? How will their location be decided, alongside that of the new AI growth centres?
The plan cannot be game-changing without public investment. It is about delivery, too, especially by the new sovereign data office; it cannot all be done with private sector investment. Where is the public money coming from, and over what timescale? An investment plan for compute is apparently to be married to the spending review; how does a 10-year timescale fit with this? I am very pleased that a clear role is identified for the Alan Turing Institute, but it is not yet clear what level of financial support it will get, alongside university research, exascale compute capacity, and the British Business Bank in the spin-out/start-up pipeline support.
The major negative in the plan for many of us, as the Minister already knows, is the failure to understand that our creative industries need to be able to derive benefits from their material used for training large language models. The plan ominously recommended reforming,
“the UK text and data mining regime so that it is at least as competitive as the EU”,
and the Government have stacked the cards in the consultation over this. We on these Benches and the creative industries will be fighting tooth and nail any new text and data mining exemption requiring opt-out.
My Lords, I anticipated that this Statement would attract interest from Members of this House, and I thank the noble Lords, Lord Markham and Lord Clement-Jones, for their comments and their broad welcoming of the report. I will try to respond to as many points as I can, but first I will reiterate the importance of this announcement.
Through the publication of the AI Opportunities Action Plan and the Government’s response, we are signalling that our ambition is high when it comes to embracing the opportunities presented by AI. This is a plan to exploit the economic growth that AI will bring and to drive forward the Government’s plan for change. Training the UK’s workforce is a key part of the plan, and there are steps with clear timelines as to when we will do that. I will come back to training a little later.
We need to diffuse AI technology across the economy and public services for better productivity and opportunity, and embrace the transformational impact it is going to have on everyday lives, from health and education to business and government services.
As has rightly been pointed out, AI is advancing at an extraordinary pace. That is why you will see in this response very tight timelines for actions. The one that was picked out on training, which is 2027, is only one part of the response; you will see that Skills England is due to report very shortly with the first phase of its recommendations and will follow that in autumn with further work. So most of the timelines are very tight, recognising the challenge that the pace of advancement in AI brings.
The benefits extend far beyond economic growth. It is the catalyst that we need for a public service revolution, including, of course, in the NHS. It will drive growth and innovation and deliver better outcomes for citizens. It also lies at the heart of two important missions for the Government: kick-starting economic growth and delivering an NHS fit for the future. By investing in AI now, we are ensuring that the UK is prepared to harness the transformational potential that undoubtedly exists. This will improve the quality and delivery of public services. The plan is a way to do that with real speed and ambition.
The issue of regulation has been raised and there is no doubt that the regulatory environment will be critical in driving trust and capitalising on the opportunities that the technology offers. By bringing forward the recommendations in the plan, we will continue to support the AI Safety Institute and further develop the AI assurance ecosystem, including the small companies that will arise as a result, to increase trust in and adoption of AI.
The Government are committed to supporting regulators in evaluating their AI capabilities and understanding how they can be strengthened. Part of this is the role of the regulatory innovation office. The vast majority of AI should be regulated at the point of use by the expert regulators, but some relates to fast-evolving technology. That is why we will continue to deliver on manifesto commitments by placing binding requirements on the developers of the most powerful AI models. Those commitments will build on the work that has already been done at the Seoul and Bletchley AI safety summits and will be part of strengthening the role of the AI Safety Institute. This issue of making sure that we get the safety side of this right as we develop opportunities is of course key.
The question of copyright was raised by the noble Lord, Lord Clement-Jones, and I know that this is an extremely hot issue at the moment, which will be discussed many times over the next few days and weeks. The Government have issued a consultation, in which there are three principles: the owners of copyright should have control; there should be a mechanism to allow access to data to enable companies to develop their models in the UK, rather than elsewhere in the world; and, critically, there must be transparency. Where does the data flow and how can you work out the input from the output? Those three areas are a key part of the consultation and the consultation is crucial. We have a session planned for next week to go through this in some detail, and I invite and welcome all noble Lords to it, because getting this right will be important for the country. I look forward to discussing those proposals over the next few days and weeks.
Delivering the AI Opportunities Action Plan will require a whole-of-government effort. We are starting that work immediately to deliver on the commitments, build the foundations for AI growth, drive adoption across the economy and build UK capability. We are already expecting initial updates on a series of actions by this spring. For instance, DSIT will explore options for growing the domestic AI safety market and will provide a public update on this by spring this year.
Turning to some of the very specific points, I completely agree that training is crucial and we have to get it right. There are several recommendations and, as I said, the earliest will give a readout this spring. I do understand that this is not something that can wait until 2027; it has to start immediately.
It is important to lay out for the House the situation with compute. This spring, there will be access to two new major compute facilities for AI: Dawn in Cambridge and Isambard-AI in Bristol. When fully active this year, they will increase the AI compute capacity something like thirtyfold, instantly. Those are the types of compute infrastructure that are needed. It is AI-specific compute infrastructure. It is not the case that this is only a plan for the future; it is happening now, and those compute infrastructures will be used by academia, SMEs and others over the course of the year and beyond. The plan beyond that is to increase the compute infrastructure twentyfold by 2030. That requires a 10-year plan and for us to think into the future about what will be needed for us to be at the forefront of this. Exascale of course is different; it is being looked at as part of that, but it is not the same.
On energy, the noble Lord recognises that one of the most difficult things in government is to join up across departments. That is why the AI energy council is so important.
The national data library will be essential. I welcome the offer of help on health from the noble Lord, Lord Markham, and I will certainly take him up on that; this is an important area to look at. Noble Lords will be hearing much more about the national data library over the next few months. I completely agree that, as we develop this technology, we will need to ensure that citizens’ rights are properly protected. That is something that we will continue to discuss as part of the Data (Use and Access) Bill, among other issues.
Funding will be picked up; it is a fully funded programme, but then we will need to go into a spending review, as Governments always have to.
I will wrap up there to leave plenty of time for others to ask questions, but I hope that I have addressed some of the initial questions.
My Lords, on behalf of the Communications and Digital Select Committee of your Lordships’ House, I am pleased to welcome the AI Opportunities Action Plan, with the exception of the recommendation that relates to copyright. We will come back to that. It is important to emphasise the extent to which change will be necessary to deliver on this plan. In particular, the Government have to acknowledge a change in mindset across Whitehall and the public sector.
Perhaps I could ask the Minister how the Government will ensure that the action plan benefits UK start-ups and scale-ups and does not entrench market dominance by the established players in this area.
I thank the noble Baroness for her input to date and on the important copyright issue. The question of market dominance is important. It is worth reflecting that Matt Clifford is an entrepreneur who deals with start-ups; the report is very strong on start-ups and what needs to be done to make sure that they are part of this, including what regulatory change needs to take place to encourage start-ups to do this. At the moment, it is quite difficult for them to navigate the system, including procurement. Government procurement is notoriously difficult for start-ups, and many of the specific aims of the plan pull that together to allow start-ups to access government procurement plans.
So there are very clear ambitions here to make this about growing an ecosystem of companies in this country, while recognising that many of the existing major companies, with which we will also have to work, are not here. Driving this forward will be a key task for DSIT right the way across government. It will need all-of-government activity, as outlined in the report.
My Lords, the Minister talked about the national data library, which is very welcome, but data in the library needs to be safe and its use carefully thought through. What role does the Minister think public interest thresholds should play in deciding what data is collected and how it should be used?
Noble Lords will hear much more about the national data library over the coming months, but it is important to recognise that data is valuable only if it is collected well, curated properly and is interoperable and accessible. We need to ensure that it is properly protected, both for individual privacy, which is the point the noble Lord raises, and to make sure that we get the appropriate valuation of the data and that that value flows back into the UK and into public services. These will all be key features of the national data library.
My Lords, I welcome the Statement, but I draw my noble friend’s attention to the element which refers to the “immense” energy used by this new technology. Is the AI energy council already in the process of estimating the quantity of energy required, and am I right in thinking that the data centres will be placed around the country in locations that enable them to have access to sufficient energy for them to work?
My noble friend is quite right. The energy issue is crucial for any plan for AI, and that is why the energy council is being set up. It is precisely why Culham is the first place identified; it has a significant energy supply already. We anticipate that the centres will be based around the country in places where there is renewable energy or where other sources of energy can be accessed easily in order to provide the power the centres require. It is also important that the council looks at the overall environmental impact, which will be part of this.
On energy consumption, it is known what is required for a single data centre and, as we need multiple data centres, the type and amount we will require is known. It is crucial that this is done on top of everything else that the energy is required for. This is a big and difficult problem, but we can already see an answer to it with the first identification of a site for the AI growth zone.
My Lords, I declare my technology interests as set out in the register. I welcome the plan; it has 50 excellent recommendations, but does the Minister not agree that to bring these to life we need an arrowhead focus from government on broad AI legislation—much broader than what is currently planned—that includes an AI authority that is agile, nimbly focused and horizontally applicable; AI-responsible officers; the protection of creatives; and right-sized regulation that is good for citizens, innovators and consumers, in order to deliver according to the fundamental truth that these are our data, our decisions and our AI futures?
I certainly agree that it is a significant challenge, and I add one other thing. The challenge is not only one of regulation of procurement and making sure that we have the data systems correct; it is one of making sure that we actually deliver, rather than talking about it. Delivery will be key, and we need a proper mechanism to deliver this in the form of a mission with real delivery outcomes. That is why I was pleased to see that we have very tight timelines on all the recommendations in the report. We must make sure that that happens and, as we do so, that we bring in the other necessary controls and actions to propel every part of this, from funding start-ups right the way through to procurement, and, as the noble Lord said, ensuring that we look after the privacy and autonomy of the data.
My Lords, the Minister acknowledged the importance of the data collected being interoperable and very reliable. With that in mind, what discussions has he had with the First Ministers of Wales and Scotland to ensure that data such as NHS data is collected in a fashion that is comparable and therefore usable?
Clearly, this is a UK-wide issue. I am pleased that Scotland has been at the forefront of data in health for many years and has done an extremely good job of getting that into the right place. As we develop the national data library, these questions of data collection, interoperability, curation—which is incredibly important—and systems to ensure privacy and protection will be discussed widely right across the UK. We need to make sure that everything is interoperable, otherwise we will undo the value that we are creating.
My Lords, I welcome the Minister’s focus on delivery, which is vital if we are to make an impact in AI. I say with the greatest respect to my noble friend Lord Holmes that legislation is the last thing we need. The coalition Government’s experience with the Government Digital Service was to find that we made rapid progress before powers were devolved down to individual departments, which then did everything in their power to make sure that nothing worked. While the Minister focuses on the delivery of the AI action plan, could he sort out the confusing quango landscape that now exists after 14 years of endless initiatives, and perhaps have a central function which relentlessly pushes through this excellent plan?
I thank the noble Lord very much. I will not add to his comments about the 14 years of endless initiatives, but it is crucial that when we do something such as this, we do it properly. Obviously, my experience was in setting up the Vaccine Taskforce to do just that, and this is the same sort of problem. We have to get everybody across government working on this; there is a big delivery task. Delivery should be our focus and we should keep holding ourselves to account for timelines and deliverables.
My Lords, Monday’s Statement on the AI Opportunities Action Plan highlighted the Government’s ambitious vision on AI adoption across the UK, and I welcome it. While the plan outlines significant investment and initiatives to boost AI infrastructure and capabilities, there are concerns about how SMEs will fare in this rapidly evolving landscape, which is largely dominated by the big tech companies. Recent data shows that only 25% of SMEs are currently using AI, despite 42% of them wishing to use it to increase their productivity. However, these small companies often lack the resources and the expertise to fully benefit from AI adoption. What specific measures will be implemented to protect SMEs from being squeezed out by the larger AI companies, and how will the Government facilitate meaningful collaborations between SMEs and the AI giants to foster the innovations and maintain a diverse, competitive AI infrastructure?
I thank my noble friend. There are two different aspects to his SME question—the SME use of AI, and the AI SMEs—and both are dealt with in the report, I think. Many of the recommendations indicate what would be done, but I will outline some of the points on SMEs for AI. There is an important join-up task to be undertaken, which is part of what this plan does: from the things we fund at the beginning of the process, such as grants from Innovate UK to get companies off the ground, through supporting that funding via the BBB and beyond, to linking to regulation to make it as simple as we can to enable innovation, and linking in turn to procurement to ensure that there are procurement signals that allow these companies to get the investment to grow and to scale into the companies they could be.
On the adoption side, there is a specific group working on adoption of AI technologies across the UK and a report is due out by the Government Chief Scientific Adviser and the National Technology Adviser on adoption of technologies more broadly, which is about ensuring that we get uptake of new technologies in companies. We know that we have a long tail of companies that do not do that in the UK, and it will be an important part of making sure that the entire economy benefits.
My Lords, I am sure the Minister has noted that the Statement he has given us has a certain flavour of the 1960s about it, with the talk of harnessing the “white heat” of revolution, and all that, but from the point of view of those of us who went through that period, it might be helpful to know one or two of the things that went wrong, because it did not end terribly happily the last time we had this revolution of white heat. The problem then was that the Government’s PR people became a little too enthusiastic, and the Minister might discourage them today from phrases about seizing the future, embracing this, that and the other, and other generalities, of which there were plenty last time, but none of them led to the results that people wanted.
There is a repeat of the old fallacy that the Government deliver growth. They do not. We know that the Government can facilitate growth and can stop growth, and certainly that has happened in the past, but the idea that the Government alone are somehow going to lead, rather than develop entirely new relationships with the private sector as the digital age demands, is one that needs to be examined carefully before the Government rush into more mistakes.
There is another problem, which the noble Viscount, Lord Stansgate, reminded me of—it was not quite so intense then but it is intense now. This whole revolution and the data centres demand enormous amounts of electricity—far more than seems to be planned by the energy department. It talks about 200 gigawatts, moving up from 65 gigawatts, but data centres can drink whole communities’ electricity, just like that. The Statement mentioned 500 megawatts, but we are really talking about gigawatts of a kind for which no planning is in place at the moment. Can we be assured that the SMR side of the Government’s energy transition gets a push? Will the Minister talk to the energy people and tell them that, unless they bring forward the SMR revolution, which is going on in many other countries, and go slow on the white elephant technologies such as Sizewell C—
I thank the noble Lord for his enthusiasm for the white heat of SMRs, which is an important point. There is a very clear set of recommendations, from an entrepreneur who understands how to set up and run companies. The approach is one of ensuring that there is funding for start-ups, innovation, regulatory clearance and a procurement pool, which are exactly the types of things that will deliver growth. They are facilitators of growth, because the noble Lord is right that growth comes from the private sector. That is what must be supported and that is what this plan aims to do.
On the power supply, I have already said that the join-up between DSIT and DESNZ in the energy council is exactly the right approach to make sure that we get a joined-up government approach to this. I suspect that it will require SMRs, among other approaches to getting energy in the right place.
My Lords, I draw the Minister’s attention to The AI Mirror, a book by Shannon Vallor, who holds a chair at the Edinburgh Futures Institute. It makes the crucial point that generative so-called artificial intelligence is not intelligent or creative but only reflects back to us—hence the mirror metaphor—what we have previously created. Will the Government acknowledge that one of the great risks of the explosion in the use of AI is stagnation—a building in and entrenching of the discrimination, racism and inequalities that already exist across our public and private systems, as was infamously demonstrated in Australia in the Robodebt scandal?
It is important to recognise that there is more than one type of AI, including generative AI and specific models. It is the case that AI is very dependent on the data put in, and there are risks of bias being entrenched. That is an important safety issue that must be looked at and that we must be aware of. On whether it is intelligent, the answer is that we are not in the era of general artificial intelligence but at an earlier stage. These are not yet fully intelligent machines. Whether they get to that and over what time period is something of an unknown, but we are in an era where we can do pretty remarkable things, and we should harness that.
My Lords, the Minister will be aware that there has been a tendency for high-tech and research investment to go overwhelmingly to the south and east of England in recent decades. I want to underline the regional dimension of AI. The supercomputer was going to be in Edinburgh, which has an excellent computing faculty and a large element of highly trained people. Leeds and Manchester also have useful workforces already trained for this. The renewable energy and the water—which I understand is necessary to cool these computers—are much more easily available in the north and west of the United Kingdom than in the south and east. Can the Minister ensure, to the best of his ability, that we do not yet again have facilities built in the south and east of England, thus increasing the pressure on housing and everything else in the south and east and leaving the north and west in poverty?
I absolutely assure the noble Lord that he will see growth zones in those areas. They will not be concentrated in the south-east. The reason the first one happened to be in Culham was to do with the immediacy of potential private sector interaction and the power supply. On the compute facility in Edinburgh, ARCHER2, the very important computer there, will be extended to the end of 2026, and we are looking actively at what happens next. I reiterate that that computer is not primarily about AI, although it will have AI capabilities.
My Lords, I reassure my noble friend on the Liberal Democrat Benches that he should not worry too much about this. In September, I spent a significant amount of time in Ayrshire, in the company of a representative of one of the largest asset managers in the world. They were looking for a site in Ayrshire, thankfully, for what has become known as critical compute infrastructure. I was in the company of the local Member of Parliament, who was very keen to get this infrastructure there. In the first conversation we had with this investor, it was clear that access to energy was the most important factor as to whether we got this substantial investment. It was equally clear that global competition for this sort of investment was going to be dependent on the comparative rollout of newer advanced reactors.
We have a particular problem with this in Scotland. The current Scottish National Government are in opposition to building new nuclear power stations. When they were in coalition with the Scottish Greens, the position of the Scottish Greens was that there was nothing safe or secure about nuclear power. The point is that the new advanced reactors are much safer than they were. Will the Government, and the Minister in particular, come to Scotland to talk to SNP politicians and explain that this nuclear power is much safer, and that investment in it will bring this sort of investment into the country, so that we will not be left behind?
I reiterate that SMRs are part of the solution to this: they have lower core power, operate at lower pressure, carry a proportionately larger volume of coolant, and have safety advantages over traditional approaches. That will be made clear. That is why the AI energy council is so important: to make sure that this is properly thought through and that we get these reactors in the right place to support the data centres that are required.
(2 weeks, 6 days ago)
Lords Chamber
To ask His Majesty’s Government what assessment they have made of the implications for online safety posed by small, high-risk online platforms, such as 8Chan.
The Government are extremely concerned about the impact of small but risky services that host hateful and harmful content. The Online Safety Act will require such services to remove illegal content and, where relevant, protect children from legal but harmful material. Ofcom has established a Small but Risky supervision task force in recognition of their unique risks. The regulator will identify, manage and enforce against such services where they fail to comply with their duties.
I thank my noble friend the Minister for his Answer, but will he set out whether the Government expect Ofcom to take enforcement action against small but high-harm sites that are identified as problems? Have they made an assessment of the likely timescales for enforcement action, including the use of service disruption measures?
I thank my noble friend for that important question. Where there is evidence of non-compliance, Ofcom has set out that it will move quickly to enforcement, and that action will follow in spring this year, because companies will have had three months to get their positions sorted out; I think that 16 March is the date by which they have to do it. Ofcom will be able to apply fines, including levies on global revenue, and it will be able to apply to the courts for business disruption measures, with the flexibility to submit those applications urgently.
My Lords, the Minister’s response is somewhat baffling. Given the amendment made to the Bill as it passed through the House, moved by the noble Baroness, Lady Morgan, it was quite clear that high-risk smaller platforms would be included in category 1 and bear all the consequences. Yet, despite the Secretary of State’s concerns, which were expressed in a letter last September, the Government have not insisted that Ofcom include those platforms in category 1. What does that mean? Why are the Government not taking proper legal advice and insisting that these smaller, high-risk platforms bear all the duties of category 1 services?
I thank the noble Lord for his question. Category 1, in the way that the Bill was ultimately approved, was for large sites with many users. The possibility remains that this threshold can be amended. It is worth remembering that category 1 imposes two additional duties: a duty that the company must apply its terms of service properly and a duty to give users the ability to choose not to see certain content. For many of the small and harmful sites, those things would not apply anyway, because users have gone there deliberately to see what is there, but the full force of the Act applies to those small companies, which is why there is a special task force to make sure that it is applied properly.
My Lords, Ofcom’s illegal harms code states that it has removed some of the code’s measures from smaller sites, due to evidence that they were not proportionate, but it is not clear which measures have been removed and why. Can the Minister provide further detail on which small sites are impacted and what measures they will not be required to follow?
My understanding is that the Online Safety Act applies to all small companies and nobody is exempt. The things that would not apply are the specific duties in category 1, or indeed in categories 2A and 2B, which concern the ability to apply and monitor terms of service and the ability to ensure that users can exempt themselves from seeing certain content. Those would not apply, but everything else does, including all the force of the Act in its application to illegal content and the priority harms that have been identified.
I must admit that, probably like many noble Lords, I had to do a bit of research into 8chan and the others as part of this. In fact, I got a bit worried that I might get into trouble doing it on House of Lords servers. What I saw was that, before 8chan, there was 2chan and then 4chan, and 8chan is now 8kun. It is like whack-a-mole: while we can try to do all the technical moves, it is very difficult. So, coming at it from the other end of the telescope, the user end, I think we have done a lot of good things on getting anti-fraud messaging out, and I wonder whether there are lessons we can learn from that to educate and equip young people, teachers and parents so that they are aware, attacking the problem from that end as well.
I hope the noble Lord does not get caught out by his search terms. Of course, he is absolutely right that part of this is about education and making people aware of what is there. I suspect that, as this gets introduced over the course of this year and enforcement starts, awareness will rise, and it will be incredibly important to include education as well.
My Lords, will my noble friend the Minister kindly tell the House how the Government can ensure that the people who are putting their dates of birth online are actually the people to whom those dates of birth belong? How do we ensure that accuracy?
I thank my noble friend for his question. I am not able to give him a technical answer on exactly how that is done. There are verification systems in place to ensure that, and indeed there are more detailed verification systems coming online in terms of children’s ages. That is something that Ofcom is pursuing, but I will find a more detailed answer for him.
My Lords, the Minister quite rightly mentioned children in his initial Answer, and we all want to protect children primarily, but will he also recognise the harm that can be done to vulnerable adults? I think particularly of those with addiction problems or eating disorders and of people with learning disabilities, who are not as safe online as we are. Can he say whether the Government have made an assessment of the different types of harms found on these smaller sites that fall outside the regulations? Have they broken down this type of harm by distinct categories, and will they make this information available?
The so-called Small but Risky task force, which was set up in response to an exchange of letters between the Secretary of State and the CEO of Ofcom, is undertaking a review of all the risks posed by these smaller services. I do not know whether it has broken them down into the categories suggested by the noble Baroness, but I think that is an extremely good idea and I hope it will do so, because it is an important activity.
My Lords, having recognised the Herculean task that Parliament has given Ofcom in terms of regulating platforms—Ofcom is set to become probably the world’s most formidable regulator in this space, with commensurate expertise—I will trot out a quick cliché and say, let us not allow the best to be the enemy of the good but support Ofcom as it navigates this very complex environment. Picking up what the Minister mentioned earlier about education, can he update the House on Ofcom’s plans for what is clunkingly called “media literacy”, because prevention is better than cure and the more we can educate children, and indeed adults, on the perils of the internet and how to navigate it safely, the better it will be? It seems almost to be a bit of an orphan within Ofcom’s responsibilities.
I think the noble Lord is right that Ofcom has a very large task ahead of it. It is a very professional organisation and one that takes all its duties very seriously. I cannot comment in detail on what it is doing on media literacy, but I know that that is part of what it intends to do. I will pick up on something else he said: the urgency now is to get this implemented, and the danger is that we add lots of things to it now. We must get on and do this; it is very important to get it working. We know that enforcement starts just after March and that the new codes for children will come out in early summer. The key priority is to get this moving and to work out how to stop the really unacceptable activity that goes on on some of these sites.
My Lords, is the Minister sensitive to the dangers to free speech of overfetishising online safety and to the censorship recently admitted to by the head of Meta, Mark Zuckerberg? This is all under the cloak of Governments demanding a clampdown on online harms. Are the Government advising Ofcom to ensure that any overzealousness, however well intentioned, is reined in for the protection of free speech in a democratic society?
Well, the issue of Meta is one for the US. It applies there and not here. The rules of the Online Safety Act apply across all companies and we expect all companies to adhere to them. They are carefully calibrated and designed to ensure the safety of users and to protect them from sometimes disgraceful content.
(3 weeks ago)
Lords Chamber
That the Bill be considered on Report in the following order:
Clauses 1 to 56, Schedule 1, Clauses 57 and 58, Schedule 2, Clauses 59 to 65, Schedule 3, Clauses 66 to 70, Schedule 4, Clause 71, Schedule 5, Clauses 72 to 80, Schedule 6, Clauses 81 to 84, Schedules 7 to 9, Clauses 85 to 102, Schedule 10, Clauses 103 to 107, Schedule 11, Clauses 108 to 111, Schedule 12, Clauses 112 and 113, Schedule 13, Clauses 114 and 115, Schedule 14, Clauses 116 to 119, Schedule 15, Clause 120, Schedule 16, Clauses 121 to 138, Title.
(2 months ago)
Lords Chamber
My Lords, I refer to my interest as chair of the National Preparedness Commission and beg leave to ask the Question standing in my name on the Order Paper.
We are working closely with international partners following the breakage of two subsea telecommunications cables in the Baltic Sea a fortnight ago. It is important that we let those investigations run their course. Subsea cables are critical to the UK’s telecommunications and digital infrastructure, and we are committed to maintaining and enhancing the security and resilience of that infrastructure. We will continue to co-ordinate with security partners, the subsea cables industry and international bodies on this issue.
My Lords, I am grateful to the Minister for that reply and the recognition of the criticality to the UK of these subsea connections. What consideration are the Government giving to protection and making sure that we can recover quickly in circumstances in which those cables are disrupted or severed? I understand that in Australia, for example, the equivalent of Ofcom requires a licence from those making those connections, and that licence must specify what arrangements are in place for the immediate repair of any severed cable. Are we considering such measures or any others?
I thank my noble friend for that question. There are 64 cable systems that leave the UK, with 116 cables. About 200 cables break every year around the world, and 10 to 20 of those are in the UK. There is a system of payment from the companies for a ship which gives 24-hour, seven-days-a-week coverage for repairs, as well as systems, of course, to get other commercial repairs done at a slower pace. We work closely with others around the world, including the Australians, and are aware of that model. There are rather specific circumstances which mean that, at the moment, that does not work here, but the ability to get ships rapidly to broken cables is important and that is facilitated by the planning arrangements in place.
My Lords, protection is important, but there is no such thing as a perfect defence. Apart from repair, resilience is crucial in this area as in so many others. What stress-testing has been carried out to identify the range of impacts that could result from interruptions to our undersea infrastructure? What measures would be necessary to ameliorate the impact of those interruptions?
The cable system is regularly reviewed. As I said, 10 to 20 cable breaks occur per year, largely as a result of fishing, anchor dragging and undersea landslips. DSIT, the MoD and other parts of the system review this under the national risk assessment to keep looking at what is required for a resilient system.
My Lords, do we have an agreed retaliation doctrine?
As I have said, the view is that, of the 10 to 20 breaks per year, nearly all are due to fishing vessels, anchors and natural events under the sea. Clearly, that is not a retaliation issue. I think the noble Earl is talking about malign attacks, and we have to wait for the outcome of the investigations into the current breaks.
My Lords, the German Defence Minister was quick off the mark in ascribing the damage to sabotage. Do the Government agree? Do they point the finger at the Chinese or the Russians?
At the moment, the answer is neither, because an investigation is being undertaken by Lithuanian, Swedish, Finnish and German ministries to try to understand exactly what went on. Until that report is out, it is premature to speculate.
My Lords, it is not just undersea fibre-optic cables which bring vital supplies to our shores. UK energy security is highly dependent on undersea gas pipelines and electricity interconnectors. Recently, we have seen reports of suspicious Russian ships near Norwegian gas hubs. Pat McFadden has warned of cyberattacks on our energy networks. Can the Minister reassure us that the UK Government are actively working with our allies to provide adequate protection for our undersea energy infrastructure?
I thank the noble Earl for the question. This is an important area. As I have said, most of the breaks are not malign, but there is, of course, that risk. Regular reviews are undertaken as part of the national risk assessment. The MoD works with DSIT and others to look at what the risks are. We also work continuously with partners, including NATO. In 2023, there was a specific NATO action to look at critical undersea infrastructure co-ordination to make sure that a response and detection system was in place.
My Lords, the joint maritime operations centre, together with the embedded national maritime intelligence centre, is able to monitor shipping throughout our EEZ and European waters, so we know where those ships are. We know which Russian ships in particular are involved in this sort of operation. We have now purchased the RFA “Proteus”, which we should be able to get to where these events are happening. When will we get the second ship? When will we be on top of one of these when it is doing something? I do not know about retaliation, but would we then be able to arrest a ship in our EEZ that was damaging one of our cables?
I thank the noble Lord for the question. It is quite a long way from my brief in DSIT. If I may, I shall try to get somebody to answer it for him.
My Lords, according to experts, around 75% of transatlantic undersea cables in the northern hemisphere pass through or near Irish Sea waters. As a country that spends around 0.2% of its GDP on security and defence, the Republic of Ireland does not possess anywhere near the capability to protect them. Has this job fallen to the United Kingdom Government? If so, who is paying the bill?
The detection of breaks is done from land, but the ability to repair them is through an agreement with the commercial companies, which pay into a fund that allows a ship to be on 24/7 standby to provide protection. That is paid for by the companies that put the cables in place.
My Lords, we of course recognise and share the Government’s and the House’s concern about increased Russian military activity around these undersea cables. I was pleased that the Minister referenced a couple of times the risk assessments going on, but can he tell the House a little more and expand on his earlier answers about those risk assessments? How do they take place and how often do they occur?
The national risk assessment is undertaken regularly and led by the Cabinet Office. In this instance, DSIT is the department responsible for the risk to the cables overall, but it is in collaboration with the MoD, the Cabinet Office and others, particularly in relation to assessing risks other than those that I have outlined.
My Lords, are these cables used only for civilian purposes or also for military purposes?
The cables provide the connections that we need for all purposes across telecommunications.
My Lords, may I tempt the Minister again to stray a little from his brief and to return to naval support? At the moment, RFA “Proteus” is the only ship that protects our undersea cable structure. Is his department making representations to the strategic defence review to ensure that a second vessel is purchased?
There are two vessels. The “Sovereign” is the repair vessel I referred to, which the cable companies pay for and which is on standby 24/7 to repair the cables. “Proteus” has a different purpose; it is an MoD vessel that can monitor all underwater structures. It is not a DSIT vessel but an MoD vessel with broad responsibilities.
My Lords, one way to mitigate risk is to have redundancy in the capacity of the cables, but redundancy costs money for the commercial organisations that own those cables. What is DSIT doing to ensure that there is sufficient redundancy to give us the protection that we need?
I thank the noble Lord for a very important question. As I said, there are some 64 cable systems and 116 cables. We have a lot of redundancy in the system. Despite getting 10 to 20 breaks every year, they do not lead to an interruption because of that redundancy. Three things are important for the redundancy: the number of cables, the geographical diversity or spread of the cables—which provides protection—and the 24/7 emergency repair capability, with a planning consent that allows the vessel to get in very quickly.
(2 months, 2 weeks ago)
Lords Chamber
We have heard really wonderful insights and thoughtful contributions from across your Lordships’ House this afternoon, and I am really grateful to the noble Baroness, Lady Stowell, for organising this engaging debate on such an important topic. It is probably the only debate I will take part in that has LLMs, SLMs, exaflops, Eeyore and Tigger in the same sitting.
The excellent report from the Communications and Digital Committee was clear that AI presents an opportunity, and it is one that this Government wish to seize. Although the report specified LLMs and generative AI, as has been pointed out by many, including the noble Lord, Lord Knight, AI is of course broader than just that. It represents a route to stronger economic growth and a safer, healthier and more prosperous society, as the noble Viscount, Lord Camrose, has just said, and we must harness it—it is incredibly important for the country.
Breakthroughs in general-purpose technologies, such as the steam engine, electricity and the internet, are rare, and AI is set to be one such technology. The economic opportunities are already impressive. The AI market contributed £5.8 billion in GVA to our economy in 2023; it employs over 60,000 people and is predicted to grow rapidly in size and value over the next decade. Investing in technology has always been important for growth, and investing in AI is no exception.
Today, already, a new generation of UK-founded companies is ensuring that we are at the forefront of many of these approaches, and leading AI companies have their European headquarters in London. We have attracted significant investment from global tech giants—AWS, Microsoft, CoreWeave and Google—amounting to over £10 billion. This has bolstered our AI infrastructure, supported thousands of jobs and enhanced capacity for innovation.
The investment summit last month resulted in commitments of £63 billion, of which £24.3 billion was directly related to AI investment. The UK currently ranks third globally in several key areas: elite AI talent, the number of AI start-ups, inward investment into AI, and readiness for AI adoption. But we need to go further. In July, DSIT’s Secretary of State asked Matt Clifford to develop an ambitious AI opportunities action plan. This will be published very soon and will set out the actions for government to grow the UK’s AI sector, drive adoption of AI across the economy, which will boost growth and improve products and services, and harness AI’s power to enhance the quality and efficiency of public services. Of course, as was raised early in this debate, this also has to be about creating spin-outs and start-ups and allowing them to grow.
One of the largest near-term economic benefits of AI is the adoption of existing tools to transform businesses and improve the quality of work—a point raised very clearly by the noble Lord, Lord Ranger. AI tools are already being used to optimise complex rotas, reduce administrative burdens and support analytical capabilities and information gathering, and in healthcare to interpret medical scans, giving back more time for exchanges that truly need a human touch. Government will continue to support organisations to strengthen the foundations required to adopt AI; this includes knowledge, data, skills, talent, intellectual property protections and assurance measures. I shall return to some of those points.
In the public sector, AI could unlock a faster, more efficient and more personalised offer to citizens, at better value to the taxpayer. In an NHS fit for the future, as the noble Lord, Lord Tarassenko, set out very eloquently, AI technology could transform diagnostics and ease simpler things, such as administrative burdens, improving knowledge and information flows within and between institutions. It could accelerate the discovery and development of new treatments, and valuable datasets, such as the UK Biobank, will be absolutely essential.
The noble Lord, Lord Tarassenko, rightly identified the importance of building large multimodal models on trusted data and the opportunity that that presents for the UK—a point that the noble Lord, Lord Knight, also raised. Several NHS trusts are already running trials on the use of automated transcription software. The NHS and DHSC are developing guidance to ensure responsible use of these tools and how they can be rolled out more widely.
The noble Lord, Lord Kamall, rightly pointed out the role of the human in the loop as we start to move these things into the healthcare sector. The Government can and should act as an influential customer for the UK AI sector by stimulating demand and providing procurement opportunities. That procurement pool will be increasingly important as companies scale.
DSIT, as the new digital centre of government, is working to identify promising AI use cases and rapidly scale them, and is supporting businesses across the UK to be able to do the same. The new Incubator for Artificial Intelligence is one example.
The Government recently announced that they intend to develop an AI assurance platform, which should help simplify the complex AI assurance and governance landscape for businesses, so that many more businesses can start with some confidence.
Many noble Lords touched on trust, and AI does require trust; it is a prerequisite for adopting AI. That is why we have committed to introducing new, binding requirements on the handful of companies developing the most advanced AI models, as we move towards the potential of true artificial general intelligence. We are not there yet, as has been pointed out. This legislation will build on the voluntary commitments secured at the Seoul and Bletchley Park AI safety summits and will strengthen the role of the AI Safety Institute, putting it on a statutory footing.
We want to avoid creating new rules for those using AI tools in specific sectors—a point that the noble Viscount, Lord Camrose, raised—and will instead deal with that in the usual way, through existing expert regulators. For example, the Office for Nuclear Regulation and the Environment Agency ran a joint AI sandbox last year, looking at AI and the nuclear industry. The Medicines and Healthcare Products Regulatory Agency, or MHRA, launched one on AI medical devices. We have also launched the Regulatory Innovation Office to try to streamline the regulatory approach, which will be particularly important for AI, ensuring that we have the skills necessary for regulators to be able to undertake this new work. That point was raised by several people, including the noble Baroness, Lady Healy.
New legislation will instead apply to the small number of developers of the most far-reaching AI models, with a focus on those systems that are coming tomorrow, not the ones we have today. It will build on the important work that the AI Safety Institute has undertaken to date. Several people asked whether that approach is closer to the USA or the EU. It is closer to the US approach, because we are doing it for new technologies. We are not proposing specific regulation in the individual sectors, which will be looked after by the existing regulators. The noble Lords, Lord Knight and Lord Kamall, raised those points.
It is important, as everyone has raised, that we do not introduce measures that restrict responsible innovation. At the recent investment summit, leaders in the field were clear that some guidelines are important: they create clarity for companies, which currently do not have enough certainty and cannot progress without it. Getting that balance right will be essential, and that is why, as part of this AI Bill, we will be launching an extensive consultation, leading, I hope, to input from experts in industry and academia and, of course, from this House, where many noble Lords have today demonstrated the insightful contributions they can make.
I was asked by the noble Lord, Lord Ranger, whether pro-innovation regulation would be the theme. That was a topic of a review that I undertook in my last role and that will certainly be a theme of what we wish to do. We will continue to lead the development of international standards through the AI Standards Hub—a partnership between the Alan Turing Institute, the British Standards Institution and the National Physical Laboratory—and by working with international bodies. Indeed, I went to speak to one of the international standards bodies on this topic a few weeks ago.
I turn to some other specific points that were raised during the debate. The AI Safety Institute’s core goal is to make frontier AI safer. It works in partnership with businesses, Governments and academia to develop research on the safety of AI and to evaluate the most capable models. It has secured privileged access to top AI models from leading companies, including testing models pre-deployment and post-deployment with OpenAI, Google DeepMind and Anthropic, among others. The institute has worked very closely with the US to launch the international network of AI safety institutes, enabling the development and adoption of interoperable principles, policies and best practice. That meeting has taken place in California this week. The noble Baroness, Lady Wheatcroft, asked for an update, and I think we will have it when the readout of that meeting is known. Just this week, the AI Safety Institute shared a detailed report outlining its pre-deployment testing of Anthropic’s upgraded Claude 3.5 Sonnet model. This will help advance the development of shared scientific benchmarks and best practices for safety testing, and it is an important step because it begins to show exactly how these things can also be made public.
I was asked about mandatory safety testing. I think this model, which has been a voluntary one and has engaged big companies so that they want to come to the AI Safety Institute, is the correct one. I have also noted that there are some other suggestions as to how people may report safety issues. That is an important thing to consider for the future.
To respond to the points raised by the noble Lords, Lord Strasburger and Lord Griffiths, the question of the existential threat is hotly debated among experts. Meta scientist Yann LeCun states that fears that AI will pose a threat to humanity are “preposterously ridiculous”. In contrast, Geoffrey Hinton has said it is time to confront the existential dangers of artificial intelligence. Another British Nobel prize winner, Demis Hassabis, the CEO of DeepMind, one of the most important AI companies in the world, suggests a balanced view. He has expressed optimism about AI, with its potential to revolutionise many fields, but emphasises the need to find a middle way for managing the technology.
To better understand these challenges, the Government have established a central AI risk function which brings together policymakers and AI experts with a mission to continuously monitor, identify, assess and prepare for AI-associated risks. That must include in the long term the question of whether what I will call “autonomous harm” is a feature that will emerge and, if so, over what time and what the impact of that might be.
I turn to data, the very feedstock for AI. First, data protection law applies to any processing of personal data, regardless of the technology, and we are committed to maintaining the UK’s strong data protection framework. The national data library will be the key to unlocking public data in a safe and secure way, and many speakers this afternoon have indicated how important it will be to have the data to ensure that we get training of the models. There is a huge opportunity, particularly, as has been indicated, in relation to areas such as the NHS.
The Information Commissioner’s Office has published guidance that outlines how organisations developing and using AI can ensure that AI systems that process personal data do so in ways that are accountable, transparent and fair.
On copyright, I will not list the numerous noble Lords who have made comments on copyright. It is a crucial area, and the application of copyright law to AI is as disputed globally as it is in the UK. Addressing uncertainty about the UK’s copyright framework for AI is a priority for DSIT and DCMS. We are determined to continue to enable growth in our AI and creative industries, and it is worth noting that those two are related. It is not that the creative industries are on one side and AI on the other; many creative individuals are using AI for their work. Let me say up front that the Government are committed to supporting the power of human-centred creativity as well as the potential of AI to unlock new horizons.
As the noble Baroness, Lady Featherstone, has rightly pointed out, rights holders of copyright material have called for greater control over their content and remuneration where it is used to train AI models, as well as for greater transparency. At the same time, AI developers see access to high-quality material as a prerequisite to being able to train world-leading models in the UK. Developing an approach that addresses these concerns is not straightforward, and there are issues of both the input to models and the assessment of the output from models, including the possibility of watermarking. The Government intend to engage widely, and I can confirm today that we will shortly launch a formal consultation to get input from all stakeholders and experts. I hope that this starts to address the questions that have been raised, including at the beginning by the noble Baroness, Lady Stowell, as well as the comments by the noble Baroness, Lady Healy.
On the important points that the noble Viscount, Lord Camrose, raised about offshoring and the need for international standards, I completely agree that this is a crucial area to look at. International co-operation will be essential, and we are working with partners on it.
We have talked about the need for innovation, which requires fair and open competition. The Digital Markets, Competition and Consumers Act received Royal Assent in May, and the Government are working closely with the Competition and Markets Authority to ensure that the measures in the Act commence by January 2025. It equips the CMA with more tools to tackle competition issues in the digital and AI markets. The CMA itself undertook work last year that identified issues in some of the models that need to be looked at.
Demand for computing resource is growing very quickly. It is not just a matter of size but of configuration and systems architecture. Two compute clusters are being delivered as part of the AI research resource in Bristol and Cambridge. They will be fully operational next year and will expand the UK’s capacity thirtyfold. Isambard-AI is made up of more than 5,500 Nvidia GPUs and will be the UK’s most powerful public AI compute facility once it is fully operational next year. The AI opportunities action plan will set out further requirements for compute, which we will take forward as part of the multiyear spending review. I just say in passing that it is quite important not to conflate exascale with AI compute; they are different forms of computing, both of which are very important and need to be looked at, but it is the AI compute infrastructure that is most relevant to this.
The noble Lord, Lord Tarassenko, and the noble Baroness, Lady Wheatcroft, asked about sovereign LLMs and highlighted the opportunity to build new models based on really specific trusted data sources in the UK. This point was also raised in the committee report and is a crucial one.
I have tried to answer all the questions. I hope that I have but, if I have not, I will try to do so afterwards. This is a really crucial area and I am happy to come back and update as this goes on, as the noble Viscount, Lord Camrose, asked me to. We know that this is about opportunity, but we also know that people are concerned, rightly, about socioeconomic risks, labour market rights and infringement of rights.
There are several other points I would make. Those concerns are why we have signed the Council of Europe’s convention on AI and human rights, why we are funding the Fairness Innovation Challenge to develop solutions to AI bias, why the algorithmic transparency recording standard is being rolled out across all departments, why the Online Safety Act has powers to protect against illegal content and specifically to prevent harms to children, and why the central AI risk function is working with the AI Safety Institute to identify and reduce the broader risks. The Government will drive private and public sector AI development, deployment and adoption in a safe, responsible and trustworthy way, including, of course, with international partners.
I thank noble Lords for their comments today. It is with great urgency that we start to rebuild Britain, using the technology we have today, and prepare for the technologies of tomorrow. We are determined, as the noble Viscount, Lord Camrose, said, that everyone in society should benefit from this revolutionary technology. I look forward very much to continuing engagement on this important topic with what I hope is an increasing number of noble Lords who may find this rather relevant to everyday life.