(6 days, 10 hours ago)
Lords Chamber
My Lords, I draw attention to my AI interests in the register. I thank the Minister for her upbeat introduction to the Bill and all her engagement to date on its contents. It has been a real pleasure listening to so many expert speeches this afternoon. The noble Lord, Lord Bassam, did not quite use the phrase “practice makes perfect”, because, after all, this is the third shot at a data protection Bill over the past few years, but I was really taken by the vision and breadth of so many speeches today. I think we all agree that this Bill is definitely better than its two predecessors, but of course most noble Lords went on to say “but”, and that is exactly my position.
Throughout, we have been reminded of the growing importance of data in the context of AI adoption, in both the private and public sectors. I think many of us regret that “protection” is not included in the Bill title, but the Bill should go hand in hand, if not with actual AI regulation, then at least with an understanding of where we are heading on AI regulation.
Like others, I welcome that the Bill omits many of the proposals from the unlamented Data Protection and Digital Information Bill, which in our view—I expect to see a vigorous shake of the head from the noble Viscount, Lord Camrose—watered down data subject rights. The noble Lord, Lord Bassam, did us a great favour by setting out a list of the many items that were missing from that Bill.
I welcome the retention of some elements in this Bill, such as the digital registration of births and deaths. As the noble Lord, Lord Knight, said, and as Marie Curie has asked, will the Government undertake a review of the Tell Us Once service to ensure that it covers all government departments across the UK and is extended to more service providers?
I also welcome some of the new elements, in particular the amendments to the Online Safety Act—essentially unfinished business going as far back as our Joint Committee. It was notable that the noble Lord, Lord Bethell, welcomed the paving provisions regarding independent researchers’ access to social media and search services, but there are questions even around the breadth of that provision. Will it cover research regarding non-criminal misinformation on internet platforms? What protection will researchers conducting public interest research actually receive?
Then there is something that the noble Baroness, Lady Kidron, Ian Russell and many other campaigners have fought for: access for coroners to the data of young children who have passed away. I think that will be a milestone.
The Bill may need further amendment. On these Benches we may well put forward further changes for added child protection, given the current debate over the definition of category 1 services.
There are some regrettable omissions from the previous Bill, such as the provisions extending the soft opt-in that has always existed for commercial organisations to non-commercial organisations, including charities. As we have heard, there are also a considerable number of unwelcome retained provisions.
Many noble Lords referred to “recognised legitimate interests”. The Bill introduces to Article 6 of the GDPR a new ground of recognised legitimate interest, which counts as a lawful basis for processing if it meets any of the descriptions in the new Annex 1 to the GDPR, set out in Schedule 4 to the Bill. The Bill essentially qualifies the public interest test under Article 6(1)(e) of the GDPR and, as the noble Lord, Lord Vaux, pointed out, gives the Secretary of State powers to define additional recognised legitimate interests beyond those in the annex. This was queried by the Constitution Committee, and we shall certainly be kicking the tyres on that during Committee. Crucially, as the noble Viscount, Lord Colville, mentioned, there is no requirement for the controller to carry out any balancing test taking the data subject’s interests into account; the processing just needs to meet the grounds in the annex. These provisions diminish data protection, represent a threat to data adequacy and should be dropped.
Almost every noble Lord raised the changes to Article 22 and automated decision-making. With the exception of sub-paragraph (d), to be inserted by Clause 80, the provisions are very similar to those of the old Clause 14 of the DPDI Bill in limiting the right not to be subject to automated decision-making processing or profiling to special category data. Where automated decision-making is currently broadly prohibited with specific exceptions, the Bill will permit it in all but a limited set of circumstances. The Secretary of State is given the power to redefine what ADM actually is. Again, the noble Viscount, Lord Colville, was right in how he described what the outcome of that will be. Given the Government’s digital transformation agenda in the public sector and the increasing use of AI in the private sector, this means increasing the risk of biased and discriminatory outcomes in ADM systems.
Systems such as HART, which predicted reoffending risk, PredPol, which was used to allocate policing resources based on postcodes, and the gangs matrix, which harvests intelligence, have all been shown to have had discriminatory effects. It was a pleasure to hear what the noble Lord, Lord Arbuthnot, had to say. Have the Government learned nothing from the Horizon scandal? As he said, we need to move urgently to change the burden of proof for computer evidence. What the noble Earl, Lord Erroll, said, in reminding us of the childlike learning abilities of AI, was extremely important in that respect. We should not place that kind of trust in the evidence given by these models.
ADM safeguards are critical to public trust in AI, and our citizens need greater, not less, protection. As the Ada Lovelace Institute says, the safeguards around automated decision-making, which exist only in data protection law, are more critical than ever in ensuring that people understand when a significant decision about them is being automated, why that decision has been made, and the routes to challenge it or ask for it to be decided by a human. The noble Viscount, Lord Colville, and the noble Lord, Lord Holmes, set out that prescription, and I entirely agree with them.
This is a crucial element of the Bill but I will not spend too much time on it because, noble Lords will be very pleased to hear, I have a Private Member’s Bill on this subject, providing much-needed additional safeguards for ADM in the public sector, coming up on 13 December. I hope noble Lords will be there and that the Government will see the sense of it in the meantime.
We have heard a great deal about research. Clause 68 widens research access to data. There is a legitimate government desire to ensure that valuable research does not have to be discarded because of a lack of clarity around reuse or because of very narrow distinctions between the original and new purpose. However, it is quite clear that the definition of scientific research introduced by the Bill is too broad and risks abuse by commercial interests. A number of noble Lords raised that, and I entirely agree with the noble Baroness, Lady Kidron, that the Bill opens the door to data reuse and mass data scraping by any data-driven product development under the auspices of scientific research. Subjects cannot make use of their data rights if they do not even know that their data is being processed.
On overseas transfers, I was very grateful to hear what the noble and learned Lord, Lord Thomas, had to say about data adequacy, and the noble Lords, Lord Bethell, Lord Vaux and Lord Russell, also raised this. All of us are concerned about the future of data adequacy, particularly the tensions that are going to be created with the new Administration in the US if there are very different bases for dealing with data transfer between countries.
We have concerns about the national security provisions. I will not go into those in great detail, but why do the Government believe that these clauses are necessary to safeguard national security?
Many noble Lords raised the question of digital verification services. It was very interesting to hear what the noble Earl, Lord Erroll, had to say, given his long-standing interest in this area. We broadly support the provisions, but the Constitution Committee followed the DPRRC in criticising the lack of parliamentary scrutiny of the framework to be set by the Secretary of State or managed by DSIT. How will they interoperate with the digital identity verification services being offered by DSIT within the Government’s One Login programme?
Will the new regulator be independent, ensure effective governance and accountability, monitor compliance, investigate malicious actors and take enforcement action regarding these services? For high levels of trust in digital ID services, we need high-quality governance. As the noble Lord, Lord Vaux, said, we need to be clear about the status of physical ID alongside that. Why is there still no digital identity offence? I entirely agreed with what the noble Lords, Lord Lucas and Lord Arbuthnot, said about the need for factual clarity underlying the documents that will form part of the wallet—so to speak—in digital ID services. It is vital that we distinguish between sex and gender and make sure that both are recorded in our key documents.
There are other areas about which we on these Benches have concerns, although I have no time to go through them in great detail. We support the provisions on open banking, which we want to see used and the opportunities properly exploited. However, as the noble Lord, Lord Holmes, said, we need a proper narrative that sells the virtues of open banking. We are concerned that the current design allows landlords to be given access to monitor the bank accounts of tenants for as long as an open banking approval lasts. Smart data legislation should mandate a maximum and default access duration of no longer than 24 hours.
A formidable number of noble Lords spoke about web trawling by AI developers to train their models. It is vital that copyright owners have meaningful control over their content, and that there is a duty of transparency and penalties for scraping news publishers’ and other copyrighted content.
The noble and learned Lord, Lord Thomas, very helpfully spoke about the Government’s ECHR memorandum. I do not need to repeat what he said, but clearly, this could lead to a significant gap, given that the Retained EU Law (Revocation and Reform) Act 2023 has not been altered and is not altered by this Bill.
There are many other aspects to this. The claims for this Bill and these provisions are as extravagant as for the old one; I think the noble Baroness mentioned the figure of £10 billion at the outset. We are in favour of growth and innovation, but how will this Bill also ensure that fundamental rights for the citizen will be enhanced in an increasingly AI-driven world?
We need to build public trust, as the noble Lord, Lord Holmes, and the noble Baroness, Lady Kidron, said, in data sharing and access. To achieve the ambitions of the Sudlow review, there are lessons that need to be learned by the Department of Health and the NHS. We need to deal with edtech, as has been described by a number of noble Lords. All in all, the Government are still not diverging enough from the approach of their predecessor in their enthusiasm for the sharing and use of data across the public and private sectors without the necessary safeguards. We still have major reservations, which I hope the Government will respond to. I look forward—I think—to Grand Committee.
(2 weeks ago)
Lords Chamber
To ask His Majesty’s Government, following the recommendation of the Vallance review of the regulation of emerging digital technologies, whether they plan to set out a policy position on the relationship between intellectual property rights and the training of generative AI models.
My Lords, the AI and creative sectors are both essential to our mission to grow the UK economy. Our goal is to find the right balance between fostering innovation in AI and ensuring protection for creators and our vibrant creative industries. This is an important but complex area and we are very aware of the need to resolve the issues. We are working with stakeholders to understand their views and will set out our next steps soon.
My Lords, I thank the Minister for that reply, but the Prime Minister, in a recent letter to the News Media Association, said:
“We recognise the basic principle that publishers should have control over and seek payment for their work, including when thinking about the role of AI”.
Will the Minister therefore agree with the House of Lords Communications and Digital Committee and affirm the rights of copyright owners in relation to their content used for training purposes on large language models? Will she rule out any widening of the text and data-mining exception and include in any future AI legislation a duty on developers to keep records of the material and data used to train their AI models?
My Lords, I pay tribute to the Lords committee that has considered this issue. We are keen to make progress in this area but it is important that we get it right. The previous Government had this on their table for a long time and were not able to resolve it. The Intellectual Property Office, DSIT and DCMS are working together to try to find a way forward that will provide a solution for creative media and the AI sectors. Ministers—my colleagues Chris Bryant and Feryal Clark—held round tables with representatives of the creative industries and the AI sector recently, and we are looking at how we can take this forward to resolve the many issues and questions that the noble Lord has quite rightly posed for me today.
(4 weeks ago)
Grand Committee
My Lords, this order was laid before the House on 9 September this year. The Online Safety Act lays the foundations of strong protection for children and adults online. I am grateful to noble Lords for their continued interest in the Online Safety Act and its implementation. It is critical that the Act is made fully operational as soon as possible, and the Government are committed to delivering its protections without delay. This statutory instrument will further support the implementation of the Act by Ofcom.
This statutory instrument concerns Ofcom’s ability to share business information with Ministers for the purpose of fulfilling functions under the Online Safety Act 2023, under Section 393 of the Communications Act 2003. This corrects an oversight in the original Online Safety Act that was identified following its passage.
Section 393 of the Communications Act 2003 contains a general restriction on Ofcom disclosing information about particular businesses without consent from the affected businesses, but with exemptions, including where this facilitates Ofcom in carrying out its regulatory functions and facilitates other specified persons in carrying out specific functions. However, this section does not currently enable Ofcom to share information with Ministers for the purpose of fulfilling functions under the Online Safety Act. This means that, were Ofcom to disclose information about businesses to the Secretary of State, it may be in breach of the law.
It is important that a gateway exists for sharing information for these purposes so that the Secretary of State can carry out functions under the Online Safety Act, such as setting the fee threshold for the online safety regime in 2025 or carrying out post-implementation reviews of the Act required under Section 178. This statutory instrument will therefore amend the Communications Act 2003 to allow Ofcom to share information with the Secretary of State and other Ministers, strictly for the purpose of fulfilling functions under the Online Safety Act 2023.
There are strong legislative safeguards and limitations on the disclosure of this information, and Ofcom is experienced in handling confidential and sensitive information obtained from the services it regulates. Ofcom must comply with UK data protection law and would need to show that the processing of any personal data was necessary for a lawful purpose. As a public body, Ofcom is also required to act compatibly with the Article 8 right to privacy under the European Convention on Human Rights.
We will therefore continue to review the Online Safety Act, so that Ofcom is able to support the delivery of functions under the Act where it is appropriate. That is a brief but detailed summary of why this instrument is necessary. I should stress that it contains a technical amendment to deal with a very small legal aspect. Nevertheless, I will be interested to hear noble Lords’ comments on the SI. I beg to move.
My Lords, I thank the Minister for her introduction and for explaining the essence of the SI. We all have a bit of pride of creation in the Online Safety Act; there are one or two of us around today who clearly have a continuing interest in it. This is one of the smaller outcomes of the Act and, as the Minister says, it is essentially an oversight; a tidying-up operation is involved here. It is rather gratifying to see that the Communications Act still has such importance, 21 years after it was passed. It is somewhat extraordinary for legislation to be invoked after that period of time in an area such as communications, which is so fast-moving.
My question for the Minister is whether the examples that she gave, or which were contained in the Explanatory Memorandum—the need for information to be obtained by the Secretary of State in respect of Section 178, on reviewing the regulatory framework, and Section 86, on the threshold for payment of fees—are exhaustive. Are there other aspects of the Online Safety Act for which the Secretary of State requires such information?
We are always wary of the powers given to Secretaries of State, as the noble Viscount, Lord Camrose, will probably remember to his cost. But at every point, the tyres on legislation need to be kicked to make sure that the Secretary of State has just the powers that they need—and that we do not go further than we need to or have a skeleton Bill, et cetera—so the usual mantra will apply: we want to make sure that the Secretary of State’s powers are proportionate.
It would be very useful to hear from the Minister what other powers are involved. Is it quite a number? Were these two just the most plausible, or are there six other sets of powers which might not be so attractive? That is the only caveat I would make in this respect.
(4 weeks ago)
Grand Committee
My Lords, these regulations were laid before the House on 12 September this year. The Government stated in their manifesto that they would
“use every government tool available to target perpetrators and address the root causes of abuse and violence”
in order to achieve their
“landmark mission to halve violence against women and girls in a decade”.
Through this statutory instrument, we are broadening online platforms’ and search engines’ responsibilities for tackling intimate image abuse under the Online Safety Act. More than one in three women have experienced abuse online. Intimate image abuse is not only devastating for victims; its rise also spreads misogyny on social media that can develop into potentially dangerous relationships offline. One in 14 adults in England and Wales has experienced threats to share intimate images, rising to one in seven among young women aged 18 to 34.
It is crucial that we tackle these crimes from every angle, including online, and ensure that tech companies step up and play their part. That is why we are laying this statutory instrument. Through it, we will widen online platforms’ and search engines’ obligations to tackle intimate image abuse under the Online Safety Act. As noble Lords will know, the Act received Royal Assent on 26 October 2023. It places strong new duties on online user-to-user platforms and search services to protect their users from harm.
As part of this, the Act gives service providers new “illegal content duties”. Under these duties, online platforms need to assess the risk that their services will allow users to encounter illegal content or be
“used for the commission or facilitation of a priority offence”.
They then need to take steps to mitigate identified risks. These will include implementing safety-by-design measures to reduce risks and content moderation systems to remove illegal content where it appears.
The Online Safety Act sets out a list of priority offences for the purposes of providers’ illegal content duties. These offences reflect the most serious and prevalent online illegal content and activity. They are set out in schedules to the Act. Platforms will need to take additional steps to tackle these kinds of illegal activities under their illegal content duties.
The priority offences list currently includes certain intimate image abuse offences. Through this statutory instrument, we are adding new intimate image abuse offences to the priority list. This replaces an old intimate image abuse offence, which has now been repealed. These new offences are in the Sexual Offences Act 2003. They took effect earlier this year. The older offence was in the Criminal Justice and Courts Act 2015. The repealed offence covered sharing intimate images where the intent was to cause distress. The new offences are broader; they criminalise sharing intimate images without having a reasonable belief that the subject would consent to sharing the images. These offences include the sharing of manufactured or manipulated images, including so-called deepfakes.
Since these new offences are more expansive, adding them as priority offences means online platforms will be required to tackle more intimate image abuse on their services. This means that we are broadening the scope of what constitutes illegal intimate image content in the Online Safety Act. It also makes it clear that platforms’ priority illegal content duties extend to AI-generated deepfakes and other manufactured intimate images. This is because the new offences that we are adding explicitly cover this content.
As I have set out above, these changes affect the illegal content duties in the Online Safety Act. They will ensure that tech companies play their part in kicking this content off social media. They are just one part of a range of wider protections coming into force next spring under the Online Safety Act, which will mean that social media companies have to remove the most harmful illegal content, a lot of which, such as harassment and controlling or coercive behaviour, disproportionately affects women and girls.
Ofcom will set out the specific steps that providers can take to fulfil their illegal content duties for intimate image abuse and other illegal content in codes of practice and guidance documentation. It is currently producing this documentation. We anticipate that the new duties will start to be enforced from spring next year once Ofcom has issued these codes of practice and they have come into force. Providers will also need to have done their risk assessment for illegal content by then. We anticipate that Ofcom will recommend that providers should take action in a number of areas. These include content moderation, reporting and complaints procedures, and safety-by-design steps, such as testing their algorithm systems to see whether illegal content is being recommended to users. We are committed to working with Ofcom to get these protections in place as quickly as possible. We are focused on delivering.
Where companies are not removing and proactively stopping this vile material appearing on their platforms, Ofcom will have robust powers to take enforcement action against them. This includes imposing fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is higher.
In conclusion, through this statutory instrument we are broadening providers’ duties for intimate image abuse content. Service providers will need to take proactive steps to search for, remove and limit people’s exposure to this harmful kind of illegal content, including where it has been manufactured or manipulated. I hope noble Lords will welcome these measures, which take the provisions of the Online Safety Act a useful step forward. I commend these regulations to the Committee, and I beg to move.
My Lords, I thank the Minister for her introduction. I endorse everything she said about intimate image abuse and the importance of legislation to make sure that the perpetrators are penalised and that social media outlets have additional duties under Schedule 7 for priority offences. I am absolutely on the same page as the Minister on this, and I very much welcome what she said. It is interesting that we are dealing with another 2003 Act that, again, is showing itself fit for purpose and able to be amended; perhaps there is some cause to take comfort from our legislative process.
I was interested to hear what the Minister said about the coverage of the offences introduced by the Online Safety Act. She considered that the offence of sharing sexually explicit material includes deepfakes. There was a promise—the noble Viscount will remember it—that the Criminal Justice Bill, which was not passed in the end, would cover that element. That proposed offence included an intent requirement, like the current offence—the one that has been incorporated into Schedule 7. The Private Member’s Bill of the noble Baroness, Lady Owen—I have it in my hand—explicitly introduces an offence that does not require intent, and I very much support that.
I do not believe that this is the last word on the kinds of intimate image abuse offence that need to be incorporated as priority offences under Schedule 7. I would very much like to hear what the noble Baroness has to say about why we require intent when, quite frankly, the creation of these deepfakes involves activity that is clearly harmful. We clearly should make sure that the perpetrators are caught. Given the history of this, I am slightly surprised that the Government’s current interpretation of the new offence in the Online Safety Act includes deepfakes. It is gratifying, but the Government nevertheless need to go further.
My Lords, I welcome the Minister’s remarks and the Government’s step in introducing this SI, but I have concerns that it misses the wider problems. The powers given to Ofcom in the Online Safety Act require a lengthy process to implement and do not allow a rapid response. They also do not provide individuals with any redress. Therefore, this SI, while necessary in adding to the list of priority offences, does not give victims the recourse they need.
My concern is that Ofcom is approaching this digital problem in an analogue way. It has the power to fine and even disrupt business but, in a digital space—where, when one website is blocked, another can open immediately—Ofcom would, in this scenario, have to restart its process all over again. These powers are not nimble or rapid enough, and they do not reflect the nature of the online space. They leave victims open and exposed to continuing distress. I would be grateful if the Government offered some assurances in this area.
The changes miss the wider problem of non-compliance by host websites outside the UK. As I have previously discussed in your Lordships’ House, the Revenge Porn Helpline has a removal rate of 90% of reported non-consensual sexually explicit content, both real and deepfake. However, in 10% of cases, the host website will not comply with the removal of the content. These sites are often hosted in countries such as Russia or those in Latin America. In cases of non-compliance by host websites, the victims continue to suffer, even where there has been a successful conviction.
Take the example of a man who was convicted in the UK of blackmailing 200 women: the Revenge Porn Helpline successfully removed 161,000 images, but 4,000 still remain online three years later, with platforms continuing to ignore the take-down requests. I would be grateful if the Government could outline how they are seeking to tackle the removal of this content, featuring British citizens, hosted in jurisdictions where host sites are not complying with removal.
(1 month, 1 week ago)
Lords Chamber
The noble Lord is absolutely right. The scale of violent images featuring women and girls in our country is intolerable, and this Government will treat it as the national emergency it is. The noble Lord will be pleased to hear that the Government have set out an unprecedented mission to halve violence against women and girls within a decade. We are using every government tool we have to target the perpetrators and address the root causes of violence. That involves many legislative and non-legislative measures, as the noble Lord will appreciate, including tackling the education issue. However, ultimately, we have to make sure that the legislation is robust and that we take action, which we intend to do.
My Lords, as the Minister and others have mentioned, there is considerable and increasing concern about deepfake pornographic material, particularly the so-called nudification apps, which can be easily accessed by users of any age. What action will the Government be taking against this unacceptable technology, and will an offence be included in the forthcoming crime and policing Bill?
The noble Lord raises an important point. Where nudification apps and other material do not come under the remit of the Online Safety Act, we will look at other legislative tools to make sure that all new forms of technology—including AI and its implications for online images—are included in robust legislation, in whatever form it takes. Our priority is to implement the Online Safety Act, but we are also looking at what other tools might be necessary going forward. As the Secretary of State has said, this is an iterative process; the Online Safety Act is not the end of the game. We are looking at what further steps we need to take, and I hope the noble Lord will bear with us.
(2 months, 2 weeks ago)
Lords Chamber
My Lords, it is a pleasure to follow the noble Lord, Lord Hunt, and a particular pleasure to follow so closely the comprehensive introduction by our excellent former chair, the noble Lord, Lord Hollick.
As the noble Lord alluded to, the Grenfell report and today’s Statement have been an extremely sobering reminder of the importance of effective regulation and the effective oversight of regulators. The principal job of regulation is to ensure societal safety and benefit—in essence, mitigating risk. In that context, the performance of the UK regulators, as well as the nature of regulation, is crucial.
In the early part of this year, the spotlight was on regulation and the effectiveness of our regulators. Our report was followed by a major contribution to the debate from the Institute for Government. We then had the Government’s own White Paper, Smarter Regulation, which seemed designed principally to take the growth duty established in 2015 even further with a more permissive approach to risk and a “service mindset”, and risked creating less clarity with yet another set of regulatory principles going beyond those in the Better Regulation Framework and the Regulators’ Code.
Our report was, however, described as excellent by the Minister for Investment and Regulatory Reform in the Department for Business and Trade under the previous Government, the noble Lord, Lord Johnson of Lainston, whom I am pleased to see taking part in the debate today. I hope that the new Government will agree with that assessment and take our recommendations further forward.
Both we and the Institute for Government identified a worrying lack of scrutiny of our regulators—indeed, a worrying lack of even identifying who our regulators are. The NAO puts the number of regulators at around 90 and the Institute for Government at 116, but some believe that there are as many as 200 that we need to take account of. So it is welcome that the previous Government’s response said that a register of regulators, detailing all UK regulators, their roles, duties and sponsor departments, was in the offing. Is this ready to be launched?
The crux of our report was to address the performance, strategic independence and oversight of UK regulators. In exploring existing oversight and accountability measures and the effectiveness of parliamentary scrutiny, it became clear that we needed to improve self-reporting by regulators. However, a growth duty performance framework, as proposed in the White Paper, does not fit the bill.
Regulators should also be subject to regular performance evaluations, as we recommended; these reviews should be made public to ensure transparency and accountability. To ensure that these are effective, we recommended, as the noble Lord, Lord Hollick, mentioned, establishing a new office for regulatory performance—an independent statutory body analogous to the National Audit Office—to undertake regular performance reviews of regulators and to report to Parliament. It was good to see that, similar to our proposal, the Institute for Government called for a regulatory oversight support unit in its subsequent report, Parliament and Regulators.
As regards independence, we had concerns about the potential politicisation of regulatory appointments. Appointment processes for regulators should be transparent and merit-based, with greater parliamentary scrutiny to avoid politicisation. Although strategic guidance from the Government is necessary, it should not compromise the operational independence of regulators.
What is the new Government’s approach to this? Labour’s general election manifesto emphasised fostering innovation and improving regulation to support economic growth, with a key proposal to establish a regulatory innovation office in order to streamline regulatory processes for new technologies and set targets for tech regulators. I hope that that does not take us down the same trajectory as the previous Government. Regulation is not the enemy of innovation, or indeed growth, but can in fact, by providing certainty of standards, be the platform for it.
At the time of our report, the IfG rightly said:
“It would be a mistake for the committee to consider its work complete … new members can build on its agenda in their future work, including by fleshing out its proposals for how ‘Ofreg’ would work in practice”.
We should take that to heart. There is still a great deal of work to do to make sure that our regulators are clearly independent of government, are able to work effectively, and are properly resourced and scrutinised. I hope that the new Government will engage closely with the committee in their work.
(3 months, 4 weeks ago)
Lords Chamber
I thank my noble friend for those good wishes. Of course, he is raising a really important issue of great concern to all of us. During the last election, we felt that the Government were well prepared to ensure the democratic integrity of our UK elections. We did have robust systems in place to protect against interference, through the Defending Democracy Taskforce and the Joint Election and Security Preparedness unit. We continue to work with the Home Office and the security services to assess the impact of that work. Going forward, the Online Safety Act goes further by putting new requirements on social media platforms to swiftly remove illegal misinformation and disinformation, including where it is AI-generated, as soon as it becomes available. We are still assessing the need for further legislation in the light of the latest intelligence, but I assure my noble friend that we take this issue extremely seriously. It affects the future of our democratic process, which I know is vital to all of us.
My Lords, I welcome the creation of an AI opportunities plan, announced by the Government, but, as the noble Lord, Lord Knight, says, we must also tackle the risks. In other jurisdictions across the world, including the EU, AI-driven live facial recognition technology is considered to seriously infringe the right to privacy and to have issues with accuracy and bias, and it is being banned or restricted for both law enforcement and business use. Will the Government, in their planned AI legislation, provide equivalent safeguards for UK citizens and ensure their trust in new technology?
I thank the noble Lord for that question and for all the work he has done on the AI issue, including his new book, which I am sure is essential reading over the summer for everybody. I should say that several noble Lords in this Chamber have written books on AI, so noble Lords might want to consider that for their holiday reading.
The noble Lord will know that the use and regulation of live facial recognition is for each country to decide. In the UK, it is already governed by data protection, equality and human rights legislation, supplemented by specific police guidance. It is absolutely vital that it is used only where necessary, proportionate and fair. We will continue to look at the legislation and at whether privacy is being sufficiently protected. That is an issue that will come forward when the future legislation is being prepared.