(1 week, 1 day ago)
Lords Chamber
My Lords, the technological capabilities and their misuse that have prompted this Statement are, needless to say, deeply disturbing and demand our careful attention. The use of AI to generate non-consensual sexual imagery of women and children is not only grotesque in itself but also corrosive of trust in technology more broadly.
We therefore welcome the Secretary of State’s confirmation that new offences criminalising the creation or solicitation of such material will be brought into force this week. We support the enforcement of these laws. We also welcome Ofcom’s decision to open a formal investigation into the use of Grok on X under the Online Safety Act, an investigation that must proceed swiftly to protect victims and hold platforms to account.
Hard though it is to predict the misuses of emerging technologies, we must collectively find better ways to be ready for them before they strike. I fear there is a pervasive and damaging sense of regulatory, legislative and political uncertainty around AI. As long as that remains the case, we risk remaining a victim of events beyond our control.
From the outset of this Parliament, and indeed in opposition, the Government have pledged to legislate on AI. Reviews and policy documents, including the Clifford AI Opportunities Action Plan, promised a framework to drive adoption and regulatory clarity. However, we still have no clear timeline, nor even a clear account of the Government’s policy on AI.
It is worth noting that the legislative tools the Government are now relying on to implement their proposed new offences, such as the creation and solicitation of non-consensual intimate images, are the product of amendments introduced by this House to the Data (Use and Access) Act. Ministers have repeatedly argued both that binding AI regulation must come, and that the existing multi-regulator framework is sufficient.
Evidence to the House of Commons Science, Innovation and Technology Committee late last year confirmed that the Secretary of State would not commit to a specific AI Bill, instead speaking of considering targeted interventions rather than an overarching legislative framework. This may indeed be the right approach, but its unclear presentation and communication drive uncertainty that undermines confidence among investors, businesses and regulators and, above all, citizens.
Progress on other AI-related policy commitments seems to have stalled too. I do not underestimate the difficulty of the problem, but work thus far on AI and copyright has been pretty disappointing. I am not seeking to go into that debate now, but only to make the point that it contributes to a widespread sense of uncertainty about tech in general and AI in particular.
Frankly, this uncertainty has been compounded by inconsistent political messaging. Over the weekend, reports emerged that the Government were considering banning X altogether before subsequently softening that position, creating wholly unnecessary confusion. At the same time, the Government have mischaracterised X’s decision to move its nudification tools behind a paywall as a means to boost profits, when the platform argues, reasonably persuasively, that this is a measure to ensure that those misusing the tools cannot do so anonymously.
Nor has there been much effective communication from the Government about their regulatory intentions for AI. This leaves the public and businesses unclear on how AI will be regulated and what standards companies are expected to meet. Political and legislative uncertainty in this case is having real consequences. It weakens our ability to deter misuse of AI technologies; it undermines public confidence; and it leaves regulators and enforcement agencies in a reactive posture rather than being empowered to act with a clear statutory direction.
We of course support efforts to criminalise harmful uses of AI. However, under the Government’s current Sentencing Bill, most individuals convicted of these new AI-related offences against women and girls will be liable for only suspended sentences, meaning that they could leave court free to continue using the technology that enabled their crime. This is concerning. It cannot be right that someone found guilty of producing non-consensual sexual imagery may walk free, unrestrained and with unimpeded access to the tools that facilitated their offending.
As I say, we support Ofcom’s work and the use of existing powers, but law without enforcement backed by a coherent, predictable regulatory regime will offer little real protection. Without proper sentencing, regulatory certainty and clear legislative direction for AI, these laws will not provide the protection that we need.
We urge the Government to publish a clear statement on their intentions on comprehensive AI regulation, perhaps building on the AI White Paper that we produced in government, to provide clarity for both tech companies and the public, and to underpin the safe adoption of AI across the economy and society. We must assume that new ways to abuse AI are being developed as we speak. Either we have principled, strategic approaches to deal with them, or we end up lurching from one crisis to the next.
My Lords, we on the Liberal Democrat Benches welcome the Secretary of State’s Statement, as well as her commitment to bring the new offence of creating or requesting non-consensual intimate images into force and to make it a priority offence. However, why has it taken this specific crisis with Grok and X to spur such urgency? The Government have had the power for months to commence this offence, so why have they waited until women and children were victimised on an industrial scale?
My Commons colleagues have called for the National Crime Agency to launch an urgent criminal investigation into X for facilitating the creation and distribution of this vile and abusive deepfake imagery. The Secretary of State is right to call X’s decision to put the creation of these images behind a paywall insulting; indeed, it is the monetisation of abuse. We welcome Ofcom’s formal investigation into sexualised imagery generated by Grok and shared on X. However, will the Minister confirm that individuals creating and sharing this content will also face criminal investigation by the police? Does the Minister not find it strange that the Prime Minister needs to be reassured that X, which is used by many parliamentarians and government departments, will comply with UK law?
While we welcome the move to criminalise nudification apps in the Crime and Policing Bill, we are still waiting for the substantive AI Bill promised in the manifesto. The Grok incident proves that voluntary agreements are not enough. I had to take a slightly deep breath when I listened to what the noble Viscount, Lord Camrose, had to say. Who knew that the Conservative Party was in favour of AI regulation? Will the Government commit to a comprehensive, risk-based regulatory framework, with mandatory safety testing, for high-risk models before they are released to the public, of the kind that we have been calling for on these Benches for some time? We need risk-proportionate, mandatory standards, not voluntary commitments that can be abandoned overnight.
Will the Government mandate the adoption of hash-matching technology that would make the removal of non-consensual images possible, as proposed by the noble Baroness, Lady Owen of Alderley Edge, in Committee on the Crime and Policing Bill—I am pleased to see that the noble Lord, Lord Hanson, is in his place—and as advocated by StopNCII.org?
The Secretary of State mentioned her commitment to the safety of children, yet she has previously resisted our calls to raise the digital age of consent to 16, in line with European standards. If the Government truly want to stop companies profiteering from children’s attention and data, why will they not adopt this evidence-based intervention?
To be absolutely clear, the creation and distribution of non-consensual intimate images has nothing whatever to do with free speech. These are serious criminal offences. There is no free speech right to sexually abuse women and children, whether offline or online. Any attempt to frame this as an issue of freedom of expression is a cynical distortion designed to shield platforms from their legal responsibilities.
Does the Minister have full confidence that Ofcom has the resources and resolve to take on these global tech giants, especially now that it is beginning to ramp up the use of its investigation and enforcement powers? Will the Government ensure that Ofcom uses the full range of enforcement powers available to it? If X continues to refuse compliance, will Ofcom deploy the business disruption measures under Part 7, Chapter 6 of the Online Safety Act? Will it seek service restriction orders under Sections 144 and 145 to require payment service providers and advertisers to withdraw their services from the non-compliant platform? The public expect swift and decisive action, not a drawn-out investigation while the abuse continues. Ofcom must use every tool Parliament has given it.
Finally, if the Government believe that X is a platform facilitating illegal content at scale, why do they continue to prioritise it for official communications? Is it not time for the Government to lead by example and reduce their dependence on a platform that seems ideologically opposed to the values of decency and even perhaps the UK rule of law, especially now that we know that the Government have withdrawn their claim that 10.8 million families use X as their main news source?
AI technologies are developing at an exponential rate. Clarity on regulation is needed urgently by developers, adopters and, most importantly, the women and children who deserve protection. The tech sector can be a force for enormous good, but only when it operates within comprehensive, risk-proportionate regulatory frameworks that put safety first. We on these Benches will support robust action to ensure that that happens.
(1 month, 1 week ago)
Lords Chamber
Baroness Lloyd of Effra (Lab)
In many areas—in fact, the entire industrial strategy and particularly the Technology Adoption Review—that has been done in concert with the private sector. It is an incredibly important part of the approach. To take one example, the skills package in construction takes that approach forward; both the private and public sectors are putting themselves forward together to provide more opportunities for young people. That is the approach that we will take across digital and AI skills, as I mentioned.
I draw noble Lords’ attention to my technology interests, as set out in the register. What assessment have the Government made of the critique of the CBI and others that their technology adoption plans are too fragmented? Does the Minister agree that, without strong co-ordination across different technology adoption initiatives, we will be unable either to assess their collective impacts or to learn their individual lessons?
Baroness Lloyd of Effra (Lab)
The technology review and many others have identified that there is no silver bullet in respect of technology adoption. What is needed in the creative industries is perhaps completely different from what is needed in the energy sector, for example. The review’s approach and its adoption into the industrial strategy is to match the needs of a particular sector with a set of technological or digital approaches. Beneath that are some common themes—for example, on skills, connectivity or infrastructure. We have to look at it in that way: measures cut across the economy and specific measures are suited to subsectors.
(1 month, 1 week ago)
Lords Chamber
Baroness Lloyd of Effra (Lab)
The noble Lord cites some important evidence which, along with other evidence about the links between social media use and different cohorts of young people, young adults and so on, is very important. The Government and Ofcom are looking at that carefully. As I said before, we continue to keep open all the issues here to protect children from unsafe content, while allowing them to participate actively in the digital world, which can provide many opportunities to young people and much education.
My Lords, screen addiction is a growing problem for all ages, but far more so for children. In July, Peter Kyle, the former Secretary of State for DSIT, committed to bringing forward proposals in the autumn to restrict children’s screen time. Since the reshuffles, we have heard no more about those proposals. Can the Minister clarify this point today? Will the Government be bringing forward a package along the lines set out by the former Secretary of State?
Baroness Lloyd of Effra (Lab)
We are focusing on implementation of the Online Safety Act: protecting children from harmful content, backing Ofcom as it goes through the children’s risk assessments of the platform operators, and ensuring that the duties that came in in July are effective. That is the priority for the time being. As I said, we are looking at the evidence and assessing what other measures may be needed. If we need to do so in due course, we will do so.
(1 month, 1 week ago)
Lords Chamber
Baroness Lloyd of Effra (Lab)
There are very important roles for our regulators. There are also very important governance systems in place that govern how regulators work and how they are accountable to Parliament. I do not think there is any case at present to take the action my noble friend suggests.
My Lords, in May, the Vodafone-Three merger was completed, reducing the number of mobile operators in the country from four to three. Building on the question from my noble friend Lord Vaizey, six months on from the merger, what is the Government’s assessment of its impact, first on consumer prices and secondly on investment in the infrastructure that improves both the digital economy and rural connectivity?
Baroness Lloyd of Effra (Lab)
As part of that merger, there was a commitment to invest £11 billion in infrastructure. That is a very important part of the continued rollout of our digital infrastructure, and it is monitored through Ofcom’s Connected Nations report, which is published regularly.
(1 month, 2 weeks ago)
Grand Committee
My Lords, I hope this is one of those occasions when we agree that what is coming here is a good thing—something that is designed to deal with an evil and thus is necessary. I want just to add a bit of flesh to the bones.
If we have regulation, we must make sure—as we are doing now—that it is enforced. I congratulate the Government on the age-verification activities that were reported on this morning, but can we get a little more about the tone, let us say, with which we are going to look at future problems? The ones we have here—cyber flashing and self-harm—are pretty obviously things that are not good for you, especially for younger people and the vulnerable.
I have in front of me the same figures on those who have experienced disturbing reactions to seeing these things, especially if they did not want to see them. Self-harm is one of those things; it makes me wince even to think about it. Can we make sure that not only those in the industry but those outside it know that action will be taken? How can we make reporting more widespread? If we do not have a degree of awareness, reporting and everything else gets a bit slower. How do we make sure that everybody who becomes a victim of this activity knows that it is going on?
It is quite clear that the platforms are responsible; everybody knows that. It is about knowing that something is going on and being prepared to take action; that is where we will start to make sure not only that this is unacceptable and action will be taken but that everybody knows and gets in on the act and reporting takes place.
I could go on for a considerable length of time, and I have enough briefing to do so, but I have decided that the Grand Committee has not annoyed me enough to indulge in that today. I congratulate the Minister, but a little more flesh about the action and its tone, and what we expect the wider community to do to make sure this can be enacted, would be very helpful here. Other than that, I totally welcome these actions. Unpleasant as it is that they are necessary, I welcome them and hope that the Government will continue to do this. We are always going to be playing a little bit of catch-up on what happens, but let us make sure that we are running fast and that what is in front of us does not get too far away.
My Lords, as we have heard, this instrument amends Schedule 7 to the Online Safety Act 2023 to add cyber flashing and content encouraging self-harm to the list of priority offences. I thank the Minister for setting out some of the most alarming facts and figures associated with those offences.
As well as passing the Online Safety Act, which placed duties on social media sites and internet services to tackle illegal content, the previous Government outlawed cyber flashing and sharing or threatening to share intimate images without consent by amending the Sexual Offences Act 2003. We welcome the draft regulations, which we agree are in line with the Act’s overarching purpose to tackle harmful content online. As has been highlighted, young people are especially vulnerable to cyber flashing and content encouraging self-harm, and we must be proactive in tracking the trends of illegal activity, especially online, and its impact on UK users, to ensure that the law continues to be proportionate and effective.
We therefore support the move to categorise cyber flashing and content encouraging self-harm as priority offences under the Act rather than as relevant offences. We share the Government’s view that this will oblige services to remove such material as soon as they are made aware of it, as well as to prevent it appearing in the first place through risk assessments and specialised measures. However, I feel there are some broader issues that we should take into account, and I would be grateful if the Minister could comment on these.
First, on the use of VPNs, or virtual private networks, to override protections, my belief—I would welcome the Minister’s view on this—is that the Online Safety Act creates an obligation on platforms to prevent users gaining access to content that is inappropriate for them, regardless of any technical workarounds they may be using. In other words, it is not a defence for a platform to claim that the user had deployed a VPN. Can the Minister confirm this? Needless to say, I am seeking not to downplay the VPN issue but merely to establish clearly where responsibility lies for addressing it.
Secondly, on the use of AI in ways that drive self-harm, obviously AI that encourages suicidal ideation or less extreme forms of self-harm is subject to these controls. But where an AI that is not initially designed for a harmful purpose gradually takes on the role of, say, a psychotherapist or—I am told—in some cases a deity, the conditions become highly propitious for self-harm. Can the Minister comment on how the Act’s protections cover these emergent rather than designed properties? The noble Lord, Lord Addington, put this very well in his question too, and I look forward to hearing the Minister’s views on that.
Thirdly, and more generally, online harms are, of course, created faster than the rules that ban them, and a key part of Ofcom’s role is to monitor for gaps in the legislation as they emerge so that rules can adapt as needed. As far as the Government are aware now, what gaps has Ofcom identified so far in the existing legislation, if any?
We therefore support these regulations to strengthen the Online Safety Act, to better protect UK users from cyber flashing and content encouraging self-harm. We count on the Government to be proactive in ensuring that legislation is kept updated to tackle the changing ways in which unlawful content is proliferated, and to be transparent about the way that they and the regulators balance the broader considerations mentioned. I look forward to the Minister’s response.
Baroness Lloyd of Effra (Lab)
My Lords, I thank noble Lords for their broad support for adding these offences to the priority offences list. This is an important step in improving the online safety regime and improving the environment in which we all use the internet, particularly children and vulnerable people. This will help fulfil the Government’s commitment to improving online safety and strengthening protections for women and girls.
On the points made by the noble Lord, Lord Addington, about tone and proactivity, it is really important that we communicate what we are doing, both in the online world and in terms of violence against women and girls in the physical world. We know that we must all do more to tackle misogynistic abuse, pile-ons, harassment and stalking, and the Government’s whole approach to tackling violence against women and girls is an active one, with real, serious goals. We welcome everyone supporting that move forward. For example, the publication of Ofcom’s guidance, A Safer Life Online for Women and Girls, sets out the steps that services can take to create safer online spaces, and the Government will be setting out our strategy for tackling violence against women and girls in due course as part of that. I think that the publication of Ofcom’s report this morning, which sets out the activity that it has taken and will take, will help raise the profile, as the noble Lord says, of what is expected of services in terms of the urgency and the rigour with which these changes are made.
On the question of VPNs, which we talked about a little earlier, we do not have a huge amount of information or research about their use, particularly by young people to circumvent age assurance. We know that there are legitimate reasons to use VPNs, and we do not have a huge amount of evidence about their use by young people, either very young people or older teenagers. Ofcom and the Government are committed to increasing the research and evidence for how VPNs are being used and whether this is indeed a way that age assurance is being circumvented, or whether it is for what might be legitimate reasons, such as security or privacy reasons. That is an important piece of the evidence puzzle to know exactly what measures to take subsequently.
I am particularly interested in whether it is a legitimate defence for a platform to say, “We could not have prevented this access because a VPN was in use”, and therefore whether it falls to the platforms themselves to figure out how to prevent abuse via VPNs.
(2 months ago)
Lords Chamber
Baroness Lloyd of Effra (Lab)
The noble Lord asks a very good question about our sovereign capabilities. The Sovereign AI Unit’s remit spans the full AI stack, including large language models. Our priority is to secure UK access to the best models, including by deepening strategic partnerships and remaining open to backing UK companies to compete. However, we are focusing our efforts where there is greater opportunity for the UK to advance its strategic position in AI, looking across the value chain. This could mean supporting companies developing narrow models in high-impact sectors in which the UK has strengths, such as defence or drug discovery, or backing paradigm-shifting approaches in computing that can outperform incumbents.
My Lords, in September the Government announced plans for a national digital identity system—a policy that will have very profound implications for the safe use of AI, particularly agentic AI. Can the Minister confirm that the interaction between the Government’s digital identity scheme and AI systems will be explicitly included within the scope of the consultation? If not, can the Minister commit to ensuring that it is?
Baroness Lloyd of Effra (Lab)
The noble Viscount asks about digital ID, as he highlights a proposal which was announced a few months ago. Digital ID will help make it easier for people to access the services they are entitled to and prevent illegal working. It will streamline interactions with the state, saving time and cutting frustrating paperwork. A public consultation on the digital ID will launch in the coming few weeks, to ensure the system is secure, trusted and inclusive. I will take back his specific question on the coverage of the consultation coming up.
(2 months, 3 weeks ago)
Lords Chamber
My Lords, not much we debate in your Lordships’ House unites us so thoroughly as our shared recognition that children must be protected from harmful online content and behaviours. I am delighted that we are as one when it comes to the importance of shielding young people from extreme pornography, content promoting self-harm or suicide, or other serious risks.
This makes it all the more important to scrutinise how the Government and Ofcom have chosen to implement these protections. The role of the draft codes of practice, laid in April this year and brought into effect in July, is to translate Parliament’s intentions into practical rules for service providers. As the noble Lord, Lord Russell, set out so clearly, there are some serious concerns about whether these codes are achieving their stated objectives, and I thank the noble Lord, Lord Clement-Jones, for bringing this important Motion to the House today and for giving us the chance to air our views.
There is some evidence that the codes are being applied in a way that risks overreach and unintended consequences. Some platforms, such as X and Reddit, in attempting to comply, blocked wide-ranging content, including parliamentary debates on grooming gangs and posts relating to the wars in Ukraine and Gaza. Several experts have warned that such overapplication risks stifling legitimate public debate. It has even been suggested that some platforms deliberately overapply some rules as a way to influence government towards weakening them.
The Act was always designed to respect freedom of expression—political and otherwise—while protecting internet users, especially children, from harm. The Government’s own guidance confirms this, but clearly the practical effect to date has not always reflected that intent.
There are also concerns about the complexity and accessibility of the codes. Platforms, parents and, of course, in some instances children themselves may struggle to understand what duties are required and how to enforce them. The guidance is hundreds of pages long and, while Ofcom has issued advice on risk assessments and age-verification measures, there is a real danger that the practical realities of compliance, particularly for smaller providers, leave gaps in protection. Complexity should not become a barrier to the very protections these codes are meant to provide.
We have also been discussing the iterative approach taken by Ofcom. Presenting the codes as a first step, to be refined over time, is in principle essential, for two reasons. The first is that, as we know, this is a pioneering piece of legislation and we must remain open to adapting it. The second is that I am afraid that the people we are up against are inventive users of fast-moving technology.
However, the iterative approach is also clearly creating uncertainty. Civil society organisations have reported that their concerns were not fully addressed during consultation. Children face immediate risks and it is imperative that the Government ensure that these gaps are closed without delay. The noble Lord, Lord Clement-Jones, cited the statistic that, every week, the life of a young person aged between 10 and 19 is lost to suicide in circumstances where technology has been a factor. The codes should not act or be viewed as a ceiling for safety standards. Rather, they must set a floor for safety standards and be subject to firm and measurable enforcement.
Enforcement and proportionality are, of course, critical. The Act grants Ofcom significant powers, including fines, criminal liability and restrictions on financial and commercial arrangements. Yet there are practical challenges to ensuring that these powers are applied in a proportionate and evidence-based way. The critical challenge facing the Government as they operate the Act’s machinery is to protect children while avoiding excessive interference with legitimate content and adult access to lawful material.
All that said, we on these Benches do have questions over the Government’s handling of these codes. Our purpose is to challenge the Government to deliver children’s online safety effectively and proportionately. While I welcome the Minister to her place and wish her the very best for her very important role, particularly in this respect, I ask her for some greater clarity, if she is able to provide it, on three strands of Ofcom’s work. First, how will Ofcom monitor implementation by platforms? Secondly, how will it ensure that civil society is genuinely incorporated, and of course that consultees recognise that they have been listened to? Thirdly, how will it address current gaps in coverage without delay?
I am delighted to be participating in this important debate and to have the opportunity to seek these assurances from the Government. We must see rapid action to ensure that the codes protect children in practice, do not inadvertently suppress legitimate debate, and are accessible and enforceable in the real world. I support the scrutiny behind this regret Motion and hope that, when the Minister rises, she will provide answers that reassure us all that the protection of children online is being delivered with both effectiveness and proportionality.
The Parliamentary Under-Secretary of State, Department for Business and Trade and Department for Science, Innovation and Technology (Baroness Lloyd of Effra) (Lab)
My Lords, I thank noble Lords for their valuable contributions today, and I thank the noble Lord, Lord Clement-Jones, for initiating the debate. I absolutely acknowledge the huge expertise in the Room today. I thank the noble Lord, Lord Russell, for his suggestion of further discussions with individual Members.
I found reading the Secondary Legislation Scrutiny Committee’s report an excellent basis for this discussion. That committee plays a very important role, as do other committees, such as the House of Lords Communications and Digital Committee and the House of Commons Science, Innovation and Technology Committee. The role of ongoing scrutiny by all these bodies is absolutely essential. On the matter of the specific committee that the noble Lord, Lord Russell, mentioned, it would be for the House to decide whether that would be set up to monitor this legislation and the codes.
As others have mentioned, we are working closely with Ofcom to monitor the effectiveness of the Online Safety Act. While the early signs are encouraging, the true test will be whether adults and children are having a safer online experience. Ofcom has put in place a robust monitoring and evaluation programme, tracking changes firms are making in response to regulation, gathering data from the supervised services and commissioning research to measure impact. Some of that research has been mentioned in the course of the debate. It is quite extensive and provides a lot of information to civil society organisations, Members of this House and others.
What binds us together is the determination to do everything we need to do to keep children safe online, as built on the evidence. That is a priority. The previous Secretary of State, in issuing his statement of strategic priorities, made it clear that the first priority was safety by design. That builds on the safety by design measures within the codes, such as the safer design of algorithms to filter out harmful content from children’s feeds. On 25 July, Ofcom published its statement, setting out what it proposes to do in consequence of that statement of strategic priorities. Under the Act, it must publish further annual reviews of what action it has taken as a result of the statement of strategic priorities, including on safety by design.
We have taken action to strengthen the regulatory framework by making further offences priority offences under the Online Safety Act, reflecting the most serious and prevalent illegal content and online activity—for example, laying an SI to make cyberflashing, encouraging self-harm and the sharing of intimate images without consent priority offences under the Act.
Others have mentioned the importance of basing our decisions on good evidence of what is happening. Recognising that further research was required to improve the evidence base, the Government have commissioned a feasibility study to explore the impact of smartphones and social media use on children.
(6 months, 1 week ago)
Grand Committee
My Lords, I want to add a few comments to the discussions on these regulations. I am sorry; I was a little slow off the mark.
I want to say from the outset that I believe we are going to need much more oversight to protect everyone—in particular our children and other vulnerable groups—from tech, particularly relating to online risks. I will say more on this during the passage of the Children’s Wellbeing and Schools Bill, which is currently before the House. The situation is always evolving and, unfortunately, predators always seem to be one step ahead.
I have always felt that internet and tech companies could do more to make their products safer, but choose not to. I know this from personal experience; as I said, I will say more about this in our proceedings on the other Bill before the House. I strongly feel that such companies are complicit in this. It is regrettable that we have to regulate this area in the way we do, but here we are. These draft online safety super-complaints regulations are a welcome piece of the jigsaw. If implemented robustly, they have the potential to contribute meaningfully to a safer and more accountable online environment. I worry, though, and want this to work. I have a few questions for the Minister.
Following on from a concern that other noble Lords have raised, I would appreciate hearing from the Minister whether there is going to be a new ombudsman and how this might be funded. I know that there is funding ring-fenced, but we really need to involve appropriate leadership and expertise. How much is enforcement likely to cost?
As currently worded, the regulations do not seem to allow smaller groups, such as victim support groups or small NGOs, to feed information and complaints into the regulator. I feel that allowing for smaller groups would be beneficial to the online safety of the vulnerable.
Appeals were again raised by the noble Lord, Lord Stevenson, and others. We know that, under these regulations, a group needs to apply to have its case approved to be heard. If it is rejected, is there a mechanism for appealing? Is it correct that summary decisions are the only material published after investigations? If so, I believe that we should offer more transparency than this for the public and for case law.
Finally, what powers would a regulator have against the largest companies with their related resources and well-funded legal departments? I am thinking of Instagram, Facebook, various other social media and internet companies, gaming companies and other tech organisations. Will the regulator really have the powers to enforce punishments and change? I sincerely hope it will. A lot is at stake here; we need to get this right not only for today’s users but for future ones.
My Lords, as we have heard, the purpose of the super-complaints mechanism is to allow eligible entities with expertise in online safety matters, such as civil society groups, to raise systemic issues with Ofcom. Such issues may include instances where the features or conduct of regulated services may be causing significant harm, adversely affecting freedom of expression or otherwise adversely impacting users, members of the public or particular groups.
We welcome the Government’s decision to bring forward these regulations, which will help Ofcom to understand the kinds of risks, issues and threats to users identified by the specified groups. We continue to believe that the regulations strike an effective balance between the need to learn from the experience of users and the need to prioritise the testimony of those with experience, expertise and knowledge when considering complaints. It is important that we construct a feedback mechanism, but it is also important that this mechanism can be wielded by Ofcom in a way that is genuinely helpful and can lead to targeted and effective action. The point about concrete outcomes from the process was well made by the noble Lord, Lord Stevenson; I look forward to hearing the Minister’s remarks on that.
The regulations make it clear that eligible groups must meet a required standard before their complaints will be considered. To be eligible to submit complaints under the regulations, an entity must: represent the interests of users, the public or specific user groups; be independent of regulated services; show expertise in online safety, such as regular expert contributions to public or media discussions; and be expected to consider Ofcom’s guidance in its work. In other words, this feedback mechanism is designed to facilitate communication between Ofcom and independent expert groups. This is right and we very much hope that it will ensure that the case load for Ofcom—I take on board the points and concerns about this—will be such that genuine and proper consideration is given to each complaint raised.
That being said, I hope the Minister can give us some information on how this will be reported back to Parliament. Will we have sight of the volume of cases taken on by Ofcom and will we be able to see how many complaints have been upheld and how many rejected? I appreciate that, as part of the process, while any super-complaint is live, it must be subject to protection from outside interference, but having this information after the fact would make an important metric that noble Lords and Members in the other place will be able to use to assess how well the machinery and Ofcom overall are working. As has been discussed, the regulations are in the public interest and our collective ability to monitor their effectiveness would be greatly aided by this information—particularly in the context of the Minister’s welcome remark about the need for agility in this fast-moving space.
Further to this point, as I said at the beginning of my remarks, the regulations relate to complaints about systemic issues that could negatively affect freedom of expression, pose a risk of harm to the public or cause other adverse effects for users. Can the Minister, when she rises, please share some more information about how users and members of the wider public will be informed about such harms? It seems to me that it is possible to foresee circumstances where, if a complaint is made by an authoritative body to Ofcom under the regulations, it would be wise to warn users and members of the public of this even before Ofcom concludes its investigations, which, as the regulations make clear, could be completed after a period of as many as 105 days. I think that that is the total day count; I may disagree with the noble Baroness, Lady McIntosh, but the point stands in any case.
Does the Minister agree that, if there is a chance of a serious risk being posed to users, the public should know about it as soon as possible? Can she tell us whether there are circumstances in which the Government will issue warnings once complaints are raised, or will they rely on the relevant complainant group to do so? Once Ofcom has concluded its investigations, if it finds that there are risks posed to users, will the Government or Ofcom undertake to inform users at that stage?
Finally—this is, I am afraid, a slightly more trivial question about the mechanics of the eligibility criteria—the fourth criterion for a complainant group
“is that the entity can be relied upon to have due regard to any guidance published by Ofcom”.
Clearly, this is testable in the negative, but can the Minister comment on how entities that have not actively demonstrated unsuitability will be assessed and monitored against this important criterion? Clarity on these points would be much appreciated and would provide us with valuable further information on how the Government envisage using these regulations to keep people safe.
In conclusion, we support the intent behind these regulations and the way in which they have been constructed; I look forward to the Minister’s remarks. We feel that, on the whole, these regulations offer a clear framework for expert, independent entities—
To pick up exactly where I left off, as with any regulatory mechanism, transparency is key to ensuring public trust and parliamentary accountability. We therefore urge the Government to clarify how the outcomes of this process will be communicated to Parliament and the public, particularly where serious harms are identified. Only then can we be confident that this mechanism will not only protect users but uphold the openness and scrutiny that must underpin all aspects of the Online Safety Act.
My Lords, I thank all noble Lords for their valuable contributions to this debate, including those who have rightly identified that we have taken the comments from the stakeholder engagement to heart and made changes to the eventual proposals. I will go through the very many questions that noble Lords have asked. I pay tribute to the work of the Secondary Legislation Scrutiny Committee; we welcome its report and the scrutiny it has given to our proposals.
In no particular order, I will first pick up the question of scrutiny. The noble Lord, Lord Stevenson, asked about Parkinson’s law—if I can put it that way. We have spoken about this and there have been a number of different discussions about it. We recognise that the Science, Innovation and Technology Committee and the Lords Communications and Digital Committee play a vital role in scrutinising the regime. The SI was shared with those committees in advance. He will know that Parkinson’s law is not as emphatic as it might be—it is a caveated law—but we nevertheless take on board the concerns raised about it and have met the chairs of those committees to talk about how we can take these issues forward. We have had a very good dialogue with them, on the understanding that we do not want to delay what can sometimes be very important and game-changing regulations by having a long extra scrutiny process. Nevertheless, we are trying to find a way to resolve this issue and discussions are continuing with officials.
(7 months, 1 week ago)
Lords Chamber
My Lords, I declare an interest as the chair of the Authors’ Licensing and Collecting Society. We should all be grateful to the noble Lord, Lord Berkeley, for the very gracious way he introduced his amendment, particularly given the history of this inter-House discussion.
Whether it is betrayal, disrespect, negligence, bloody-mindedness, a bad dream or tone-deafness, whatever the reality, we find ourselves once again in this Chamber debating an issue that should have been settled long ago. I share the profound anger and frustration expressed by the noble Baroness, Lady Kidron, and admire her unwavering determination, even if she, for very honourable reasons, will not be voting today. As she pointed out, the Prime Minister, who entertained the tech industry at Chequers and Downing Street, is complicit in the situation we are in today.
We are here today because the Government have point-blank refused to move, repeatedly presenting the same proposition on three occasions while this House, by contrast, has put forward a series of genuine solutions in an attempt to find a way forward, as the noble Lord, Lord Forsyth, pointed out. The only new element seems to be a promise of a cross-party parliamentary working party, but what is so enticing about merely more talking when action is desperately needed?
Amendment 49U, tabled by the noble Lord, Lord Berkeley, and designed to amend the Copyright, Designs and Patents Act 1988, is a reasoned compromise. It requires identifying the copyrighted works and the means by which they were accessed, unless the developer has obtained a licence. That seems to be a fair trade-off. The noble Lord also pointed out that Minister Bryant has rather inadvertently made it clear that today’s amendment does not invoke financial privilege on this occasion. The Government argue that legislating piecemeal would be problematic, but the historical precedent of the Napster clause in the Digital Economy Act 2010 demonstrates that Parliament can and should take powers to act when a sector is facing an existential threat. There is an exact parallel with where we are today.
This is not about picking a side between AI and creativity, as we have heard across the House today. It is about ensuring that both can thrive through fair collaboration based on consent and compensation. We must ensure that the incentive remains for the next generation of creators and innovators. Given how Ministers have behaved in the face of the strength of feeling of the creative industries, how can anyone in those industries trust this Government and these Ministers ever again? Given their instincts to appease big tech, I suspect not. I do not regard the noble Baroness, Lady Jones, as personally responsible in this respect, but I hope she feels ashamed of her colleagues in the Commons, of the behaviour of her department and of her Government. In this House we will not forget.
There is still time for the Government to listen, to act and to secure a future where human creativity is not plundered but valued and protected. If the noble Lord, Lord Berkeley, chooses to put this to a vote, on these Benches we will support him to the hilt. I urge all noble Lords from all Benches, if he does put it to a vote, to support the UK creative industries once again.
My Lords, as everybody has said, it is deeply disappointing that we once again find ourselves in this position. The noble Baroness, Lady Kidron, has brought the concerns of copyright owners to the attention of the Government time and again. Throughout the passage of the Bill, the Government have declined to respond to the substance of those concerns and to engage with them properly. As I said in the previous round of ping-pong—I am starting to lose count—the uncertainty caused by the continued delay to this Bill is hurting all sides. Even businesses in industries far removed from concerns about AI and copyright are waiting for the data Bill. It has been delayed because of the Government’s frankly stubborn mismanagement of the Bill.
I understand completely why the noble Lord, Lord Berkeley of Knighton, feels sufficiently strongly about how the Government have acted to move his very inventive amendment. It strikes at the heart of how this Government should be treating your Lordships’ House. If Ministers hope to get their business through your Lordships’ House in good order, they will rely on this House trusting them and collaborating with them. I know that these decisions are often made by the Secretary of State. I have the highest respect for the Minister, but this is a situation of the Government’s making. I note in passing that it was very disappointing to read that the Government’s planned AI Bill will now be delayed by at least a year.
All that said, as the Official Opposition we have maintained our position, as ping-pong has progressed, that protracted rounds of disagreement between the other place and your Lordships’ House should be avoided. This situation could have been avoided if the Government had acted in good faith and sought compromise.
My Lords, I thank noble Lords for their contributions. I repeat again our absolute commitment to the creative sector and our intention to work with it to help it flourish and grow. This is London Tech Week. All Ministers, including me and my colleagues, have been involved in that, showcasing the UK’s rising tech talent to the world. I do not feel I should apologise for our involvement with the tech sector in that regard.
(7 months, 2 weeks ago)
Lords Chamber
My Lords, I once again declare an interest as chair of the Authors’ Licensing and Collecting Society, and once again give the staunch support of these Benches to the noble Baroness, Lady Kidron, on her Motion A1. She made an incontestable case once again with her clarion call.
I follow the noble Lord, Lord Russell, and others in saying that we are not in new territory. I have a treasured cartoon on my wall at home that relates to the passage of the Health and Social Care Bill as long ago as 2001, showing Secretary of State Alan Milburn recoiling from ping-pong balls. Guess who was hurling the ping-pong balls? The noble Earl, Lord Howe, that notable revolutionary, and I were engaging in rounds of parliamentary ping-pong—three, I think. Eventually, compromises were reached and the Bill received Royal Assent in April 2001.
What we have done in the past and what we are going to do today as a House is not unprecedented. There is strong precedent for all Benches to work together on ping-pong to rather good effect. As the noble Baroness, Lady Kidron, says, what we are proposing today will not, in the words of the Minister, “collapse” the Bill: it will be the Government’s choice what to do when the Bill goes back to the Commons. I hugely respect the noble Lord, Lord Knight, but I am afraid that he is wrong. It was not a manifesto commitment; there is no Salisbury convention that can be invoked on this occasion. It has nothing at all to do with data adequacy except that the Government feel that they have to get the Bill through in order to get the EU Commission to start its work. If anything, the Bill makes data adequacy more difficult. I say to the noble Lord, Lord Brennan, that I agree with almost everything he said: everything he said was an argument for the noble Baroness’s amendment. Once again, as ever, I agree with the noble Lord, Lord Stevenson, as I so often do on these occasions. I regard him as the voice of reason, and I very much hope that the Government will listen to what he has to say.
Compromise is entirely within the gift of the Government. The Secretary of State should take a leaf out of Alan Milburn’s book. He did compromise on an important Bill in key areas and saw his Bill go through. I am afraid to say that the letter that Peers have received from the Minister is simply a repeat of her speech on Monday, which was echoed by Minister Bryant in the Commons yesterday. The Government have tabled these new amendments, which reflect the contents of that letter. Despite those amendments, however, the Government have not offered a concession to legislate for mandated transparency provisions within the Bill, which has been the core demand of the Lords amendments championed by the noble Baroness, Lady Kidron, for the reasons set out in the speeches we have heard today.
In the view of these Benches, the noble Baroness, Lady Kidron, other Members of this House, and countless creatives have made the absolutely convincing case for a transparency duty which would not prejudge the outcome of the AI and copyright consultation. We have heard the chilling points made by the noble Lords, Lord Russell and Lord Pannick, about US policy in this area and about the attitude of the big tech companies towards copyright. We are at a vital crossroads in how we ensure the future of our creative industries. In the face of the development of AI and how it is being trained, we must take the right road, and I urge the Government to settle now.
My Lords, given where we are, I will speak very briefly, but I will make just two points. First, I think it is worth saying that the uncertainty surrounding where we are with AI and copyright is itself damaging, not just to the creative sector, not just to AI labs and big tech in general, but to all those who will themselves be impacted by the Bill’s many other provisions. Overall, I think it is worth reminding ourselves that this is an important Bill whose original conception did not even address AI and copyright. It carried very important and valuable provisions—as the Minister pointed out in her opening remarks—on digital verification services, smart data schemes, the national underground asset register and others. These can genuinely drive national productivity. Indeed, that is why my party proposed them when we were in government. It is, therefore, deeply frustrating that the Government have not yet found a way forward on this, and I am afraid that I very much agree with the noble Lord, Lord Knight. The way the Government have gone about this has been reprehensible: I think that is the word I would use.