Data (Use and Access) Bill [HL] Debate
Lord Clement-Jones (Liberal Democrat - Life peer), debates with the Department for Science, Innovation & Technology
Lords Chamber

My Lords, it is a pleasure to open the second day on Report on the Data (Use and Access) Bill. In doing so, I declare my technology interests as set out in the register, not least as an adviser to Socially Recruited, an AI business. In moving Amendment 38 in my name, I will not speak to any other amendments in this group.
Amendment 38 goes to the heart of the issue du jour: regulators have seldom been so much in the press and in the public eye. As the press would have it, they were hauled into No. 11 just a few days ago, but this speaks to what we want from our regulators across our economy and society. At their best, our regulators are the envy of the world. Just consider the FCA's fintech regulatory sandbox: as a measure of its success, it has been replicated in well over 50 jurisdictions around the world.
We know how to do right-sized regulation and how to set up our regulators to succeed at that most difficult of tasks: balancing innovation, economic growth, and consumers’ and citizens’ rights. That is what all regulators should be about. It is not straightforward; it is complex but entirely doable.
Amendment 38 simply proposes wording to assist the Information Commissioner’s Office. When it comes to the economic growth duty—“#innovation”—it simply refers back to Section 108 of the 2015 Act. I believe that bringing this clarity into the Bill will assist the regulator and enable all the conversations that are rightly going on right now, and all the plans that are being produced and reported on, such as those around AI, to be properly discussed and given proper context, with an Information Commissioner’s Office that is supported through clarity as to its responsibilities and obligations when it comes to economic growth. In simple terms, this would mean that these responsibilities are restricted and clearly set out according to Section 108 of the 2015 Act. It is critical that this should be the case if we are to have clarity around the commissioner’s independence as a supervisory authority on data protection, an absolutely essential condition for EU adequacy decisions.
I look forward to the Minister’s response. I hope that he likes my drafting. I hope that he will accept and incorporate my amendment into the Bill. I look forward to the debate. I beg to move.
My Lords, I rise to support Amendment 38 in the name of the noble Lord, Lord Holmes. More than ever before, the commissioner, alongside other regulators, is being pressured to support the Government’s growth and innovation agenda. In Clause 90, the Bill places unprecedented obligations on the ICO to support innovation. The question, in respect of both the existing growth duty and Clause 90, is whether they are in any sense treated as overriding the ICO’s primary responsibilities in data protection and information rights. How does the ICO aim to balance those duties, ensuring that its regulatory actions support economic growth while maintaining necessary protections?
We need to be vigilant. As it is, there are criticisms of the way the Information Commissioner’s Office carries out its existing duties. Those criticisms can be broadly categorised into issues with enforcement, independence and the balancing of competing interests. The ICO has a poor record on enforcement: as I described in Committee, it has been reluctant to issue fines, particularly to public sector organisations, and has relied heavily on reprimands rather than stronger enforcement actions. It has also been accused of being too slow with its investigations.
There are concerns about these new duties, which could pose threats to the ability of the Information Commissioner’s Office to effectively carry out its primary functions. For that reason, we support the amendment from the noble Lord, Lord Holmes.
My Lords, I support the amendment in the name of the noble Baroness, Lady Kidron, to which I have added my name. I will speak briefly because I wish to associate myself with everything that she has said, as is normal on these topics.
Those of us who worked long and hard on the Online Safety Act had our fingers burnt quite badly when things were not written into the Bill. While I am pleased—and expect to be even more pleased in a few minutes—that the Government are in favour of some form of code of conduct for edtech, whether through the age-appropriate design code or not, I am nervous. As the noble Baroness, Lady Kidron, said, every day with Ofcom we are seeing the risk-aversion of our regulators in this digital space. Who can blame them when it appears to be the flavour of the month to say that, if only the regulators change the way they behave, growth will magically come? We have to be really mindful that, if we ask the ICO to do this vaguely, we will not get what we need.
The noble Baroness, Lady Kidron, as ever, makes a very clear case for why it is needed. I would ask the Minister to be absolutely explicit about the Government’s intention, so that we are giving very clear directions from this House to the regulator.
My Lords, it is a pleasure to follow the noble Baroness, Lady Harding. I have added a few further words to my speech in response, because she made an extremely good point. I pay tribute to the noble Baroness, Lady Kidron, and her tenacity in trying to make sure that we secure a code for children’s data and education, which is so needed. The education sector presents unique challenges for protecting children’s data.
Like the noble Baronesses, Lady Kidron and Lady Harding, I look forward to what the Minister has to say. I hope that whatever is agreed is explicit; I entirely agree with the noble Baroness, Lady Harding. I had my own conversation with the Minister about Ofcom’s approach to categorisation which, quite frankly, does not follow what we thought the Online Safety Act was going to imply. It is really important that we absolutely tie down what the Minister has to say.
The education sector is a complex environment. The existing regulatory environment does not adequately address the unique challenges posed by edtech, as we call it, and the increasing use of children’s data in education. I very much echo what the noble Baroness, Lady Kidron, said: children attend school for education, not to be exploited for data mining. Like her, I cross over into considering the issues related to the AI and IP consultation.
The worst-case scenario is an opt-in system that might incentivise learners or parents to consent, whether that consent is given to state educational institutions, to exam boards such as Pearson or to any other entity. I hope that, in the other part of the forest, so to speak, that will not take place to the detriment of children. In the meantime, I very much look forward to what the Minister has to say on Amendment 44.
My Lords, I thank the noble Baroness, Lady Kidron, for moving her amendment. Before I begin, let me declare my interest as a recently appointed director of Lumi, an edtech provider—but for graduates, not for schools.
AI has the potential to revolutionise educational tools, helping teachers spend less time on marking and more time on face-to-face teaching with children, creating more innovative teaching tools and exercises and facilitating more detailed feedback for students. AI presents a real opportunity to improve education outcomes for children, opening more opportunities throughout their lives. There are deeply compelling promises in edtech.
However—there is always a however when we talk about edtech—creating and using AI education tools will require the collection and processing of children’s personal data. This potentially includes special category data—for instance, medical information pertaining to special educational needs such as dyslexia. Therefore, care must be taken in regulating how this data is collected, stored, processed and used. Without this, AI poses a major safeguarding risk. We share the concerns of the noble Baroness, Lady Kidron, and wholeheartedly support the spirit of her amendment.
We agree that it is prudent to require the ICO to make a code of practice on children’s data and education, and I particularly welcome a requirement on the ICO to consult with and involve parents. Parents know their children best, needless to say, and have their best interests at heart; their input will be critical in building trust in AI-assisted educational tools and facilitating their rollout and benefits for children throughout the UK.
However, as I said earlier on Report—and I shall not repeat the arguments now—we have concerns about the incorporation of international law into our law, and specifically, in this instance, the UN Convention on the Rights of the Child. We cannot therefore support the amendment as drafted. That said, we hope very much that the Government will listen carefully to the arguments raised here and take steps to introduce appropriate safeguards for children and young people in our data legislation regime. I suspect that most parents will greatly welcome more reassurance about the use of their children’s data.
My Lords, I can be pretty brief. We have had some fantastic speeches, started by the noble Baroness, Lady Kidron, with her superb rallying cry for these amendments, which we 100% support on these Benches. As she said, there is cross-party support. We have heard support from all over the House and, as the noble and learned Baroness, Lady Butler-Sloss, has just said, there has not been a dissenting voice.
I have a long association with the creative industries and with AI policy and yield to no one in my enthusiasm for AI—but, as the noble Baroness said, it should not come at the expense of the creative industries. It should not just be for the benefit of DeepSeek or Silicon Valley. We are very clear where we stand on this.
I pay tribute to the Creative Rights in AI Coalition and its campaign, which has been so powerful in garnering support, and to all those in the creative industries and creators themselves who briefed noble Lords for this debate.
These amendments respond to deep concerns that AI companies are using copyright material without permission or compensation. I do not believe that the Government’s preferred option in their new consultation, a text and data mining exemption with an opt-out that we thought was settled under the previous Government, is a mere straw man. It starts from the false premise of legal uncertainty, as we have heard from a number of noble Lords. As the News Media Association has said, the Government’s consultation is based on a mistaken idea, promoted by tech lobbyists and echoed in the consultation, that there is a lack of clarity in existing copyright law. This is completely untrue. The use of copyrighted content without a licence by gen AI firms is theft on a mass scale, and there is no objective case for a new text and data mining exception.
No effective opt-out system for the use of content by gen AI models has been proposed or implemented anywhere in the world, making the Government’s proposals entirely speculative. It is vital going forward that we ensure that AI companies cannot use copyrighted material without permission or compensation; that AI development does not exploit loopholes to bypass copyright laws; that AI developers disclose the sources of the data they use for training their models, allowing for accountability and addressing infringement; and that we reinforce the existing copyright framework, rather than creating new exceptions that disadvantage creators.
These amendments would provide a mechanism for copyright holders to contest the use of their work and ensure a route for payment. They seek to ensure that AI innovation does not come at the expense of the rights and livelihoods of creators. There is no market failure. We have a well-established licensing system as an alternative to the Government’s proposed opt-out scheme for AI developers using copyrighted works. A licensing system is the only sustainable solution that benefits both creative industries and the AI sector. We have some of the most effective collective rights organisations in the world. Licensing is their bread and butter. Merely because AI platforms are resisting claims does not mean that the law in the UK is uncertain.
Amending UK law to address the challenges posed by AI development, particularly in relation to copyright and transparency, is essential to protect the rights of creators, foster responsible innovation and ensure a sustainable future for the creative industries. The law should apply to any developer that markets its product in the UK, regardless of the country in which the scraping of copyright material or the training takes place. It would also ensure that AI start-ups based in the UK are not put at a competitive disadvantage by international firms’ ability to conduct training in a different jurisdiction.
As we have heard throughout this debate, it is clear that the options proposed by the Government have no proper economic assessment underpinning them, no technology for an opt-out underpinning them and no enforcement mechanism proposed. It baffles me why the Conservative Opposition is not supporting these amendments, and I very much hope that the voices we have heard on the Conservative Benches will make sure that these amendments pass with acclamation.
I thank the noble Baroness, Lady Kidron, for moving this incredibly important group and all those speakers who have made the arguments so clearly and powerfully. I pay tribute to the noble Baroness’s work on copyright and AI, which is so important for our arts and culture sector. As noble Lords have rightly said, our cultural industries make an enormous contribution to our country, not just in cultural terms but in economic ones, and we must ensure that our laws do not put that future at risk.
In the build-up to this debate I engaged with great pleasure with the noble Baroness, Lady Kidron, and on these Benches we are sympathetic to her arguments. Her Amendment 61 would require the Government to make regulations in this area. We accept the Government’s assurance that this is something they will seek to address, and I note the Minister’s confirmation that their consultation will form the basis of the Government’s approach to this issue. Given the importance of getting this right, and with the Government’s consultation in mid-flight, our view is that we have to allow it to do its work. Whatever view we take of the design and the timing of the consultation, it offers for now a way forward that will gather evidence on some of the serious concerns expressed here. That said, we will take a great interest in the progress and outcomes of the consultation and will come back to this in future should the Government’s approach prove unsatisfactory.
Amendment 75 in my name also seeks to address the challenge that the growth in AI poses to our cultural industries. One of the key challenges in copyright and AI is enforceability. Copyright can be enforced only when we know it has been infringed. The size and the international distribution of AI training models render it extremely challenging to answer two fundamental questions today: first, was a given piece of content used in a training model; and secondly, if so, in what jurisdiction did that use take place? If we cannot answer these questions, enforcement can become extremely hard, so a necessary, if not sufficient, part of the solution will be a digital watermark—a means of putting some red dye in the water where copyrighted material is used to train AIs. It could also potentially provide an automated means for content creators to opt out, with a vastly more manageable administrative burden.
I thank the Minister for his constructive engagement on digital watermarking and look to him to give the House an assurance that the Government will bring forward a plan to develop a technological standard for a machine-readable digital watermark. I hope that, if and when he does so, he is able to indicate both a timeline and an intention to engage internationally. Subject to receiving such reassurances when he rises, I shall not move my amendment.
My Lords, Amendment 46 seeks a review of court jurisdiction. As I said in Committee, the current system’s complexity leads to confusion regarding where to bring data protection claims—tribunals or courts? This is exacerbated by contradictory legal precedents from different levels of the judiciary, and it creates barriers for individuals seeking to enforce their rights.
Transferring jurisdiction to tribunals would simplify the process and reduce costs for individuals, and it would align with the approach for statutory appeals against public bodies, which are typically handled by tribunals. In the Killock v Information Commissioner case, Mrs Justice Farbey explicitly called for a “comprehensive strategic review” of the appeal mechanisms for data protection rights. That is effectively what we seek to do with this amendment.
In Committee, the noble Baroness, Lady Jones, raised concerns about transferring jurisdiction and introducing a new appeals regime. She argued that the tribunals lacked the capacity to handle complex data protection cases, but tribunals are, in fact, better suited to handle such matters due to their expertise and lower costs for individuals. Additionally, the volume of applications under Section 166—“Orders to progress complaints”—suggests significant demand for tribunal resolution, despite its current limitations.
The noble Baroness, Lady Jones, also expressed concern about the potential for a new appeal right to encourage “vexatious challenges”, but introducing a tribunal appeal system similar to the Freedom of Information Act could actually help filter out unfounded claims. This is because the tribunal would have the authority to scrutinise cases and potentially dismiss those deemed frivolous.
The noble Baroness, Lady Jones, emphasised the existing judicial review process as a sufficient safeguard against errors by the Information Commissioner. However, judicial review is costly and complex, presenting a significant barrier for individuals. A tribunal system would offer a much more accessible and less expensive avenue for redress.
I very much hope that, in view of the fact that this is a rather different amendment—it calls for a review—the Government will look at this. It is certainly called for by the judiciary, and I very much hope that the Government will take this on board at this stage.
I thank the noble Lord, Lord Clement-Jones, for moving his amendment, which would require the Secretary of State to review the potential impact of transferring to tribunals the courts’ jurisdiction over all data protection provisions. As I argued in Committee, courts have a long-standing authority and expertise in resolving complex legal disputes, including data protection cases, and removing the jurisdiction of the courts could risk undermining the depth and breadth of legal oversight required in such critical areas.
That said, as the noble Baroness, Lady Jones of Whitchurch, said in Committee, we have a mixed system of jurisdiction for legal issues relating to data, and tribunals have an important role to play. So, although we agree with the intentions behind the amendment from the noble Lord, Lord Clement-Jones, we do not support the push to transfer all data protection provisions from the courts to tribunals, as we believe that there is still an important role for courts to play. Given the importance of the role of the courts in resolving complex cases, we do not feel that this review is necessary.
My Lords, before the noble Viscount sits down, I wonder whether he has actually read the amendment; it calls for a review, not for transfer. I think that his speech is a carryover from Committee.
I thank the noble Lord, Lord Clement-Jones, for Amendment 46. It would require a review of the impact of transferring all data protection-related cases to the relevant tribunals. Currently there is a mixture of jurisdictions for tribunals and courts for data protection cases, depending on the nature of the proceedings. This is on the basis that certain claims are deemed appropriate for tribunal, while others are appropriate for courts, where stricter rules of evidence and procedure apply—for example, in dealing with claims by data subjects against controllers for compensation due to breaches of data protection legislation. As such, the current system already provides clear and appropriate administrative and judicial redress routes for data subjects seeking to exercise their rights.
Tribunals are in many cases the appropriate venue for data protection proceedings, including appeals by controllers against enforcement action or applications by data subjects for an order that the ICO should progress a complaint. Claims by individuals against businesses or other organisations for damages arising from breach of data protection law fall under the jurisdiction of courts rather than tribunals. This is appropriate, given the likely disparity between the resources of the respective parties, because courts apply stricter rules of evidence and procedures than tribunals. While court proceedings can, of course, be more costly, successful parties can usually recover their costs, which would not always be the case in tribunals.
I hope that the noble Lord agrees that there is a rationale for these different routes and that a review to consider transfer of jurisdictions to tribunals is therefore not necessary at this time.
My Lords, I thank the Minister for that dusty reply. I wonder whether he has been briefed about particular legal cases, such as Killock or Delo, where the judiciary themselves were confused about the nature of the different jurisdictions of tribunal and court. The Minister and, indeed, the noble Viscount, Lord Camrose, seemed to make speeches on the basis that all is wonderful and the jurisdiction of the courts and tribunals is so clearly defined that we do not need a review. That is not the case and, if the Minister were better briefed about the obiter, if not the judgments, in Delo and Killock, he might appreciate that there is considerable confusion about jurisdiction, as several judges have commented.
I am very disappointed by the Minister’s reply. I think that there will be several judges jumping up and down, considering that he has not really looked at the evidence. The Minister always says that he is very evidence-based. I very much hope that he will take another look at this—or, if he does not, that the MoJ will—as there is considerably greater merit in the amendment than he accords. However, I shall not press this to a vote and I beg leave to withdraw the amendment.
My Lords, I too support this. I well remember the passage of the Computer Misuse Act, and we were deeply unhappy about some of its provisions defining hacker tools et cetera, because they said nothing about intention. The Government simply said, “Yes, they will be committing an offence, but we will just ignore it if they are good people”. Leaving it to faceless people in some Civil Service department to decide who is good or bad, with nothing in the Bill, is not very wise. We were always deeply unhappy about it but had to go along with it because we had to have something; otherwise, we could not do anything about hacking tools being freely available. We ended up with a rather odd situation in which there is no defence available to the good guys. This is a very sensible amendment to clean up an anomaly that has been sitting in our law for a long time and should probably have been cleaned up long ago.
My Lords, I support Amendments 47 and 48, which I was delighted to see tabled by the noble Lords, Lord Holmes and Lord Arbuthnot. I have long argued for changes to the Computer Misuse Act. I pay tribute to the CyberUp campaign, which has been extremely persistent in advocating these changes.
The CMA was drafted some 35 years ago—an age ago in computer technology—when internet usage was much lower and cybersecurity practices much less developed. This makes the Act in its current form unfit for the modern digital landscape and inhibits security professionals from conducting legitimate research. I will not repeat the arguments made by the two noble Lords. I know that the Minister, because of his digital regulation review, is absolutely apprised of this issue, and if he were able to make a decision this evening, I think he would take them on board. I very much hope that he will express sympathy for the amendments, however he wishes to do so—whether by giving an undertaking to bring something back at Third Reading or by doing something in the Commons. Clearly, he knows what the problem is. This issue has been under consideration for a long time, in the bowels of the Home Office—what worse place is there to be?—so I very much hope that the Minister will extract the issue and deal with it as expeditiously as he can.
I thank my noble friend Lord Holmes for tabling the amendment in this group. I, too, believe these amendments would improve the Bill. The nature of computing and data processing has fundamentally changed since the Computer Misuse Act 1990. Third parties hold and process immense quantities of data, and the means of accessing and interacting with that data have become unrecognisably more sophisticated. Updating the definition of unauthorised computer access through Amendment 48 is a sensible reform, as this new definition takes into account that data controllers and processors now hold substantial quantities of personal data. These entities are responsible for the security of the data they hold, so their provisions on access become legally relevant and this amendment reflects this.
When updating an offence, it is equally necessary to consider the legal defences, as my noble friend has rightly done in Amendment 47 by protecting individuals accessing information to detect or prevent a crime or whose actions are in the public interest. We on these Benches feel these amendments are wholly sensible. I urge the Minister to listen to the persuasive argument that my noble friend Lord Holmes has made and consider how we can deliver these improvements to our data legislation.
My Lords, I will speak to Amendment 48B. In our view, cookie paywalls create an unfair choice for users, essentially forcing them to pay for privacy. We tabled an amendment in Committee to ban cookie paywalls, but in the meantime, as the noble Baroness, Lady Jones, heralded at the time, the Information Commissioner’s Office has provided updated guidance on the “consent or pay” model for cookie compliance. It is now available for review. This guidance clarifies how organisations can offer users a choice between accepting personalised ads for free access or paying for an ad-free experience while ensuring compliance with data protection laws. It has confirmed that the “consent or pay” model is acceptable for UK publishers, provided certain conditions are met. Key requirements for a valid consent under this model include: users must have genuine free choice; the alternative to consent—that is, payment—must be reasonably priced; and users must be fully informed about their options.
The guidance is, however, contradictory. On the one hand, it says that cookie paywalls
“can be compliant with data protection law”
and that providers must document their assessments of how it is compliant with DPL. On the other, it says that, to be compliant with data protection law, cookie paywalls must allow users to choose freely without detriment. However, users who do not wish to pay the fee to access a website will be subject to detriment, because with a cookie paywall they will pay a fee if they wish to refuse consent. The guidance addresses this as the “power imbalance”. It is also worth noting that this guidance does not constitute legal advice; it leaves significant latitude for legal interpretation and argument as to the compatibility of cookie paywalls with data protection law.
The core argument against “consent or pay” models is that they undermine the principle of freely given consent. The ICO guidance emphasises that organisations using these models must be able to demonstrate that users have a genuine choice and are not unfairly penalised for refusing to consent to data processing for personalised advertising. Yet in practice, given the power imbalance, on almost every occasion this is not possible. This amendment seeks to ensure that individuals maintain control over their personal data. By banning cookie paywalls, users can freely choose not to consent to cookies without having to pay a fee. I very much hope that the Government will reconsider the ICO’s guidance in particular, and consider banning cookie paywalls altogether.
My Lords, I thank my noble friend Lord Lucas for introducing this group. Amendments 48A and 50A, in his name, would ensure that regulated professionals, including financial services firms, are able to comply with current and future regulatory requirements. The example my noble friend has given—the FCA’s expectation that firms communicate effectively with consumers—is a good one. Clearly, we must avoid a circumstance where regulators expect businesses to take action that is not possible due to limiting legislation governing data use and access. My noble friend has made a forceful case and I hope the Government will be able to give the House appropriate assurance that businesses will not be put in this position as a result of this legislation.
Amendment 48B, in the name of the noble Lord, Lord Clement-Jones, seeks to ban cookie paywalls. I opposed a similar amendment when we debated it in Committee as it actually seeks to curtail choice. Currently, users have the options to pay money and stay private, share personal data and read for free, or walk away. Faced with these options, for instance, I have sadly chosen to forgo my regular evening reading of the Daily Mail’s excellent sports pages, but I see no reason why that newspaper, or anyone else, should be compelled to provide anything for free. In fact, it has been very persuasively argued by Jaron Lanier, Shoshana Zuboff and many others that it is the fact that so much of the internet is apparently, but not actually, free that has caused a great deal of damage, rather than having an open charging model. This approach finally reveals the exact cash value of individuals’ data that websites are harvesting and offers users choice. We do not agree with attempts to remove that choice.
My Lords, these amendments have to do with research access for online safety. Having sat on the Joint Committee on the draft Online Safety Bill back in 2021, I put on record that I am delighted that the Government have taken the issue of research access to data very seriously. It was a central plank of what we suggested and it is fantastic that they have done it.
Of the amendments in my name, Amendment 51 would simply ensure that the provisions of Clause 123 are acted on by removing the Government’s discretion as to whether they introduce regulations. It also introduces a deadline of 12 months for the Government to do so. Amendment 53 seeks to ensure that the regulators will enable independent researchers to research how online risks and harms impact different groups, especially vulnerable users, including children. Given the excitements we have already had this evening, I do not propose to press any of them, but I would like to hear from the Minister that he has heard me and that the Government will seek to enshrine the principle of different ages, different stages, different people, when he responds.
I note that the noble Lord, Lord Bethell, who has the other amendments in this group, to which I added my name, is not in his place, but I understand that he has sought—and got—reassurance on his amendments. So there is just one remaining matter on which I would like further reassurance: the scope of the legal privilege exception. A letter from the Minister on 10 January explains:
“The clause restates the existing law on legally privileged information as a reassurance that regulated services will not be asked to break the existing legislation on the disclosure of this type of data”.
It seems that the Minister has veered tantalisingly close to answering my question, but not in a manner that I can quite understand. So I would really love to understand—and I would be grateful to the Minister if he would try to explain to me—how the Government will prevent tech companies using legal privilege as a shield. Specifically, would CCing a lawyer on every email exchange, or having a lawyer in every team, allow companies to prevent legitimate scrutiny of their safety record? I have sat in Silicon Valley headquarters and each team came with its own lawyer—I would really appreciate clarity on this issue. I beg to move.
My Lords, I can only support what the noble Baroness, Lady Kidron, had to say. This is essentially unfinished business from the Online Safety Act, which we laboured in the vineyard to deliver some time ago. These amendments aim to strengthen Clause 123 and try to make sure that this actually happens and that we do not get the outcomes of the kind that the noble Baroness has mentioned.
I, too, have read the letter from the Minister to the noble Lord, Lord Bethell. It is hedged about with a number of qualifications, so I very much hope that the Minister will cut through it and give us some very clear assurances, because I must say that I veer back and forth when I read the paragraphs. I say, “There’s a win”, and then the next paragraph kind of qualifies it, so perhaps the Minister will give us true clarity when he responds.
My Lords, I wanted to add something, having spent a lot of time on Part 3 of the Digital Economy Act, which, after many assurances and a couple of years, the Executive decided not to implement, against the wishes of Parliament. It worries me when the Executive suddenly feel that they can do those sorts of things. I am afraid that leopards sometimes do not change their spots, and I would hate to see this happen again, so Amendment 51 immediately appeals. Parliament needs to assert its authority.