Commons Chamber
I would like to thank colleagues in the other place and in this House who have worked so hard to improve the Bill. By modernising data infrastructure and governance, this Bill seeks to unlock the secure, efficient use of data while promoting innovation across sectors. As a tech evangelist, as well as the Chair of the Science, Innovation and Technology Committee, I welcome it, and I am pleased to see colleagues from the Select Committee, my hon. Friend the Member for Stoke-on-Trent South (Dr Gardner) and the right hon. Member for North West Hampshire (Kit Malthouse), here for this debate.
Having spent many unhappy hours when working for Ofcom trying to find out where British Telecom’s ducts were actually buried, I offer a very personal welcome to the national underground asset register, and I thank the Minister for his work on this Bill as well as for his opening comments.
I agree with the Minister that there is much to welcome in this Bill, but much of the Second Reading debate was consumed by discussion on AI and copyright. I know many Members intend to speak on that today, so I will just briefly set out my view.
The problem with the Government’s proposals on AI and copyright is that they give all the power to the tech platforms, which—let us be frank—have a great deal of power already, as well as trillions of dollars in stock market capitalisation and a determination to return value to their shareholders. What they do not have is an incentive to design appropriate technology for transparency and rights reservation if they believe that in its absence they will have free access to our fantastic creators’ ingenuity. It is essential that the Minister convinces them that if they do not deliver this technology—I agree with him that it is highly possible to do so—then he will impose it.
Perhaps the Minister could announce an open competition, with a supplier contract as the prize, for whichever innovative company designs something. The Science, Innovation and Technology Committee, sitting with the Culture, Media and Sport Committee, heard from small companies that can do just that. The tech giants might not like it, but I often say that the opposite of regulation is not no regulation—it is bad regulation. If the tech platforms do not lead, they will be obliged to follow because the House will not allow the copyright of our fantastic creators to be put at risk. The Minister knows that I think him extremely charismatic and always have done, but I do not believe that “Chris from DSIT” can prevail against the combined forces of Björn from Abba and Paul from The Beatles.
The prospects for human advancement opened by using data for scientific research are immense. As a world-leading science powerhouse, the UK must take advantage of them. That is why, despite being a strong advocate of personal data rights, I welcome the Bill’s proposals to allow the reuse of data without consent for the purposes of scientific research. I am concerned, however, that the exemption is too broad and that it will be exploited by data-hungry tech companies that are not truly advancing the cause of scientific progress but are simply, as with copyright, training their AI models.
Huge amounts of data are already collected by platforms, such as direct messages on Instagram, or via web-scraping of any website that contains an individual’s personal data, such as published records or people’s public LinkedIn pages. We know it can be misused because it has been, most recently with Meta’s controversial decision to use Instagram user data to train AI models, triggering an Information Commissioner’s Office response because of the difficulty users encountered in objecting. Then there is the risk from data collected via tracking cookies or the profiling of browsing behaviour, which companies such as Meta use to fingerprint people’s devices and track their browsing habits. Could the data used to create ads also be freely reusable under this exemption? The US tech firm Palantir has the contract for the NHS federated data platform. Amnesty International has already raised concerns about the potential for patients’ data to be mishandled. Does the Bill mean that our health data could be reused by Palantir for what it calls research purposes?
Before the hon. Lady moves on from Palantir, I think the House should know that it is an organisation with its origins in the American security state—the National Security Agency and the Central Intelligence Agency—and I cannot understand for the life of me why we are willing to commit the data of our citizens to an organisation like that.
That is exactly what is at the heart of this matter—the data that drives that addictiveness and commercialises our children’s attention is not the way forward.
Many amazing organisations have gathered evidence in this area, and it is abundantly clear that the overuse of children’s data increases their risk of harm. It powers toxic algorithms that trap children in cycles of harmful content, recommender systems that connect them with predators, and discriminatory AI systems that are used to make decisions about them that carry lifelong consequences. Health Professionals for Safer Screens—a coalition of child psychiatrists, paediatricians and GPs—is pleading for immediate legislative action.
This is not a partisan issue. So many of us adults can relate to the feeling of being drawn into endless scrolling on our devices—I will not look around the Chamber too much. Imagine how much more difficult it is for developing minds. This is a cross-party problem, and it should not be political, but we need action now.
Let me be absolutely clear: this change is not about restricting young people’s digital access or opposing technology and innovation; it is about requiring platforms to design their services with children’s safety as the default, not as an afterthought. For years we have watched as our children’s wellbeing has been compromised by big tech companies and their profits. Our call for action is supported by the National Society for the Prevention of Cruelty to Children, 5rights, Healthcare Professionals for Safer Screens, Girlguiding, Mumsnet and the Online Safety Act network. This is our chance to protect our children. The time to act is not 18 months down the line, as the Conservatives suggest, but now. I urge Members to support new clause 1 and take the crucial steps towards creating a digital world where children can truly thrive.
To protect our children, I have also tabled amendment 45 to clause 80, which seeks to ensure that automated decision-making systems cannot be used to make impactful decisions about children without robust safeguards. The Bill must place a child’s best interests at the heart of any such system, especially where education or healthcare are concerned.
We must protect the foundational rights of our creators in this new technological landscape, which is why I have tabled new clause 2. The UK’s creative industries contribute £126 billion annually to our economy and employ more than 2.3 million people—they are vital to our economy and our cultural identity. These are the artists, musicians, writers and creators who inspire us, define us and proudly carry British creativity on to the global stage. Yet today, creative professionals across the UK watch with mounting alarm as AI models trained on their life’s work generate imitations without permission, payment or even acknowledgment.
New clause 2 would ensure that operators of web crawlers and AI models comply with existing UK copyright law, regardless of where they are based. This is not about stifling innovation; it is about ensuring that innovation respects established rights and is good for everyone. Currently, AI companies are scraping creative works at an industrial scale. A single AI model may be trained on thousands of copyrighted works without permission or compensation.
The UK company Polaron is a fantastic example, creating AI technology to help engineers to characterise materials, quantify microstructural variation and optimise microstructural designs faster than ever before. Why do I bring up Polaron? It is training an AI model, built from scratch, without using copyrighted material.
I am emphatically on the hon. Lady’s side in her intent to protect British creativity, but how does she respond to the implicit threat from artificial intelligence providers, in response to this and other elements of the Bill, effectively to deny AI to the UK if they find the regulations too difficult to deal with?
We have a thriving innovation sector in the UK, so those companies are not going anywhere—they want to work with the UK. We already have a fantastic creative industry, and we have innovation and business coming in. There are many ways to incentivise that. I talk a lot about money, skills and infrastructure—that is what these innovative companies are looking for. We can make sure the guardrails are right so that it works for everyone.
By ensuring that operators of web crawlers and AI models comply with existing UK copyright law, we are simply upholding established rights in a new technological context. The UK led the world in establishing trustworthy financial and legal services, creating one of the largest economies by taking a long-term view, and we can do the same with technology. By supporting new clause 2, we could establish the UK as a base for trustworthy technology while protecting our creative industries.
Finally, I will touch on new clause 4, which would address the critical gap in our approach to AI regulation: the lack of transparency regarding training data. Right now, creators have no way of knowing if their work has been used to train AI models. Transparency is the foundation of trust. Without it, we risk not only exploiting creators, but undermining public confidence in these powerful new technologies. The principle is simple: if an AI system is trained using someone’s creative work, they deserve to know about it and to have a say in how it is used. That is not just fair to creators, but essential for building an AI ecosystem that the public trust. By supporting new clause 4, we would ensure that the development of AI happens in the open, allowing for proper compensation, attribution and accountability. That is how we will build responsible AI that serves everyone, not just the tech companies.
On the point of transparency, I will touch briefly on a couple of other amendments. We must go further on transparency in algorithmic decision making. That is why I have tabled amendment 46, which would ensure that individuals receive personalised explanations in plain language when an automated decision system affects them. We cannot allow generic justifications to stand in for accountability.
I thank the Minister for that reassurance. I did take part in a Westminster Hall debate on this matter a couple of weeks ago, but one of his colleagues was responding. I made the same point then. Quite often in the media or more generally, AI seems to be pitted against our creative industries, which should not be the case, because we know that our creative industries embrace technology more than virtually any other sector. They want to use AI responsibly. They do not want to be replaced by it. The question before us is how lawmakers can ensure that AI is used ethically without this large-scale theft of IP. We are today discussing amendments that go somewhere towards providing an answer to that question.
On this issue of Luddites, surely one of the problems for English language creators is that what they create is of more value because of the reach of the English language over others. Therefore, they are more likely to have their product scraped and have more damage done to them.
My right hon. Friend makes a very good observation, but the fact is that so much content has already been scraped. Crawlers are all over the intellectual property of so many of our creators, writers and publishers—so much so that we are almost in a position where we are shutting the gate after the horse has bolted. Nevertheless, we need to do what we can legislatively to get to a better place on this issue.
New clause 2 would simply require anyone operating web crawlers for training and developing AI models to comply with copyright law. It is self-evident and incontrovertible that AI developers looking to deploy their systems in the UK should comply with UK law, but they often claim that copyright is not very clear. I would argue that it is perfectly clear; it is just that sometimes they do not like it. It is a failure to abide by the law that is creating lawsuits around the world. The new clause would require all those marketing their AI models in the UK to abide by our gold-standard copyright regime, which is the basis that underpins our thriving creative industries.
New clause 3 would require web crawler operators and AI developers to disclose who is operating crawlers, what they are being used for, why and when. It would also require them to use different crawlers for different purposes and to ensure that rights holders are not punished for blocking them. A joint hearing of the Culture, Media and Sport Committee and the Science, Innovation and Technology Committee heard how publishers are being targeted by thousands of web crawlers with the intention of scraping content to sell to AI developers. We heard that many, if not most, web crawlers are not abiding by current opt-out protocols—robots.txt, for example. To put it another way, some developers of large language models are buying data scraped by third-party tech companies, in contravention of robots.txt protocols, to evade accusations of foul play. All this does is undermine existing licensing and divert revenues that should be returning to our creative industries and news media sector. New clause 3 would provide transparency over who is scraping copyrighted works and give creators the ability to assert and enforce their rights.
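For readers unfamiliar with the robots.txt protocol mentioned above: it is a voluntary convention in which a site publishes a plain-text file stating which user agents may fetch which paths, and a well-behaved crawler checks that file before scraping. Below is a minimal sketch of that check, using Python's standard urllib.robotparser; the crawler name and site URL are illustrative assumptions, not real services. Nothing technically forces a scraper to perform this check, which is precisely the gap the transparency amendments are aimed at.

```python
# A minimal sketch of the robots.txt opt-out check: before fetching a page,
# a well-behaved crawler consults the site's robots.txt and respects any
# disallow rules aimed at its user agent. The bot name and site are
# illustrative assumptions only.
from urllib import robotparser

AGENT = "ExampleAITrainingBot"       # hypothetical crawler name
SITE = "https://publisher.example"   # hypothetical rights holder's site

parser = robotparser.RobotFileParser()
parser.set_url(f"{SITE}/robots.txt")
parser.read()                        # fetch and parse the site's robots.txt

url = f"{SITE}/articles/latest"
if parser.can_fetch(AGENT, url):
    print(f"{AGENT} may fetch {url}")
else:
    print(f"{AGENT} is disallowed from {url} and should not scrape it")
```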
New clause 4 would require AI developers to be transparent about what data is going into their AI models. Transparency is fundamental to this debate. It is what we should all be focusing on. We are already behind the drag curve on this. California has introduced transparency requirements, and no one can say that the developers are fleeing Silicon Valley just yet.
New clause 20, tabled by the official Opposition, also addresses transparency. It would protect the AI sector from legal action by enabling both sides to come to the table and get a fair deal. A core part of this new clause is the requirement on the Secretary of State to commit to a plan to help support creators where their copyright has been used in AI by requiring a degree of transparency.
New clause 5 would provide the means by which we could enforce the rules. It would give the Information Commissioner the power to investigate, assess and sanction bad actors. It would also entitle rights holders to recover damages for any losses suffered, and to injunctive relief. Part of the reason why rights holders are so concerned is that the vast majority of creators do not have deep enough pockets to take on AI developers. How can they take on billion-dollar big tech companies when those companies have the best lawyers that money can buy, who can bog cases down in litigation and red tape? Rights holders need a way of enforcing their rights that is accessible, practical and fair.
In their AI and copyright consultation, the Government say that they want to ensure
“a clear legal basis for AI training with copyright material”.
That is what the new clauses that I have spoken to would deliver. Together they refute the tech sector’s claims of legal uncertainty, while providing transparency and enforcement capabilities for creators.
Ultimately, transparency is the main barrier to greater collaboration between AI developers and creators. Notwithstanding some of the unambitious Government amendments, the Opposition’s amendments would provide the long-overdue redress to protect our creative industries by requiring transparency and a widening of the scope of those who are subject to copyright laws.
The amendments would protect our professional creators and journalists, preserve the pipeline of young people looking to make a career in these sectors themselves, and cement the UK as a genuine creative industries superpower, maintaining our advantage in the field of monetising intellectual property. One day we may make a commercial advantage out of the fact that we are the place where ethical AI companies can set up—we could be the envy of the world.
My right hon. Friend makes a formidably important point. The amendment highlights one of the extraordinary weaknesses of the Bill, which is that it in effect reverses GDPR on a large number of citizen protections. To reiterate the point he gently made, that enormous fine will not stop TikTok, because it operates under legal compulsion. Even though it paid £450 million, it will continue to commit the criminal offence for which it has just been convicted.
I agree with my right hon. Friend: that is the peculiarity. The Minister knows only too well about the nature of what goes on in countries such as China. Chinese companies are frankly scared stiff of cutting across what their Government tell them they have to do, because what happens is quite brutal.
We have to figure out how we protect data from ill use by bad regimes. I use China as an example because it is simply the most powerful of those bad regimes, but many others do not observe data protection in the way that we would assume under contract law. For example, BGI’s harnessing of the data it has gleaned from covid tests, and its dominance in the pregnancy test market, is staggering. It has been officially allowed to take 15% of the data, but it has taken considerably more, and that is just one area.
Genomics is a huge and vital area right now, because it will dominate everything in our lives, and it populates AI with an ability to describe and recreate the whole essence of individuals, so this is not a casual or small matter. We talk about AI being used in the creative industries—I have a vested interest, because my son is in the creative industries and would support what has been said by many others about protecting them—but this area goes a quantum leap beyond that. In future, we may not even know who we are really talking to, the nature of who they are, or what their vital statistics are.
This amendment is not about one country; it is about providing a yardstick against which all third countries should be measured. If we are to maintain the UK’s standing as a nation that upholds privacy, the rule of law, democracy and accountability, we must not allow data to be transferred to regimes that fundamentally do not share those values. It is high time that we did this, and I am glad to see the Minister nodding. I hope therefore that he might look again at the amendment. Out of old involvement in an organisation that he knows I am still part of, he might think to himself that maybe this is worth doing or finding some way through.
I thank the hon. Member for making that important point, and of course she is right.
I go back to this question of the threats to the database, which are not simply the product of my imagination; they are real. First, all data can be monetised, but this database is so large that huge commercial interests are now trying to get access to that health data. I do not want to cause offence to any hon. Members, all of whom I know follow the rules, but it is interesting that nearly £3 million from the private health sector was made available to over 150 different Members of Parliament. I do not suggest that any Member has done anything inappropriate—that would be wrong of me—but one wonders how almost £3 million was found by a private sector that has no commercial interest in pursuing those investments.
Secondly, on commercial interests, will the Minister confirm that at no stage will any data or any other aspect of the NHS be up for sale as part of negotiations with the United States on a trade deal? Will the Government provide some guidance on that? If the House reflects on private sector interests—which are not necessarily in the best interests of humanity—and how they make money, there is an interesting thought about health insurance. A party represented in the House is led by an individual who has suggested that we should end the way that we fund the NHS and replace it with an insurance system. If the insurance industry got access to the data held on all of us by the NHS, it would be able to see the genome of each person or of groups of people, and provide differential rates of insurance according to people’s genetic make-up. That is a serious threat. I do not think the party that has recently entered the House has thought that through, but companies providing insurance could commercialise that data. That is one reason we must never follow the track towards an insurance-based system to replace the NHS.
Yesterday, the Secretary of State for Health and Social Care told the House that we will not be privatising the NHS, and I welcome that statement. Reference has already been made to Palantir—the right hon. Member for Goole and Pocklington (David Davis) mentioned it earlier—and the contract that we inherited from the previous Government. It is extraordinary that Palantir, a company that has deep roots in the United States defence establishment, should be handling the data of millions of people, when its chair has said that he is completely opposed to the central principle of the NHS and that he effectively wants a private health system in the UK. How could a £500 million contract to handle our personal data have been handed over to such a company, led by a person whose purpose seems to be to destroy the principles of our NHS? How our data is handled should be our decision, in the United Kingdom.
The Information Commissioner says that it is important that this precious and vital data, which is personal to each of us, should be protected against any possibility of cyber-attacks. However, there has already been a cyber-attack. Qilin—the way I am pronouncing it makes it sound as if someone is trying to commit murder, but there may be another way of saying it—is a Russian cyber-criminal group that obtained access to 400 GB of private information held by a company dealing with pathology testing. That is an enormous amount of data. Qilin attempted to extort a ransom from the company that held the data. I do not know whether enough provision is made in the Bill for the protection of our data, so I suggest that there should be a new public interest test, with a report to Parliament within six months, which we can all debate and then examine whether the legislation has gone far enough.
Finally, the Information Commissioner says three things. First, the database must retain public confidence. Media discussions and opinion polling show that people are losing confidence that their personal data is secure, and I understand why that should be the case. Secondly, the database should be properly protected, with proper safeguards against cyber-attacks built in from the beginning. Thirdly, and perhaps most importantly, the Bill refers to an effective exemption for scientific research. As my hon. Friend the Member for Newcastle upon Tyne Central and West (Chi Onwurah) said, private companies, and perhaps US companies, might use the idea of promoting scientific research as a fig leaf to hide their search for profit from the precious commodity—data—that we have because we created our NHS. That is a very dangerous thought, and the Information Commissioner says he is not convinced that the definition of scientific research in the Bill is sufficiently strong to protect us from predatory activity by other state actors or private companies.
The hon. Gentleman is making an excellent speech and some very perceptive points. I remind him that previous attempts by the NHS to create a single data standard have all failed, because the GPs did not believe that the security levels were sufficient. It is not just the Information Commissioner; the GPs refused to co-operate, which highlights the powerful point that the hon. Gentleman is making.
I am grateful to the right hon. Gentleman for making that very serious point. When the clinicians—whose duty is to protect their patients—say they are not convinced about the safety of data being handed over to a central database, we have to listen to their reactions.
I do not intend to press my new clause to the vote, but it is important that we continue to debate this matter, because this enormous database—which can contribute to the general welfare of all humanity—must be protected in such a way that it retains confidence and ensures the security of the whole system. With that, I leave the discussion to continue on other matters.
Commons Chamber
I am extremely grateful to my hon. Friend for raising one of the most serious issues of our time. The Online Safety Act 2023 requires providers, as part of their risk assessment, to consider specifically how algorithms will impact a user’s exposure to illegal content and children’s exposure to harmful content. I have introduced new measures to ensure that children are kept safe, and today I issued a statement of strategic priorities to Ofcom to insist that it continues to do so in future.
The Government are working closely with individual universities, the university sector and our intelligence community to ensure that our research is not only world class but safe and secure.
Commons Chamber
The right hon. Gentleman is entirely correct. Whether it involves a particularly right-wing cause or antisemitism—or, indeed, dieting content that drags people into something more radical in relation to eating disorders—the bubble mentality created by these algorithms massively increases the risk of radicalisation, and we therefore have an increased duty to protect people.
As I have said, I am pleased to see the positive changes that have been made as a result of Opposition pressure and the uncompromising efforts of those in the House of Lords, especially Baroness Kidron, who has been nothing short of tenacious. Throughout the time in which we have been discussing the Bill, I have spoken to Members of both Houses about it, and it has been very unusual to come across anyone who knows what they are talking about, and, in particular, has the incredible depth of knowledge, understanding and wisdom shown by Baroness Kidron. I was able to speak to her as someone who practically grew up on the internet—we had it at home when I was eight—but she knew far more about it than I did. I am extremely pleased that the Government have worked with her to improve the Bill, and have accepted that she has a huge breadth of knowledge. She managed to do what we did not quite manage to do in this House, although hopefully we laid the foundations.
I want to refer to a number of points that were mentioned by the Minister and are also mentioned in the letters that the Government provided relating to the Lords amendments. Algorithmic scrutiny is incredibly important, and I, along with other Members, have raised it a number of times—again, in connection with concern about radicalisation. Some organisations have been doing better things recently. For instance, someone who searches for something may begin to go down a rabbit hole. Some companies are now putting up a flag, such as a video, suggesting that users are going down a dark hole and should look at something a bit lighter, and directing them away from the autoplaying of more radical content. If all organisations, or at least a significant number—particularly those with high traffic—can be encouraged to take such action rather than allowing people to be driven to more extreme content, that will be a positive step.
I was pleased to hear about the upcoming researcher access report, and about the report on app stores. I asked a previous Minister about app stores a year or so ago, and the Minister said that they were not included, and that was the end of it. Given the risk that is posed by app stores, the fact that they were not categorised as user-to-user content concerned me greatly. Someone who wants to put something on the Apple App Store has to jump through Apple’s hoops. The content is not owned by the app store, and the same applies to some of the material on the PlayStation store. It is owned by the person who created the content, and it is therefore user-to-user content. In some cases, it is created by one individual. There is no ongoing review of that. Age rating is another issue: app stores choose whatever age rating they happen to decide is appropriate. Some of the dating apps, such as match.com, have been active in that regard, and have made it clear that their platforms are not for under-16s or under-18s, while the app store has rated the content for a younger age than the platforms themselves allow.
On the subject of age rating, I am pleased to see more in the Bill about age assurance and the frameworks. I am particularly pleased to see what is going to happen in relation to trying to stop children being able to access pornography. That is incredibly important but it had been missing from the Bill. I understand that Baroness Floella Benjamin has done a huge amount of work on pushing this forward and ensuring that parliamentarians are briefed on it, and I thank her for the work that she has done. Human trafficking has also been included. Again, that was something that we pushed for, and I am glad to see that it has been put on the face of the Bill.
I want to talk briefly about the review mechanisms, then I will go on to talk about end-to-end encryption. I am still concerned that the review mechanisms are not strong enough. We have pushed to have a parliamentary Committee convened, for example, to review this legislation. This is the fastest-moving area of life. Things are changing so dramatically. How many people in here had even heard of ChatGPT a year and a half ago? How many people had used a virtual reality headset? How many people had accessed Rec Room or any of the other VR systems? I understand that the Government have genuinely tried their best to make the Bill as future-proof as possible, but we have no parliamentary scrutiny mechanisms written in. I am not trying to undermine the work of the Committee on this—I think it is incredibly important—but Select Committees are busy and they have no legislative power in this regard. If the Government had written in a review, that would have been incredibly helpful.
The hon. Lady is making a very good speech. When I first came to this House, which was rather a long time ago now, there was a Companies Act every year, because company law was changing at the time, as was the nature of post-war capitalism. It seems to me that there is a strong argument for an annual Act on the handling and management of the internet. What she is saying is exactly right, and that is probably where we will end up.
I completely support the right hon. Member’s point—I would love to see this happening on an annual basis. I am sure that the Ministers who have shepherded the Bill through would be terrified of that, and that the Government team sitting over there are probably quaking in their boots at the suggestion, but given how fast this moves, I think that this would be incredibly important.
The Government’s record on post-implementation reviews of legislation is pretty shoddy. If you ask Government Departments what percentage of legislation they have put through a post-implementation review in the timeline they were supposed to, they will say that it is very small. Some Departments are a bit better than others, but given the number of reshuffles there have been, some do not even know which pieces of legislation they are supposed to be post-implementation reviewing. I am concerned that this legislation will get lost, and that there is no legislative back-up to any of the mechanisms for reviewing it. The Minister has said that it will be kept under review, but can we have some sort of governmental commitment that an actual review will take place, and that legislation will be forthcoming if necessary, to ensure that the implementation of this Bill is carried out as intended? We are not necessarily asking the Government to change it; we are just asking them to cover all the things that they intend it to cover.
On end-to-end encryption, on child sexual exploitation and abuse materials, and on the last resort provider—I have been consistent with every Minister I have spoken to across the Dispatch Box and every time I have spoken to hon. Members about this—when there is any use of child sexual exploitation material or child sexual abuse material, we should be able to require the provider to find it. That absolutely trumps privacy. The largest increase in child sexual abuse material is in self-generated content. That is horrific. We are seeing a massive increase in that number. We need providers to be able to search—using the hashes with which they can categorise images, or however they want to do it—for people who are sharing this material in order to allow the authorities to arrest them and put them behind bars so that they cannot cause any more harm to children. That is more important than any privacy concerns. Although Ministers have not put it in the Bill until this point, they have, to their credit, been clear that that is more important than any privacy concerns, and that protecting children trumps those concerns when it comes to abuse materials and exploitation. I am glad to see that that is now written into the Bill; it is important that it was not just stated at the Dispatch Box, even though it was mentioned by a number of Members.
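The hash matching referred to here works by comparing a digest (fingerprint) of an uploaded file against a list of digests of material that has already been identified and flagged, so the flagged material itself never has to be redistributed in order to detect copies of it. Below is a minimal sketch of that idea using a plain cryptographic hash; real systems typically use perceptual hashes so that resized or re-encoded copies still match, and the digest list and file paths here are purely illustrative assumptions.

```python
# Minimal sketch of hash-list matching: compute a digest of an uploaded file
# and check it against a set of digests of previously identified, flagged
# material. SHA-256 only matches byte-identical files; production systems use
# perceptual hashing so that edited or re-encoded copies are still detected.
import hashlib
from pathlib import Path

# Hypothetical digest list supplied by an authority or industry body.
KNOWN_FLAGGED_DIGESTS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def sha256_of(path: Path) -> str:
    """Return the hex SHA-256 digest of a file, read in 64 KB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_flagged(path: Path) -> bool:
    """True if the file's digest appears in the known-digest list."""
    return sha256_of(path) in KNOWN_FLAGGED_DIGESTS

# Illustrative usage:
# if is_flagged(Path("incoming_upload.bin")):
#     ...  # escalate to the provider's reporting process
```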
Commons Chamber
My right hon. Friend’s words, “at least among like-minded countries”, triggered a thought. If we do not include China—in lots of other areas we exclude it for moral and ethical reasons—it will be a futile exercise. As far as I can tell, China wants to be involved. What is his view on involving countries such as China?
My view is that it should be a global initiative. At the very least, the strong security aspects will bring like-minded nations together. We should advance that; we may put protections in place with other linked nations. I completely agree with my right hon. Friend that we should look to establish a global consensus. There is sometimes pessimism about whether it is possible to regulate genies that have come out of the bottle, but if we think of available technologies such as human cloning, there is not a country in the world—as far as I am aware—that has not recognised it as ethically wrong and acted against it. In fact, I think there is a person in China in jail at the moment for having attempted that.
I will draw my remarks to a close, having set out the rich range of challenges that stand before Governments around the world and our regulators. They are not easy things to get right, but it is of profound importance that we think carefully and put in place the best possible governance system to maximise the benefits and see off the harms that may result. For the Minister and his colleagues across Whitehall, it means a busy summer preparing for the summit, but I wish them well, and I look forward to the House taking a great interest in, and participating in, the planned summit.
Commons Chamber
I support the Budget. More importantly, the markets seem to support it as well. Stability and balance are the hallmarks of what the Chancellor has achieved, and I congratulate him on that.
If my right hon. Friend will forgive me, so many other people want to speak that it would be unfair if I took interventions.
With six minutes, and with a Budget containing so many measures, it is difficult to know what to speak about, but I want to speak briefly about children, the environment and booze—not necessarily at the same time. I very much welcome the Secretary of State’s opening remarks and her concentration on the importance of AI. Even though some of us may not fully understand all of its implications, it is absolutely where we need to grow our economy.
The £20 billion of investment in carbon capture is huge, and it is a vital component of our target to get to net zero. We cannot get everything not to release carbon, but we can have ways of mitigating emissions to bring us to our net zero target—hopefully sooner than 2050. It is slightly churlish of the hon. Member for Glasgow North West (Carol Monaghan), who spoke for the SNP, to say that if something is not in Scotland it does not really count. Climate change is no respecter of any border, let alone that between England and Scotland.
I absolutely welcome the Budget’s huge implications for investment in R&D, which is really important. I also absolutely welcome the freezing of fuel duty for the 13th year in a row, which will mean £200 to the average driver.
There are lots of little things in the Budget that will have a big impact, such as the help for swimming pools and leisure centres, which were hit badly during the pandemic and have now been hit by energy costs. That will be a lifeline and it will help the health of our constituents. The measure on energy prepayment meters was long overdue; it was absurd and immoral that those least able to pay should be penalised and pay that much more for using prepayment meters. Thirty million pounds has been allocated for additional veterans’ services, and there is £10 million to help with suicide prevention—a hidden illness that has a huge impact on many of our constituents and their families.
If I may talk briefly about children, I remain concerned—as I would, being a former children’s Minister—that all the emphasis has been on adult social care and not enough has been on children’s social care, where it is estimated there is still a shortfall of some £1.6 billion. We need to do something about that, because over 80% of our interventions on children in the care system and those coming into the care system are late interventions rather than preventive early interventions, which is a big change from what went on some years before.
We need to invest in our social worker workforce. This afternoon, I have been hosting the Social Worker of the Year awards, and some of the most remarkable social workers from around the country have been to Parliament to receive their awards. They are the fourth emergency service and we need much better workforce planning, as we do in the NHS, to make sure that we not only recruit more social workers, but keep them. It is a false economy not to be doing that.
I welcome the many good measures on children, particularly on children in care, but will the Chancellor consider what we can do to provide free bus travel for all care leavers aged between 18 and 25, for whom the cost of a bus fare to get to work or education is prohibitive? Will he also consider a national programme to allow care leavers to access a rent deposit as part of their benefits, since they find it harder than many to access accommodation?
On childcare, which was one of the most significant parts of the Budget, I absolutely support the measures that were announced. As Coram Family and Childcare puts it,
“the introduction of 30 hours childcare for children from 9 months old to three years old…will make a huge difference for families currently struggling with high costs”.
I welcome that, but there are question marks around sufficiency and shortages in the childcare available; currently only half of local authorities have sufficient childcare for children aged under two and less than half have enough childcare for parents working full time. With these generous measures on childcare, there is more we need to do to make sure that people with the appropriate skills are there to provide it.
I welcome the wraparound childcare available through schools from 8 am to 6 pm, which will make a real difference to parents’ ability to go to jobs and make a meaningful contribution. However, there is a problem in that only 25% of local authorities have enough after-school childcare for children aged five to 11 and the figure is even lower for those aged 12 to 14. Again, there are serious question marks about capacity, which I am sure the Chancellor will answer.
There is more I could say about children but, turning to the environment, insulating homes reduces energy waste and keeps people warmer, while lowering bills permanently. We need further public investment in insulating fuel-poor homes, and we need to create new tax incentives for owner-occupiers to do more to improve the energy efficiency of their homes—as is the case in other European countries, where it is reflected in council tax banding and other up-front fees.
Finally, on beer, the Chancellor’s measures to ensure that tax on draught beer sold in pubs does not increase are great and will save the sector around £70 million a year. However, the British Beer and Pub Association, which is already seeing its members hit by an energy crisis and the weight of debt built up over recent years, says that there is a 10% increase in the duty on non-draught beers—60% of all beer sales. Can we aim for a level playing field for our beer and pub industry, which has been particularly hit during the energy crisis and the pandemic? What is in the Budget is really good, but we could do a little bit more.
I draw the attention of the House to my entry in the Register of Members’ Financial Interests.
Like two of the previous speakers, I am also a science graduate, although I do not compare myself with the Conservative party’s most famous science graduate. I had intended to make my speech essentially about science and technology, because they are massively important and, as the hon. Member for Manchester Central (Lucy Powell) pointed out, we have fantastic competitive advantages in those fields. That will be a major part of growth.
Since last Tuesday, however, dramatic events have unfolded in the banking sector—particularly over the weekend. Back in 2009-10, the then Chair of the Treasury Committee, Lord McFall, asked me to chair the Future of Banking Commission. The last week has, unfortunately, brought back some memories. One of the characteristic problems of the banking sector is its short memory, particularly when it is Wall Street that we are talking about. I hope that the House will indulge me if I remind it of the lessons of the major banking crashes of the past half century.
Back in 1933, after the great depression, the Americans passed the Glass-Steagall Act, which separated banks out into risky investment banks and straightforward commercial banks. That gave us about seven decades of stability until 1999, when President Clinton—under pressure from unwise and greedy Wall Street lobbyists—essentially removed Glass-Steagall. What followed was the collapse of several banks, including Lehman Brothers—probably precipitated by the new mark-to-market rules—in the great crisis that we saw in 2008.
In 2009, because of the crash, America passed Dodd-Frank, which required banks with more than $50 billion in assets to be subject to tight regulation. Again, under pressure from Wall Street, President Trump relaxed those regulations in 2019. I talk about Wall Street, but the whole world followed. Of course, after that relaxation, banks assumed that they had an infinite period of low interest rates and that they could borrow ad nauseam. When global interest rates sharply increased by three, four or five times, the shock destabilised a number of those banks. One such bank was Silicon Valley Bank, which had been taken out of regulation by the Trump changes.
There is a lesson for us in all that. It has caused instability in the financial system. Chancellors, central bank governors, financial secretaries in the States and regulators have no choice but to claim that the system is robust. I am not so sure. We will not know for a while whether it is actually robust, because of the complexity of the system. Of the three major banks that have failed so far, each has failed for different reasons, and we have no clear insight into what risks other banks have taken, partly because of the deregulation under Trump and his predecessors. In that respect, we in this country are probably in a better place than either the Americans or the Europeans, but I am keeping my fingers crossed as I say that so as not to tempt fate.
There is one lesson that we should learn. A big issue on which the world is hanging at the moment is whether the takeover of Credit Suisse by UBS is a success. I draw people’s minds back to Lloyds taking over HBOS, which was done under pressure from the Government of the day—from Gordon Brown—and Lloyds itself nearly collapsing the very next year. I hope that UBS will not do the same. The point of this story is that we are in a period of extraordinary global financial instability.
I am a low-tax Tory—I would have loved the Chancellor to have had a lower-tax strategy—but I have to say that the events of the past week have demonstrated that a very small-c conservative strategy is wise under these circumstances. The more confident the markets in the Government, the better our prospects for the future. That said, I would be completely unsurprised if we had to have another Budget in the autumn owing to the nature of the transitions and changes that are now happening.
If that happens, I would ask the Chancellor, “Could you please look again at bringing back your super-deduction?” That will attract investment here in a way that will not happen with the 25% rate. I would ask, “Will you look at doing away with IR35 and at other concerns that will improve prospects for small businesses?” In my view, it will be incredibly difficult for the banks to get right the balance between inflation and growth now that their hands are tied by the instability of the banking sector. My one line to the Chancellor is this: please look, for the next Budget, at much more growth.