(7 months, 4 weeks ago)
Commons Chamber
I thank my right hon. and learned Friend for his intervention and his earlier engagement, when he made his position on that point clear. He is right to say that penalties can be significant—up to 10% of global turnover—so it is fair that we allow organisations to challenge penalties on the merits of the case, but maintain the ability to impose pro-competition interventions and conduct requirements on platforms. The amendments made in the other place risk undermining that careful balance. For example, amendments to revert the appeals standard for fines to judicial review principles, to which my right hon. and learned Friend the Member for South Swindon (Sir Robert Buckland) referred, would remove a valuable safeguard on the significant new powers that the Bill gives the CMA, as would the removal of the requirement on the CMA to act proportionately. Meanwhile, amendments to the countervailing benefits exemption risk making the exemption less clear for stakeholders. Consequently, the Government have tabled a motion to disagree with those amendments.
The point about a “proportionate” response is relevant. In the original drafting of the Bill, the word used was “appropriate.” The Government changed that to “proportionate” on Report in this House, and the Lords have sought to reverse that change. What does the Minister think was disproportionate, if you like, about the word “appropriate”? What about it struck the wrong balance? Ministers keep saying that they think things strike the right balance, but they never really explain why.
We have engaged significantly, throughout the Bill’s passage and before it was introduced, with large tech and challenger tech. Our understanding is that all those cohorts are happy with where the Bill is today. Certainly, during that engagement, concerns were raised about the term “appropriate,” but the clear position that we expressed to those who raised that concern was, “Of course, there is a requirement on the CMA to act proportionately.” Putting that in the Bill does not undermine its basic principles. In fact, we understand from the case law of the European Court of Human Rights, and the property rights emanating from the Convention, that all those things are baked in anyway, so we do not feel that the wording weakens the legislation at all, but it does strike the right balance between those two different cohorts.
(1 year, 7 months ago)
Commons Chamber
My hon. Friend got through part 1 a bit quicker than I thought he would—I have a question relating to part 1. Clause 38 creates a final offer mechanism for dispute resolution. The news media industry has been waiting for this legislation for a long time, but it is not expressly referenced in the Bill. Can he confirm that the news industry and other industries could benefit from this final offer mechanism?
My hon. Friend makes a good point. I wish him the best of luck in the election this afternoon; it is for a very important Committee that will scrutinise this legislation. The final offer mechanism is innovative and represents a positive way forward, in that it will bring the parties to the table and both will have to make sensible offers reflecting how they see a fair resolution. That will discourage them from putting unrealistic claims on the table, and it could well help the news industry and many other sectors.
(1 year, 8 months ago)
Westminster Hall
I have not actually posed that question, but perhaps I could later.
This is an important debate, and it is important that we look at the issue strategically. The Government and the Labour party probably have different approaches: the Labour party’s natural position on this kind of stuff is to regulate everything as much as possible, whereas we believe that free markets have had a tremendous effect on people’s lives right across the planet. Whether we look at education, tackling poverty or child mortality, many of the benefits in our society over the last 100 years have been delivered through the free market.
Our natural inclination is to support innovation but to be careful about its introduction and to look to mitigate any of its damaging effects, and that is what is set out in the national AI strategy. As we have seen, AI has the potential to become one of the most significant innovations in history—a technology like the steam engine, electricity or the internet. Indeed, my hon. Friend the Member for Folkestone and Hythe (Damian Collins) said exactly that: this is like a new industrial revolution, and I think it is a very exciting opportunity for the future. However, we also have key concerns, which have been highlighted by hon. Members today. Although the Government believe in the growth potential of these technologies, we also want to be clear that growth cannot come at the expense of the rights and protections of working people.
Only now, as the technology rapidly improves, are most of us beginning to understand the transformative potential of AI. However, the technology is already delivering fantastic social and economic benefits for real people. The UK’s tech sector is home to a third of Europe’s AI companies, and the UK AI sector is worth more than £15.6 billion. The UK is third in the world for AI investment, behind the US and China, and attracts twice as much venture capital investment as France and Germany combined. As impressive as they are, those statistics should be put into the context of the sector’s growth potential. Recent research predicts that the use of AI by UK businesses will more than double in the next 20 years, with more than 1.3 million UK businesses using AI by 2040.
The Government have been supporting the ethical adoption of AI technologies, with more than £2.5 billion of investment since 2015. We recently announced £100 million for the Foundation Models Taskforce to help build and adopt the next generation of safe AI, £110 million for our AI tech missions fund and £900 million to establish new supercomputer capabilities. These exascale computers were mentioned in the Budget by my right hon. Friend the Chancellor. These developments have incredible potential to bring forward new forms of clean energy, and indeed new materials that can deliver that clean energy, and to accelerate things such as medical treatment. There are exciting opportunities ahead.
If we want to become an AI superpower, it is crucial that we do all we can to create the right environment to harness the benefits of AI and remain at the forefront of technological developments. Our approach, laid out in the AI White Paper, is designed to be flexible. We are ensuring that we have a proportionate, pro-innovation regulatory regime for AI in the UK, which will build on the existing expertise of our world-leading sectoral regulators.
Our regulatory regime will function by articulating five key principles, which are central to this debate and tackle many of the points that have been made by hon. Members across the Chamber. Regulators should follow these five principles when regulating AI in their sectors: safety, security and robustness; transparency and explainability; fairness; accountability and governance; and contestability and redress. That feeds into the important points made by my hon. Friend the Member for Watford (Dean Russell), who held this ministerial position immediately before me, about deception, scams and fraud. We can all see the potential for that, of course.
Clearly, right across the piece, we have regulators with responsibility in those five areas. Those regulators are there to regulate bona fide companies, which should do the right thing, although we have to make sure that they do. For instance, if somebody held a database with inappropriate data on it, the Information Commissioner’s Office could easily look at that, and it has significant financial penalties at its disposal, such as 4% of global turnover or a £17 million fine. My hon. Friend the Member for Watford made a plea for a Turing clause, which I am, of course, very happy to look at. I think he was referring to organisations that might not be bona fide, and might actually be looking to undertake nefarious activities in this area. I do not think we can regulate those people very effectively, because they are not going to comply with anybody’s regulations. The only way to deal with those people is to find them, catch them, prosecute them and lock them up.
The Minister talks about safety, but does he agree that that has to be safety by design, and not just having response mechanisms built into the system so that a victim can appeal? I know he has looked at fraud a lot in the past, and there is a presumption that all will be done to combat fraud at its known source, rather than just providing redress to victims.
That is absolutely right. We will not deal with everything in the world of AI in this respect, but there needs to be overarching responsibility for preventing fraud. That is something we have committed to bringing forward in another legislative vehicle—the Economic Crime and Corporate Transparency Bill, which is passing through Parliament now—but I agree with my hon. Friend that there should be a responsibility on organisations to prevent fraud and not simply deal with the after-effects.
Our proposed framework is aligned with and supplemented by a variety of tools for trustworthy AI, such as assurance techniques, voluntary guidance and technical standards. The Centre for Data Ethics and Innovation published its AI assurance road map in December 2021, and the AI Standards Hub—a world-leading collaboration led by the Alan Turing Institute with the National Physical Laboratory and the British Standards Institution—launched last October. The hub is intended to provide a co-ordinated contribution to standards development on issues such as transparency, security and uncertainty, with a view to helping organisations to demonstrate that AI is used safely and responsibly.
We are taking action to ensure that households, public services and businesses can trust this technology. Unless we build public trust, we will miss out on many of the benefits on offer. The reality is that AI, as with other general-purpose technologies, has the potential to be a net creator of jobs. I fully understand the points raised by the hon. Member for Birkenhead—of course, we do not want to see swathes of people put out of work because of this technology. I hasten to add that that has never been the case with other technologies. There have been many concerns over the ages about how new technologies will affect jobs, but they tend to create other jobs in different sectors. The World Economic Forum estimates that robotics, automation and artificial intelligence will displace 85 million jobs globally by 2025, but create 97 million new jobs in different sectors, which I will discuss in a second. I think the hon. Member for Birkenhead asked in his speech whether I would be willing to meet him to discuss these points; I am always very happy to do that, if we can convene at another time.
The hon. Member also raised the point about how AI in the workplace has the potential to liberate the workforce from monotonous tasks such as inputting data or scanning through documents for a single piece of information. I will address the bigger concerns he has around that, but in the public sector it would leave teachers with more time to teach, clinicians with more time to spend with patients and police officers with more time on the beat, rather than being behind a desk.
As my hon. Friend the Member for Folkestone and Hythe pointed out in a salient contribution, AI also has tremendous potential in defence and national security. That is absolutely critical. It was interesting that prominent figures in the world of technology, led by Elon Musk, recently wrote a letter asking for a six-month pause while we look at how to properly moderate the impacts of AI. I am not sure that that is a good idea, because I am not sure China and Russia would play that game. It is important that we stay ahead of the curve, for exactly the reasons pointed out by my hon. Friend.
The Minister is exactly right. That initiative also suggests that AI is not yet here but, actually, the issues we have discussed today exist already. We can look at them already; we do not need a six-month pause to do that.
That is absolutely right. There is an opportunity but also a potential threat. It is important that we continue to invest, and it is great that the UK is ahead of the game in its investment, behind only the US and China, which are obviously much bigger economies.
The key thing is that we take action on skills, skilling up our workforce in the UK to take advantage of the potential of AI. Clearly, a good computing education is at the heart of that. We have overhauled the outdated information and communications technology curriculum, replaced it with computing, and invested £84 million in the National Centre for Computing Education to inspire the next generation of computer scientists. Our national skills fund helps to do just that, with free level 3 qualifications for adults and skills bootcamps in digital courses, including coding, AI and cyber-security, available across England.
On that point, as well as the opportunities in AI, we need to look at the new opportunities in the new economy. Some jobs will be displaced, so we need to ensure that we are skilling up our workforce for other opportunities, be it in data science or in green jobs through the green jobs taskforce. Recently, in Hull, 3,000 new jobs were created in the wind turbine sector with a starting salary of £32,000, which illustrates the potential for green jobs in our economy. So although some jobs might be displaced, other, hopefully better-paid, jobs will replace them. We want a higher-wage, higher-skilled economy.
The Government are also supporting 16 centres for doctoral training, backed by an initial £100 million, delivering 1,000 PhDs. We expanded that programme with a further £117 million at the recent launch of the Government’s science and technology framework. Last year, we invested an additional £17 million in AI and data science postgraduate conversion courses and scholarships to increase the diversity of the tech workforce, on top of the £13 million that has been invested in the programme since 2019-20. We also invested £46 million to support the Turing AI fellowships to attract the best and brightest AI talent to work in the UK.
The point about protections for workers’ rights was raised by many Members in the debate, not least the hon. Members for Gordon (Richard Thomson) and for Birkenhead; the shadow Minister, the hon. Member for Ellesmere Port and Neston (Justin Madders); and my hon. Friends the Members for Folkestone and Hythe and for Watford. It is important to see the Government’s position on workers’ rights here. We are bolstering workers’ rights: we are raising the national living wage, with the highest increase on record—a near 10% increase—and we are supporting six private Members’ Bills that increase workers’ rights, including on flexible working and other issues. There is also the Employment (Allocation of Tips) Bill, which is the favourite Bill of my hon. Friend the Member for Watford, who was its sponsor prior to becoming the Minister.
On the concerns many raised about workplace monitoring, we are committed to protecting workers. A number of laws are already in place that apply to the use of AI and data-driven technology in the workplace, including in decision making, which was raised by the hon. Member for Ellesmere Port and Neston. The Equality Act 2010 already requires employers and service providers not to discriminate against employees, job applicants and customers. That includes discrimination through actions taken as a result of an algorithm or a similar artificial intelligence mechanism. Tackling discrimination in AI is a major strand of the Equality and Human Rights Commission’s three-year strategy. Existing data protection legislation protects workers where personal data is involved, and that is one aspect of existing regulation on the development of AI systems and other technologies.
Reforms as part of the Data Protection and Digital Information Bill will cast article 22 of the UK GDPR as a right to specific safeguards, rather than as a general prohibition on solely automated decision making. These rights ensure that data subjects are informed about, and can seek human review of, significant decisions that are taken about them solely through automated means, which was a point raised by the shadow Minister. Employment law also offers protections. The Employment Rights Act 1996 provides that employees with two years of continuous service are protected from unfair dismissal, which would encompass circumstances where employees’ article 8 and UK GDPR rights have been breached in the algorithmic decision-making process that led to the dismissal.
Of course, all good employers—by their very nature—should use human judgment. The best way we can help employers in any workplace is to have a strong jobs market where employers have to compete for employees. That is the kind of market we have delivered in this economy, despite some of the difficulties that surround it.
I once again thank the hon. Member for Birkenhead for tabling this timely and important debate. To be clear again, we have a strong ambition for the UK to become a science and technology superpower, and AI is a key part of that. However, the Government recognise the concerns around these technologies and appreciate that, as with all new technologies, trust has to be built. We will continue to build our understanding of how the employment rights framework operates in an era of increasing AI use. AI has the potential to make an incredibly positive contribution to creating a high-wage, high-skill and high-productivity economy. I very much look forward to seeing the further benefits as matters progress.
(2 years, 11 months ago)
Commons Chamber
The hon. Gentleman raises an important issue. The Committee agreed in the report that there must be an expedited process of transparency, so that when people are using anonymity to abuse other people—saying things for which in public they might be sued or have action taken against them—it must be much easier to swiftly identify who those people are. People must know that if they post hate online directed at other people and commit an offence in doing so, their anonymity will not be a shield that will protect them: they will be identified readily and action taken against them. Of course there are cases where anonymity may be required, when people are speaking out against an oppressive regime or victims of abuse are telling their story, but it should not be used as a shield to abuse others. We set that out in the report and the hon. Gentleman is right that the Bill needs to move on it.
We are not just asking the companies to moderate content; we are asking them to moderate their systems as well. Their systems play an active role in directing people towards hate and abuse. A study commissioned by Facebook showed that over 60% of people who joined groups sharing extremist content did so at the active recommendation of the platform itself. In her evidence to the Committee, Facebook whistleblower Frances Haugen made clear the active role of systems in promoting and driving content through to people, making them the target of abuse, and making vulnerable people more likely to be confronted with and directed towards content that will exacerbate their vulnerabilities.
Facebook and companies like it may not have invented hate, but they are driving hate and making it worse. They must be responsible for these systems. It is right that the Bill will allow the regulator to hold those companies to account not just for what they do or do not take down, but for the way they use the systems that they have created and designed to make money for themselves by keeping people on them longer. The key thing at the heart of the Bill, and at the heart of the report published by the Joint Committee, is that the companies must be held liable for the systems they have created. The Committee recommended a structural change to the Bill to make it absolutely clear that what is illegal offline should be regulated online. Existing offences in law should be written into the Bill, and it should be demonstrated how the regulator will set the thresholds for enforcement of those measures online.
This approach has been made possible because of the work of the Law Commission in producing its recommendations, particularly in introducing new offences around actively promoting self-harm and promoting content and information that is known to be false. A new measure will give us the mechanism to deal with malicious deepfake films being targeted at people. There are also necessary measures to establish guiding principles that both the regulator and the companies have to work to, ensuring regard to public health in dealing with dangerous disinformation relating to the pandemic or other public health issues.
We also have to ensure an obligation for the regulator to uphold principles of freedom of expression. It is important that effective action should be taken against hate speech, extremism, illegal content and all harmful content that is within the scope of the Bill, but if companies are removing content that has every right to be there—where the positive expression of people’s opinions has every right to be online—then the regulator should have the power to intervene in that direction as well.
At the heart of the regime has to be a system where Ofcom, as the independent regulator, can set mandatory codes and standards that we expect the companies to meet, and then use its powers to investigate and audit them to make sure that they are complying. We cannot have a system that is based on self-declared transparency reports by the companies where even they themselves struggle to explain what the results mean and there is no mechanism for understanding whether they are giving us the full picture or only a highly partial one. The regulator must have that power. Crucially, the codes of practice should set the mandatory minimum standards. We should not have Silicon Valley deciding what the online safety of citizens in this country should be. That should be determined through legislation passed through this Parliament empowering the regulator to set the minimum standards and take enforcement action when they have not been met.
We also believe that the Bill would be improved by removing a controversial area: the principles in clause 11. The priority areas of harm are determined by the Secretary of State and are merely advisory to the companies. If we base the regulatory regime and the codes of practice on established offences that this Parliament has already created, which are known and understood and therefore enforced, we can say they are mandatory and clear and that there has been a parliamentary approval process in creating the offences in the first place.
Where new areas of harm are added to the schedules and the codes of practice, there should be an affirmative procedure in both Houses of Parliament to approve those changes to the code, so that Members have the chance to vote on changes to the codes of practice and the introduction of new offences as a consequence of those offences being created.
The Committee took a lot of evidence on the question of online fraud and scams. We received evidence from the Work and Pensions Committee and the Treasury Committee advising us that this should be done: if a known scam or attempt to rip off and defraud people is present on a website or social media platform, be it through advertising or any kind of posting, it should be within the scope of the Bill, and it should be for the regulator to require its removal. There should not be a general blanket exemption for advertising, which would create a perverse incentive to promote such content more actively.
I thank my hon. Friend for his work on this important issue. Does he agree, as referred to in the report, that platforms must be required to proactively seek out that content and ensure it is changed, and if not, remove it, rather than all removals being prompted by users?
It is vital that companies are made to act proactively. That is one of the problems with the current regime, where action against illegal content is only required once it is reported to the companies and they are not proactively identifying it. My hon. Friend is right about that, particularly with frauds and scams where the perpetrators are known. The role of the regulator is to ensure that companies do not run those ads. The advertising authorities can still take action against individual advertisers, as can the police, but there should be a proactive responsibility on the platforms themselves.
If you will allow me to say one or two more things, Madam Deputy Speaker, we believe it is important that there should be user redress through the system. That is why the Committee recommended creating an ombudsman if complaints have been exhausted without successful resolution, but also permitting civil redress through the courts.
If an individual or their family has been greatly harmed as a consequence of what they have seen on social media, they may take some solace in the fact that the regulator has intervened against the company for its failures and levied fines or taken action against individual directors. However, as an individual can take a case to the courts for a company’s failure to meet its obligations under data protection law, that should also apply to online safety legislation. An individual should have the right, on their own or with others, to sue a company for failing to meet its obligations under an online safety Act.
I commend the report to the House and thank everyone involved in its production for their hard work. This is a Bill we desperately need, and I look forward to seeing it pass through the House in this Session.
(4 years, 6 months ago)
Commons Chamber
I start by congratulating my hon. Friend the Member for Heywood and Middleton (Chris Clarkson) on an excellent maiden speech. It made me feel slightly nostalgic, because I made my maiden speech—I was trying to work it out—10 years and one week ago. In addition to that, I made my maiden speech immediately following the hon. Member for North Antrim (Ian Paisley), who has just spoken. The right hon. Member for Doncaster North (Edward Miliband) opened for the Opposition on that occasion as well, so there are a lot of similarities even though we are talking about a different topic today.
I rise to speak in support of the Bill. It has a lot of practical and important measures to support businesses, particularly in my coastal constituency, which has many businesses in the hospitality sector. They are particularly badly affected because trade cannot resume as normal. As many Members will know, businesses in the hospitality sector do not necessarily make their money at an even pace throughout the year. They are effectively losing much of the summer season, when they would usually seek to raise the revenue that sees them through the rest of the year. Extra financial support at this time is therefore particularly important for businesses in that sector, and I welcome it strongly for that reason.
I would like to speak about one sector that is not covered by the provisions in the Bill. I do not believe it is covered by any of the measures that have been put in place so far. It does have rather unique circumstances, but I believe it is a very important sector because of the unique role it plays in our national life—professional football. Professional football clubs are unusual businesses. They have very high turnovers but operate at very small margins. Many people would say that the big clubs in the premier league have a huge amount of money that they spend on players, but most of the income they receive is tied up in the contracts of the players who play for them. They do not necessarily have very much cash.
Clubs in league one and league two are particularly vulnerable because their revenues do not come from broadcasting. Most of the income for big clubs such as Manchester United, Manchester City or Liverpool comes from people around the world watching them play on television; for those clubs, playing behind closed doors while still receiving that broadcasting money means they can carry on. However, for clubs in tiers 3 and 4—league one and league two—the vast majority of income comes from playing live in front of spectators. Without that income, they have no revenue. What they have is a series of fixed costs.
The reason professional football clubs have fixed costs is that, unlike almost all other businesses in this country, they cannot restructure their debts and finances by going into administration. They are bound by the rules of their leagues to pay all their football debts in full, including player salaries and transfer fees. Unless they can meet all those costs, they will be expelled from the league. This is an application of the football creditors rule, which has been the subject of court cases brought by HMRC and of much debate on football club insolvency in this House during my 10 years here. It is a rule created by the football leagues for competition reasons, to ensure that clubs cannot over-extend themselves, buy better players than they can really afford, go into administration to clear their debts and then resume. They have to be consistent in what they can afford through the season, but it does mean that they do not have the option of restructuring their debts. Their obligations and major outgoings are largely the fixed costs of paying players.
There have already been a number of warnings that, because of the financial distress of many clubs, we will see the mass release of a large number of players this summer. It has been estimated that up to 1,400 players may be released without being re-signed. We had a small foretaste of that in Scotland last week, when Dunfermline Athletic released 17 players.
More troubling over the next few weeks will be the fact that many smaller clubs supplement their income during the summer months, when they are not playing, through advance sales for the following season. Advance sales of season tickets normally come through in June, which is also when advertisers will make bookings, as will people taking out matchday hospitality packages. That money comes in in June and July and keeps the clubs going while they are not playing, but it is not going to come through now, because these would be advance sales for a season that has no start date, and no one knows how long it will be before things go back to anything like normal. That affects the whole hospitality sector. As I said, it is less of a problem for those in the premier league because, as long as they are playing on television, they will still be getting their money in that way, although there will be some loss of income because the package is not quite the same as it would normally be; other clubs will not. There is a severe danger that some clubs will simply run out of cash in the next few weeks.
My hon. Friend is making an important point. Is he aware that some banks have a blanket restriction on lending money to football clubs and are applying that restriction to CBILS as well, so even though the Government support is not supposed to have a sector-based restriction, this is being applied to football clubs?