Commons Chamber

With permission, Mr Deputy Speaker, I shall make a statement on the AI Seoul summit, which the Government co-hosted with the Republic of Korea earlier this week.
The AI Seoul summit built on the legacy of the first AI safety summit, hosted by the UK at Bletchley Park in November 2023. At Bletchley, 28 countries and the European Union, representing the majority of the world’s population, signed the Bletchley declaration agreeing that, for the good of all, artificial intelligence should be designed, developed, deployed and used in a manner that is safe, human-centric, trustworthy and responsible. The same set of countries agreed to support the development of an international, independent and inclusive report to facilitate a shared science-based understanding of the risks associated with frontier AI.
At the same time, the UK announced the launch of our AI Safety Institute, the world’s first Government-backed organisation dedicated to advanced AI safety for the public good. World leaders, together with the leaders of the foremost frontier AI companies, agreed to the principle that states have a role in testing the most advanced models.
Since Bletchley, the UK has led by example with impressive progress on AI safety, both domestically and bilaterally. The AI Safety Institute has built up its capabilities for state-of-the-art safety testing. It has conducted its first pre-deployment testing for potential harmful capabilities on advanced AI systems, set out its approach to evaluations and published its first full results. That success is testament to the world-class technical talent that the institute has hired.
Earlier this week, the Secretary of State announced the launch of an office in San Francisco that will broaden the institute’s technical expertise and cement its position as a global authority on AI safety. The Secretary of State also announced a landmark agreement with the United States earlier this year that will enable our institutes to work together seamlessly on AI safety. We have also announced high-level partnerships with France, Singapore and Canada.
As AI continues to develop at an astonishing pace, we have redoubled our international efforts to make progress on AI safety. Earlier this week, just six months after the first AI safety summit, the Secretary of State was in the Republic of Korea for the AI Seoul summit, where the same countries came together again to build on the progress we made at Bletchley. Since the UK launched our AI Safety Institute six months ago, other countries have followed suit; the United States, Canada, Japan, Singapore, the Republic of Korea and the EU have all established state-backed organisations dedicated to frontier AI safety. On Tuesday, world leaders agreed to bring those institutes into a global network, showcasing the Bletchley effect in action. Working together, the institutes in the network will build “complementarity and interoperability” between their technical work and approaches to AI safety, to promote the safe, secure and trustworthy development of AI.
As part of the network, participants will share information about models, and their limitations, capabilities and risk. Participants will also monitor and share information about specific AI harms and safety incidents, where they occur. Collaboration with overseas counterparts via the network will be fundamental to making sure that innovation in AI can continue, with safety, security and trust at its core.
Tuesday’s meeting also marked an historic moment, as 16 leading companies signed the frontier AI safety commitments, pledging to improve AI safety and to refrain from releasing new models if the risks are too high. The companies signing the commitments are based right across the world, including in the US, the EU, China and the Middle East. Unless they have already done so, leading AI developers will now publish safety frameworks on how they will measure the risks of their frontier AI models before the AI action summit, which is to be held in France in early 2025. The frameworks will outline when severe risks, unless adequately mitigated, would be “deemed intolerable” and what companies will do to ensure that thresholds are not surpassed. In the most extreme circumstances, the companies have also committed to
“not develop or deploy a model or system at all”
if mitigations cannot keep risks below the thresholds. To define those thresholds, companies will take input from trusted actors, including home Governments, as appropriate, before releasing them ahead of the AI action summit.
On Wednesday, Ministers from more than 28 nations, the EU and the UN came together for further in-depth discussions about AI safety, culminating in the agreement of the Seoul ministerial statement, in which countries agreed, for the first time, to develop shared risk thresholds for frontier AI development and deployment. Countries agreed to set thresholds for when model capabilities could pose “severe risks” without appropriate mitigations. Such risks could include helping malicious actors to acquire or use chemical or biological weapons, and AI’s potential ability to evade human oversight. That move marks an important first step as part of a wider push to develop global standards to address specific AI risks. As with the company commitments, countries agreed to develop proposals alongside AI companies, civil society and academia for discussion ahead of the AI action summit.
In the statement, countries also pledged to boost international co-operation on the science of AI safety, by supporting future reports on AI risk. That follows the publication of the interim “International Scientific Report on the Safety of Advanced AI” last week. Launched at Bletchley, the report unites a diverse global team of AI experts, including an expert advisory panel from 30 leading AI nations from around the world, as well as representatives from the UN and the EU, to bring together the best existing scientific research on AI capabilities and risks. The report aims to give policymakers across the globe a single source of information to inform their approaches to AI safety. The report is fully independent, under its chair, Turing Award winner Yoshua Bengio, but Britain has played a critical role by providing the secretariat for the report, based in our AI Safety Institute. To pull together such a report in just six months is an extraordinary achievement for the international community; Intergovernmental Panel on Climate Change reports, for example, are released every five to seven years.
Let me give the House a brief overview of the report’s findings. It recognises that advanced AI can be used to boost wellbeing, prosperity and new scientific breakthroughs, but notes that, as with all powerful technologies, current and future developments could cause harm. For example, malicious actors can use AI to spark large-scale disinformation campaigns, fraud and scams. Future advances in advanced AI could also pose wider risks, including labour market disruption and economic power imbalances and inequalities. The report also highlights that, although various methods exist for assessing the risk posed by advanced AI models, all have limitations. As is common with scientific syntheses, the report highlights a lack of universal agreement among AI experts on a range of topics, including the state of current AI capabilities and how these could evolve over time. The next iteration of the report will be published ahead of the AI action summit early next year.
Concluding the AI Seoul summit, countries discussed the importance of supporting AI innovation and inclusivity, which were at the core of the summit’s agenda. We recognised the transformative benefits of AI for the public sector, and committed to supporting an environment which nurtures easy access to AI-related resources for SMEs, start-ups and academia. We also welcomed the potential of AI to provide significant advances to resolve the world’s great challenges, such as climate change, global health, and food and energy security.
The Secretary of State and I are grateful for the dedication and leadership shown by the Republic of Korea in delivering a successful summit in Seoul, just six short months after the world came together in Bletchley Park. It was an important step forward but, just as at Bletchley, we are only just getting started. The rapid pace of AI development leaves us no time to rest on our laurels. We must match that speed with our own efforts if we are to grip the risks of this technology, and seize the limitless benefits it can bring to people in Britain and around the world.
The UK stands ready to work with France to ensure that the AI action summit continues the legacy that we began in Bletchley Park, and continued in Seoul, because this is not an opportunity we can afford to miss. The potential upsides of AI are simply immense, but we cannot forget that this is the most complex technology humanity has ever produced. As the Secretary of State said in Seoul, it is our responsibility to ensure that human wisdom keeps pace with human knowledge.
I commend the Secretary of State and the Prime Minister for all the work they have done on the issue, and I commend this statement to the House.
I am grateful to the Minister for advance sight of his statement.
I hope this is in order, Mr Deputy Speaker, because I note that the Minister for Employment, the hon. Member for Bury St Edmunds (Jo Churchill) is on the Front Bench, and that she is not standing at the general election. I know she has been very cross with me on occasions over the past few years—she is probably still cross with me now. [Interruption.] As the Minister says, she is only human. On a personal note, as we have both been cancer sufferers—or survivors—and have both had more than one rodeo on that, it is sad that she is leaving. I am sure she will continue to fight for patients with cancer and on many other issues, and I pay tribute to her. It has been a delight to work with her over these years; I hope she will forgive me one day.
The economic opportunities for our country through artificial intelligence are, of course, outstanding. With the right sense of mission and the right Government, we can make the most of this emerging technology to unlock transformative changes in our economy, our NHS and our public services. Let us just think of AI in medicine. It is a personal hope that it might soon be possible to have an AI app that can accurately assess whether a mole on somebody’s back, arm or leg—or the back of their head—is a potential skin cancer, such as melanoma. That could definitely save lives. We could say exactly the same about the diagnosis of brain injury, many other different kinds of cancer and many other parts of medicine. There could be no more important issue to tackle, but I fear the Government have fluffed it again. Much as I like the Minister, his statement could have been written by ChatGPT.
I have a series of questions. First, let me ask about the
“shared risk thresholds for frontier AI development and deployment”,
which the Minister says Governments will be developing. How will they be drawn up? What legal force will they have in the UK, particularly if there is to be no legislation, as still seems to be in the mind of the Government?
Secondly, the Secretary of State hails the voluntary agreements from the summit as a success, but does that mean companies developing the most advanced AI are still marking their own homework, despite the potential risks?
Thirdly, the Minister referred several times to “malicious actors”. Which “malicious actors” is he referring to? Does that include state actors? If so, how is that work integrated with the cyber-security strategy for the UK? How will that be integrated with the cyber-security strategy during the general election campaign?
Fourthly, the Government’s own artificial intelligence adviser, Professor Yoshua Bengio, to whom the Minister referred, has said that it is obvious that more regulatory measures will be needed, by which he means regulations or legislation of some kind. Why, therefore, have the Government not even taken the steps that the United States has taken using President Biden’s Executive order?
Next, have the commitments made six months ago at the UK safety summit been kept, or are these voluntary agreements just empty words? Moreover, have the frontier AI companies, which took part in the Bletchley summit, shared their models with the AI Safety Institute before deploying them, as the Prime Minister pledged they would?
Next, the Government press release stated that China participated in person at the AI Seoul summit, so can the Minister just clear up whether it signed the ministerial statement? As the shadow Minister for creative industries, may I ask why there were no representatives of the creative industries at the AI summit? Why none at all, despite the fact that this is a £127 billion industry in the UK, and that many people in the creative industries are very concerned about the possibilities, the threats, the dangers and the risks associated with AI for remuneration of creators?
The code of practice working group, which the Government set up and which was aiming at an entirely voluntary code of conduct, has collapsed, so what is the plan now? The Government originally said that they would still consider legislation, so is that still in their mind?
I love this next phrase of the Minister’s. He said, “We are only just getting started”. Clearly, somebody did not do any editing. What on earth has taken the Government so long? A Labour Government would introduce binding regulation of the most powerful frontier AI companies, requiring them to report before they train models over a capability threshold, to conduct safety testing and evaluation and to maintain strong information security protections. Why have the Government not brought forward any of those measures, despite very strong advice from all of their advisers to do so?
Finally, does the Minister agree that artificial intelligence is there for humanity, and humanity is not there for artificial intelligence?
I share the sentiments that the hon. Gentleman expressed about my hon. Friend the Member for Bury St Edmunds (Jo Churchill). It was a very sweet thing that he said—the only sweet thing he has said from the Dispatch Box. My hon. Friend has been a great friend to me, giving me advice when I became a new father. Many people do not see the hard work that goes into the pastoral care that happens here, so I am personally very grateful to her. I know that she was just about to leave the Chamber, so I will let her do so. I just wanted to place on record my thanks and gratitude to her.
I am a bit disappointed with the hon. Member for Rhondda (Sir Chris Bryant), although I have a lot of time for him. Let me first address the important matter of healthcare. We obviously hugely focus on AI safety; we have taken a world-leading position on AI safety, which is what the Bletchley and the Seoul declarations were all about.
Ultimately, the hon. Member’s final statement about AI being for humanity is absolutely right. We will continue to work at pace to help build trust in AI, because it can be a transformative tool in a number of different spheres—whether it is in the public sector or in health, as the hon. Member quite rightly pointed out. On a personal note, I hope that, as a cancer survivor he has the very best of health for a long time to come.
Earlier this week, the Prime Minister spoke about how AI can help in the way that breast cancer scans are looked at. I often talk about Brainomix, which has been greatly helpful to 37 NHS trusts in the early identification of strokes. That means that three times more people are now living independently than was previously possible. AI can also be used in other critical pathways. Clearly, AI will be hugely important in the field of radiotherapy. The National Institute for Health and Care Excellence has already recommended that AI technologies are used in the NHS to help with the contouring of CT and MRI scans and to plan radiotherapy treatment and external therapy for patients.
The NHS AI Lab was set up in 2020 to accelerate the development and the deployment of safe, ethical and effective AI in healthcare. It is worth saying that the hon. Member should not underestimate the complexity of this issue. Earlier this year, I visited a start-up called Aival, which the Government helped to fund through Innovate UK. The success of the AI models varies depending on the different machines that are used and how they are calibrated, so independent verification of the AI models, and how they are employed in the health sector specifically, is very important.
In terms of malicious actors, the hon. Member will understand that I cannot go into specific details for obvious reasons, but I assure him, as someone who sits on the defending democracy taskforce, led by the Security Minister, that we have been looking at pace at how to protect our elections. I am confident that we are prepared, having taken a cross-governmental approach, including with our agencies. It is hugely important that we ensure that people can have trust in our democratic process.
The hon. Member is right that these are voluntary agreements. I was surprised by his response, because we said clearly in our response to the White Paper that we will keep the regulator-led approach, which we have invested money in. We have given £10 million to ensure that the regulator increases its capability in a whole range of areas. We have also said that we will not be afraid to legislate when the time is right. That is a key difference between what the Opposition talk about and what we are doing. Our plan is working, whereas the Opposition keep talking about legislating but cannot tell us what they would legislate for.
There is no robust detail. I see that has exercised the hon. Member, who is chuntering from a sedentary position. The Opposition just have no serious plan for this.
The results speak for themselves. Around two weeks ago, we had a number of significant investments and a significant amount of job creation in the UK, with investment from CoreWeave, and almost £2 billion—[Interruption.] Those on the Opposition Front Bench would do well to listen to this. We had £2 billion of investment. Scale AI has put its headquarters in the UK. That shows our world-leading position, which is exactly why we co-hosted the Seoul summit and will support the French when they have their AI action summit. It goes to show the huge difference in our approach. We see safety as an enabler of growth and innovation, and that is exactly what we are doing.
The work goes on with the creative industries. It is hugely important, and we will not shy away from the most difficult challenges that AI presents.
I thought the shadow Minister was wise to draw attention to the potential benefits of AI in particular for health research and treatment—notably brain injury, a subject in which he and I share a passionate interest—but foolish, if I might say so, to be churlish about the steps that the Government have already taken. The Government deserve great credit for taking a lead on this internationally, and establishing the first organisation dedicated to AI safety in the world.
I thank and congratulate the Minister on that, but in balancing the advantages and risks—the costs and benefits—will he be clear that the real risk is underestimating the effect that AI may have? The internet has already done immense damage, despite the heady optimism at the time it was launched. It has brutalised discourse and blurred the distinction between truth and fiction, and AI could go further to alter our very grasp of reality. I do not want to be apocalyptic, but that is the territory that we are in, and it requires the most considered treatment if we are not to let those risks become a nightmare.
I completely agree with my right hon. Friend. We recognise the risks and opportunities that AI presents. That is why we have tried to balance safety and innovation. I refer him to the Online Safety Act 2023, which is a technology agnostic piece of legislation. AI is covered by a range of spheres where the Act looks at illegal harms, so to speak. He is right to say that this is about helping humanity to move forward. It is absolutely right that we should be conscious of the risks, but I am also keen to support our start-ups, our innovative companies and our exciting tech economy to do what they do best and move society forward. That is why we have taken this pro-safety, pro-innovation approach; I repeat that safety in this field is an enabler of growth.
I would like to thank Sir Roger Gale, who has just left the Chair. He has been excellent in the Chair and I have very much enjoyed his company as well as his chairing.
I thank the Government for advance sight of the statement. My constituents and people across these islands are concerned about the increasing use of AI, not least because of the lack of regulation in place around it. I have specific questions in relation to the declarations and what is potentially coming down the line with regulation.
Who will own the data that is gathered? Who has responsibility for ensuring its safety? What is the Minister doing to ensure that regard is given to copyright and that intellectual property is protected for those people who have spent their time and energy and massive talents in creating information, research and artwork? What are the impacts of the use of AI on climate change? For example, it has been made clear that using this technology has an impact on the climate because of the intensive amounts of electricity that it uses. Are the Government considering that?
Will the Minister ensure that in any regulations that come forward there is a specific mention of AI harms for women and girls, particularly when it comes to deepfakes, and that they and other groups protected by the Equality Act 2010 are explicitly mentioned in any regulations or laws that come forward around AI? Lastly, we waited 30 years for an Online Safety Act. It took a very long time for us to get to the point of having regulation for online safety. Can the Minister make a commitment today that we will not have to wait so long for regulations, rather than declarations, in relation to AI?
The hon. Lady makes some interesting points. The thing about AI is not just the large language models, but the speed and power of the computer systems and the processing power behind them. She talks about climate change and other significant issues we face as humanity; that power to compute will be hugely important in predicting how climate change evolves and weather systems change. I am confident that AI will play a huge part in that.
AI does not recognise borders. That is why the international collaboration and these summits are so important. In Bletchley we had 28 countries, plus the European Union, sign the declaration. We had really good attendance at the Seoul summit as well, with some really world-leading declarations that will absolutely be important.
I refer the hon. Lady to my earlier comments around copyright. I recognise the issue is important because it is core to building trust in AI, and we will look at that. She will understand that I will not be making a commitment at the Dispatch Box today, for a number of reasons, but I am confident that we will get there. That is why our approach in the White Paper response has been well received by the tech and AI industries.
The hon. Lady started with a point about how constituents across the United Kingdom are worried about AI. That is why we all have to show leadership and reassure people that we are making advances on AI and doing it safely. That is why our AI Safety Institute was so important, and why the network of AI safety institutes that we have helped to advise on and worked with other countries on will be so important. In different countries there will be nuances regarding large language models and different things that they will be approaching—and sheer capability will be a huge factor.
I pay tribute to the Government for their approach on AI. The growth of AI, and its exponential impact, has really not yet landed with most people around the world. The scale and impact of that technology is truly once in a generation, if not once in history. Ensuring that we work around the world to harness that incredibly powerful force for good for humanity is vital. It is good to see the UK playing a leading role in that and, frankly, it is good to see a cross-party approach, because this is bigger than party politics. Will all those involved—the Minister, Lord Camrose, the Secretary of State and the Prime Minister—ensure that the agenda of empowering the development of AI and putting guardrails in place is absolutely at the centre not just of UK policy but of policy across the world?
I put on record my personal thanks to my right hon. Friend for all that he has done. We worked very closely together on the introduction of the integrated care board when he was Health Secretary, and it continues to be hugely beneficial to my constituents. He raises important points about the opportunities of AI and the building of trust, which I have also spoken about. However, he mentioned a “cross-party approach”. I am not sure that the Opposition are quite there yet in terms of their approach. I say to the Opposition that there is a great tech story in this country: our tech sector is now the third most valuable in the world, worth over $1 trillion; we have more unicorns than France, Germany and Sweden combined; we have created 1.9 million more jobs—over 22% more—than at pre-pandemic levels; and, as I have said, just over £2 billion of investment has come in just the last fortnight. We believe in British entrepreneurs, British innovation and British start-ups. The real question is: why do the Opposition not believe in Britain?
I welcome the Minister’s statement. He is right to say that many Members across the Chamber support the Government’s clear goals and objectives. The continued focus on the Bletchley declaration is to be welcomed, and I welcome the drive to prevent disinformation and other concerns. However, although information and practice sharing will be almost universal, we must retain the ability to prevent the censorship of positions that may not be popular but should not be censored, and ensure that cyber-security is a priority for us nationally, primarily followed by our international obligations.
The hon. Gentleman is absolutely right to say that AI will play a huge role in cyber-security. We recently launched our codes of practice for developers in the cyber-security field. AI will be the defining technology of the 21st century—it is hugely important—and his questions highlight exactly why we have taken this approach. We want our regulators, which are closest to their industries, to define and be on top of what is going on. That is why we have given them capacity-building funds and asked them to set out their plans, which they did at the end of April, and we will continue to work with them.
It sounds as if there was a fair bit of discussion at the summit about AI in healthcare, particularly on its use as a medical device. The Minister will know that it has great potential, and I heard his exchanges just a moment ago. To give just one example, AI can support but not replace clinicians in mammography readings. Does he agree that we must follow the strong lead of the US in this area by ensuring that the regulatory landscape is in the right place to assist this innovation, not get in the way of it?
My hon. Friend makes a hugely important point. I refer him to what I said earlier. It was insightful for me to see how transformative AI can be in health. When I visited Aival, for example, I gained insight into the complexity of installing AI as a testing bed for different machines depending on who has manufactured and calibrated them. The regulator will play a huge role, as he can imagine, whether on heart disease, radiotherapy, or DeepMind’s work in developing AlphaFold.
I congratulate the Minister on all his enthusiastic work on AI. In his statement, he referred to the frontier AI safety commitments, and 16 companies were mentioned. One of those was Zhipu AI of Tsinghua Daxue—Tsinghua University in China—which is, of course, one of the four new AI tigers of China. How important is the work that the Minister is doing to ensure China is kept in the tent when it comes to the safety and regulation of AI, so that we do not end up with balkanisation when it comes to AI?
My hon. Friend makes a really important point. I will not try to pronounce the name of that university or that company; what I will say is that AI does not recognise borders, so it is really important for China to be in the room, having those conversations. What those 16 companies signed up to was a world first, by the way: companies from the US, the United Arab Emirates, China and, of course, the UK signed that commitment. This is the first time that they have agreed in writing that they will not deploy or develop models if risks cannot be kept below the agreed thresholds. Those thresholds will be defined ahead of the AI action summit in France, so my hon. Friend is exactly right that we need a collaborative global approach.
I thank the Minister for his statement.
Westminster Hall
It is a pleasure to serve under your chairmanship, Sir George. I commend my hon. Friend the Member for Penistone and Stocksbridge (Miriam Cates) for securing this important debate. She is a passionate campaigner, and I thank her for her engagement on this issue on numerous occasions, including by coming to see me. I also thank the hon. Member for Rhondda (Sir Chris Bryant); worryingly, this is the second time in about 24 hours that I have found myself agreeing with him.
The hon. Gentleman calls on me to resign. Before he asks me to join him on his Benches, I should say that a space on our Benches recently became available, if he wants it. I found myself in considerable agreement with him.
I thank all hon. Members for their contributions. This is clearly a hugely complex issue. I want to start by stating that before being a Minister, I am a parent. I probably make my colleagues sick by talking about that constantly, but it is one of the most rewarding and fulfilling privileges of my lifetime. Being a parent is also one of the scariest things. I have to worry, as we all do, about whether they will grow up to be healthy, make friends at school and, now, whether they will be safe in the online world as well as in the offline world.
I also want my children to have a fulfilling childhood, to learn the skills of tomorrow while we protect them online. Therein lies the conundrum.
I will make a little progress. I want to focus on the issue of research and data. The UK chief medical officer, among others, has systematically reviewed the scientific evidence and concluded that an association between screen-based activities and poor mental health exists, but existing research does not yet prove a causal relationship. Other investigations, however, such as those by Professor Haidt, as mentioned by my hon. Friend the Member for Penistone and Stocksbridge, into the link between these technologies and mental health have suggested a harmful relationship. The scientific community is considering Professor Haidt’s findings, and we are watching that discussion with interest.
I want to reassure hon. Members that on research and causality, I am considering every option to ensure we leave no stone unturned. I will look at this very closely to ensure that any policies that come forward are based on science and data.
I thank my hon. Friend for his reassuring insistence that he will look into the data. The US Surgeon General, who recently visited Parliament, made the point that, if social media or smartphones were a drug, they would be immediately withdrawn from the market because of the harm they are reputed to cause. Even if the full causality is not as established as the Minister wants, is the evidence not so clear and the impact so harmful that it would be sensible to withdraw social media before conducting that research?
I thank my hon. Friend, who has made that point passionately, both here and in private. The important thing is to have the data to back such a significant conclusion, because social media also benefits young people and society and a balance has to be achieved.
I am going to make some progress. To be clear, that does not rule out taking a precautionary approach. We need to consider the impacts carefully before taking action. As the National Society for the Prevention of Cruelty to Children said before this debate, it is important to strike the right balance between protecting children from harm and allowing them to reap the benefits of safe internet use. We will continue to explore options in this space. I welcome further engagement, research and evidence in this area to inform our policies.
On those points, does the Minister agree that this is not just about addiction for some, but about dependency and harm for many? Artificial intelligence is only going to supercharge this. Does he agree that tech companies need to be held to account and ensure protections are in place, and that Ofcom needs to use the powers it has been given to force them to do that?
I thank my hon. Friend for that intervention. Let me say clearly that there is no reason why the tech companies could not have acted over the past few years. There is no reason for them to wait for Ofcom’s code of practice; they should be getting on with the job. I said that as a Back Bencher, and I mean it. The Online Safety Act is what we consider to be technology-agnostic. It covers many instances of AI, but we obviously continue to monitor the situation.
I am so glad that my hon. Friend says he is looking at all options to keep children safe. On the issue of preventing children from being able to upload sexual content or from being groomed into uploading sexual images, will he look at the suggestion put to me by the national police lead and others of putting controls at systems level, so that a phone cannot upload that content when the upload is by a child?
I will limit further interventions due to the time I have, but I will write to my right hon. Friend on that issue.
I will make some progress. We are aware of the ongoing debate regarding the age at which children should have a smartphone. We recognise the risks that technologies such as smartphones pose, but I would argue that a ban would not necessarily achieve the outcome we wish. As has already been said, children can find ways through. We also have to consider who we would be criminalising and how legislation would intervene in the lives of private individuals. We live in a digital age and many parents want their children to have a smartphone, as they provide benefits to children and parents, such as staying connected while travelling alone. In other words, trying to protect children from one harm may well lead to another. I speak to many parents who give me the other side of the argument, and I wanted to put that on the record.
The decision on whether a child should have access to a smartphone should not be one for Government. Instead, we should empower parents to make the right call for their children and their individual circumstances. In fact, parents as consumers can influence the market themselves. It is my belief that choice is a liberty that parents and children should be allowed to exercise.
I agree that online platforms must take responsibility for the harmful effects of the design of their services and their business models. That is why the Online Safety Act is a groundbreaking piece of legislation, which puts the onus on platforms to ensure that children are protected. I want to reassure parents that the legislation will change significantly how our children grow up in the online world. If social media companies do not do the right thing, we have given Ofcom the teeth to go after them—and I fully expect it to do so.
Children’s wellbeing is at the heart of the Act, and the strongest protections are for children. Under the new regulations, social media platforms will need to assess the risks of their services facilitating illegal content and activity. That includes illegal abuse, harassment or stirring up hatred. They will need to assess the risk of children being harmed on their services by content that does not cross the illegal threshold, but that is harmful to them, which is something that was brought up.
I will make some progress as I am very short on time, and I want to give my hon. Friend the Member for Penistone and Stocksbridge time to respond.
I want to be unequivocal here: the Online Safety Act ensures that the UK is the safest place to be online, requiring all companies to take robust action against illegal content. Last week, Ofcom published the draft codes of practice for the child safety rules. Those protections are a real game changer and will establish the foundation to protect generations to come. I commend Ofcom for its proposals. It rightly puts the onus on big tech to do the right thing and keep our children safe. I say this to big tech: with great reward comes great responsibility. They have that responsibility and they must act.
The codes identify risks that children need to be protected from, and they also set out the requirement for platforms to implement highly effective age assurance technology to prevent children from encountering harmful content on their services, including pornography, and content that depicts serious violence or promotes serious self-harm, suicide and eating disorders.
Tackling suicide and self-harm material is a key objective of the Online Safety Act. We have heard too many stories of the devastating impact of that content, and I commend all the parents who have campaigned on the issue. They have gone through the most unimaginable, heartbreaking and heart-wrenching challenges. We continue to engage with them, and I commend them for their bravery. There is a live consultation on age assurance at the moment and I encourage all Members to engage with that.
My hon. Friend the Member for Redditch (Rachel Maclean) raised a number of key issues and I will write to her in response. She also talked about parental responsibility, which is important. I think she raised the issue of chat functions, which are also in the scope of the Online Safety Act. The hon. Member for Stirling (Alyn Smith) spoke about the tragic case of Murray Dowey. I offer my condolences to the parents and my open door; I would be more than happy to meet them with the hon. Member in attendance.
My hon. Friends the Members for Stoke-on-Trent North (Jonathan Gullis) and for Great Grimsby (Lia Nici) talked about the responsibility of the Department for Education. I am sure that has been heard, and I will continue to engage with Ministers. My right hon. Friend the Member for North East Hampshire (Mr Jayawardena) talked about his Nokia 3210. Nokia has started remarketing the 3210, so he should look forward to a Christmas present—not from me, but from someone who likes him. I wish him all the best with that.
My final comment is that I would be happy to meet my hon. Friend the Member for Penistone and Stocksbridge, as would the Secretary of State.
(8 months, 4 weeks ago)
Westminster Hall
I thank you, Mrs Harris, for your excellent chairmanship of this over-subscribed debate on an important topic. I thank the hon. Member for Ellesmere Port and Neston (Justin Madders) for securing the debate. I am grateful to him and other speakers for their insightful contributions. I am conscious of time, so I will be limiting the interventions I take, as I want to try to address as many of the issues that have been raised as I can.
Digital technologies offer extraordinary opportunities; if we take full advantage of them, we can grow our economy, create new jobs and improve lives for British people right across the country. They can have other benefits too, such as connecting communities, reducing loneliness and making public services easier and faster to access. All those points have been very well made today. Right now, though, too many people across the country cannot experience those benefits.
Digitally excluded people are less likely to be in well-paying jobs, and they have worse health outcomes and an overall lower quality of life. As a result, digital exclusion does not just create new inequalities, but exacerbates existing ones, making it more difficult to fully participate in society. That is why, even as we look towards investing in the transformative technologies of tomorrow, from AI to quantum, the Government remain resolutely committed to ensuring no one is left behind in today’s digital age. If Britain is to be a real science and tech superpower, our superpower status has got to deliver tangible benefits for every British person.
We are under no illusions: this is a difficult task that requires work right across Government to address the many complex barriers we face. That is why the 2022 digital strategy outlined work across Government that will promote digital inclusion, from accelerating the roll-out of gigabit broadband to delivering landmark legislation to make the UK the safest place in the world to be online. By doubling down on the four key principles we set out 10 years ago in the digital inclusion strategy—access, skills, motivation and trust—we believe we have the foundations in place to succeed. I will now take each of these principles in turn.
First, on access, we understand the importance of staying connected in the modern age. That is why we have prioritised access to fixed and mobile broadband, including wifi, affordable tariffs and access to suitable devices. To ensure everyone has the access they need, the Government introduced the broadband universal service obligation in 2020, which gives everyone the legal right to request a decent and affordable broadband connection of at least 10 megabits per second. To ensure the USO remains up to date, in October 2023 we launched a consultation to review the obligation and will be publishing a Government response later this year. In March 2021 we launched Project Gigabit, our £5 billion mission to deliver lightning-fast, reliable broadband to the hardest-to-reach parts of the UK, areas that would have otherwise been left out of commercial gigabit roll-out plans without Government subsidy.
Last week we announced that 1 million premises across the UK have received a gigabit-capable connection thanks to Government investment. The majority of these premises are in hard-to-reach locations where previously many people would have struggled to stream TV shows, access online services or run small businesses. I am happy to report that, as I am sure the hon. Member for Ellesmere Port and Neston already knows, his constituency benefits from excellent broadband connectivity. In his constituency, over 99% of premises can access a superfast connection, while 93% can access a gigabit-capable connection.
I thank the Minister for giving way. I am very envious of the hon. Member for Ellesmere Port and Neston (Justin Madders) for having such high levels of connectivity. Those of us who find ourselves in the Project Gigabit type C contract are now seeing that the voucher schemes have been turned off. Would the Minister agree that we need that procurement system to be speeded up so that we can all get to at least 99%?
I thank my hon. Friend for making that point and I will come on to some of the issues that she has raised; I am also happy to have a conversation with her about what support her community needs.
We know that, in addition to excellent coverage, we have competitive pricing in the UK. The cost of a gigabyte of data is 50p in the UK; that is less than half the average price in the EU, which is £1.18. We have also worked closely with the telecoms industry to ensure the availability and provision of low-cost, high-quality fixed and mobile social tariffs in the market. In total, 27 operators now offer social tariffs across 99% of the UK to those on universal credit and some other means-tested benefits.
We have seen social tariff take-up increase by almost 160% since September 2022. Although this represents just 8% of the total number of eligible households, progress is being made and we will continue to work with telecoms providers to increase awareness of this provision. We have also supported access to devices and wi-fi. Around 2,900 public libraries in England provide a trusted network of accessible locations with free wi-fi, which is funded by the Department for Culture, Media and Sport.
The Department for Education has also delivered over 1.95 million laptops and tablets to schools, trusts, local authorities and further education providers for disadvantaged children and young people since 2020. This is part of a £520 million Government investment to support access to remote education and online social care services. To support those seeking work, our Jobcentre Plus work coaches can provide support to eligible claimants who are not online, with financial support to buy six months’ worth of broadband connection. This scheme is administered by the Department for Work and Pensions through the flexible support fund, and I thank my right hon. Friend the Member for Suffolk Coastal (Dr Coffey), who did excellent work through the pandemic. I am sure that I must have written to her on behalf of my constituents during that very uncertain time, and I will certainly take away her points and ideas.
I will make some more progress, if that is okay.
That package, which includes free wi-fi, access to devices and affordable fixed and mobile tariffs for 99% of the UK, supports access to the digital products and services that are needed for modern life.
Now I turn to the issue of digital skills. As well as working to provide the right access, we are working to ensure that everyone has the right skills to be able to navigate their personal and professional lives. On a personal note, this is a particular passion of mine and something that I wholeheartedly believe in. My hon. Friend the Member for Derbyshire Dales (Miss Dines) mentioned digital skills in her contribution, as did other Members in theirs.
Digital skills are central to the jobs of today and the workforce of tomorrow. Ensuring that the workforce has the digital skills for the future will be crucial to meet the UK’s ambition to be a global science and tech superpower. We are supporting skills development at every level—or, as I like to say, at every age and at every stage.
The Department for Education supports adults with low digital skills through the digital entitlement, which fully funds adults to gain essential digital skills qualifications, based on the essential digital skills framework. Since the introduction of the digital entitlement in 2020, the Department has supported over 40,000 learners to study for a qualification in essential digital skills. We are working closely with the Department for Education, industry and academia through the digital and computing skills education taskforce, which was launched last summer to increase the numbers of students choosing digital and tech educational pathways into tech careers.
To inspire the next generation of tech professionals, we have also launched two initiatives: the Cyber Explorers platform for 11 to 14-year-olds, which has reached almost 60,000 students; and the CyberFirst Girls competition, which supported 12,500 12 and 13-year-old girls in 2023 alone.
The Department for Education also funds digital skills provision through Community learning, which is an important stepping stone for learners, particularly post-19 disadvantaged learners, who are not ready for formal accredited learning or who would benefit from learning in a more informal way.
In June 2022, the Government launched the Digital Skills Council, which I co-chair. It brings together Government and industry to strengthen the digital workforce. Last year, the Digital Skills Council partnered with FutureDotNow to fund the publication of the digital skills roadmap, which lays out collective commitments to ensure that all working-age adults have basic digital capabilities.
Finally, we are also supporting people to develop advanced skills in our priority technology areas. We have established the £30 million data science and artificial intelligence conversion programme to broaden the supply of AI talent in the UK. It funds universities to develop master’s-level AI and data science courses suitable for non-STEM students, and up to 2,600 scholarships for students from under-represented backgrounds. Just last week we launched a pilot advertising campaign designed to generate awareness of the benefits of learning advanced digital skills and to drive people towards a new website that has details on Government-funded digital skills bootcamps. These bootcamps are 16-week courses that are fully funded, with a guaranteed job interview at the end.
To support workers to understand and apply AI in their jobs, last year, in partnership with Innovate UK and the Alan Turing Institute, we published the first version of a new guidance document that helps businesses to identify what skills their non-technical workers need to be able to successfully use AI in the workplace.
The secondary barriers of trust and motivation, which I mentioned at the start, must be tackled to have a truly positive impact on digital inclusion, but those are harder to measure. We recognise that some people are hesitant to access online services because they fear they may become victims of fraud or that it is an unsafe environment for their personal data. We are taking a number of steps to improve the safety and trustworthiness of the online space, including through the Online Safety Act 2023. The Act will ensure that technology companies take more responsibility for the safety of their users online, particularly children. It is a major step in protecting UK citizens from the scourge of online scams.

The motivation barrier requires influencing decision making and motivation at the individual level. That challenge is difficult to overcome and is best addressed through ensuring that access, skills and trust are in place, which is why those remain our focus. That is why we have supported work through libraries, charities and communities, including the digital lifeline fund, and why we continue to fund free public wi-fi in libraries across the UK.
There are many community-based initiatives at the local level, including work through libraries, as I have mentioned, and from the third sector, such as the National Digital Inclusion Network, run by the Good Things Foundation, which is a vital resource to many working in this space. The excellent work done by the Good Things Foundation, Age UK and others plays an important role in providing support with technology and the internet. Those charities supplement Government engagement by offering guides, training courses and volunteers to help people make the most of the internet.
I will address some of the issues raised around financial services. The Government recognise that digital payments play an incredibly important role for businesses and individuals, making many payments faster, easier and cheaper. However, the Government also believe that all customers, wherever they live, should have appropriate access to banking and cash services. It is imperative that banks and building societies recognise the needs of all their customers, including those who still need to use in-person services. The Government legislated through the Financial Services and Markets Act 2023 to protect access to cash for individuals and businesses. The Act establishes the Financial Conduct Authority as the lead regulator and provides it with the responsibility and powers to ensure that reasonable provision of cash withdrawal and deposit services is made, including free services for individuals.
The FCA recently consulted on proposals for its regulatory regime and expects to finalise its rules in the second half of the year. Everyday banking services can also be accessed by telephone banking and via the Post Office or banking hubs. The Post Office allows personal and business customers to carry out everyday banking services at 11,500 Post Office branches across the UK, and banking hubs are a shared initiative that enables customers of participating banks to access cash and banking services in shared facilities.
The issue of local authorities was also raised. Digital inclusion interventions are included in a UK shared prosperity fund prospectus. That has allowed local authorities to allocate funding to digital inclusion interventions. That is because we know from key stakeholders that digital inclusion interventions work best when they are tailored to local needs and when support is provided in the community on an ongoing basis. I was surprised to learn of the issues raised by my hon. Friend the Member for Erewash (Maggie Throup), who spoke about the disparity in non-digital access and cost discrimination. I did check, and I know that her Labour-led council are the ones in charge of this matter. I hope they are listening to this, and realise and appreciate that this is a priority for Government and that it should be a priority for them, too.
My hon. Friends the Members for North Devon (Selaine Saxby) and for North Norfolk (Duncan Baker) raised some important points about the switchover from the public switched telephone network. There was a wonderful plug for the all-party parliamentary group that my hon. Friend the Member for North Devon runs, and I am sure that has been heard loud and clear. The fact is that the way that landlines work in the UK is changing. Communication providers, such as BT and Virgin Media, are upgrading their old analogue landline network—also known as the PSTN—to a new digital technology that carries voice calls over an internet connection, which is also known as Digital Voice. The decision to switch off the analogue landline network was made by the telecoms industry, and a transition to Digital Voice networks is an industry-led process, which is expected to conclude in 2025.
However, the Government were made aware of some serious shortcomings in how the telecoms industry managed the PSTN migration. As a result, the Technology Secretary convened a meeting in December 2023 with the UK’s leading telecoms providers to discuss ways to improve the protection of vulnerable households through the migration. In response, the major telecoms providers have now signed a charter committing to concrete measures to protect vulnerable households, particularly those using telecare alarms. That is a positive step, which we hope will ensure that safety continues to be at the heart of the nationwide switchover.
Let me turn to next steps. Digital inclusion permeates every aspect of policy. I view it as part of a cross-Government agenda to integrate digital inclusion into all policy decisions, rather than a stand-alone issue. My hon. Friend the Member for St Ives (Derek Thomas) mentioned the cross-Whitehall ministerial group for loneliness; I can assure him that I attended a meeting last week. I chair the group on digital inclusion, and I will be addressing some of the issues that have been raised there. All Departments are considering the needs of people who are digitally excluded in their policymaking.
The ministerial group on digital inclusion first met in September. It discussed issues such as parking payments, website accessibility and device donation schemes. I am looking forward to hearing updates on those areas from my ministerial colleagues at our next meeting in three weeks’ time. Since our last discussion, the Department for Transport, which leads on the national parking platform, has already said that it expects the full features of the NPP to be available from late 2024, making parking simpler and less stressful. The group also agreed to undertake a departmental mapping exercise and to review the viability of each Department joining donation schemes. This work is an important step forward in our joint efforts to tackle digital inclusion, and I look forward to building on these conversations.
In closing, I again thank the hon. Member for Ellesmere Port and Neston for raising such an important issue. I am hopeful that we can work together. We are working hard on this issue across Government and we have made some credible steps to tackle it. As the digital transformation picks up pace, we know that there is more to do to ensure that no one is left behind in our digital age, but we are already rising to that challenge. Departments forming the cross-Whitehall ministerial group will work hand in hand across Government, as well as with industry and our partners in the third sector, to deliver the benefits of a better digital future for communities all over the country.
(9 months ago)
Written Statements
In July 2023, the Government launched a consultation in relation to internet domain name registries and domain name abuse. This consultation asked for views on the Government’s proposals for regulations defining prescribed practices and requirements, which are to be introduced following sections 19 to 21 of the Digital Economy Act 2010 coming into force. Specifically, the consultation asked for views from the relevant parties on the draft list of misuses and unfair uses of domain names in scope, and proposed principles which will underpin the prescribed dispute resolution procedure.
It is important we undertake this work to ensure that the UK will continue to meet international best practice on governance of country code top level domains in line with our key global trading partners and our future global trading commitments.
As outlined in a previous statement of 20 July 2023, the DEA 2010 sets out the Secretary of State for Science, Innovation and Technology’s powers of intervention in the event that any UK-related domain name registry fails to address serious, relevant abuses of its domain names, posing significant risk to the UK’s electronic communications networks and their users.
We received 39 responses to the consultation, which closed in August 2023. In November 2023, the Government published a summary of the responses received and have since been analysing the responses, consulting with technical and industry experts to develop our policy response.
We have today published the Government policy response to the consultation. A copy of both this document and the summary of responses will be placed in the Libraries of both Houses and published on gov.uk.
We will now set out in secondary legislation the list of misuses and unfair uses of domain names that registries in scope must take action to mitigate and deal with, alongside the registry’s arrangements for dealing with complaints in connection with the domain names in scope. This will provide additional certainty for UK users that appropriate procedures will continue to be in place to help address abuse of UK-related domain names.
[HCWS276]
(10 months ago)
Commons Chamber
First, let me put on the record how pleased I was to see my hon. Friend the Member for Watford (Dean Russell) back in his place, having heard about his health issues. I say that not just because his parents are constituents of mine or because he was born and brought up in my constituency, but because he is a dear friend of mine.
I thank my hon. Friend for securing this debate and raising the important issue of AI scams and the use of AI to defraud or manipulate people. I assure him that the Government take the issue very seriously. Technology is a fast-moving landscape and the pace of recent developments in artificial intelligence exemplifies the challenge with which we are presented when it comes to protecting our society.
I will start by being very clear: safely deployed, AI will bring great benefits and promises to revolutionise our economy, society and everyday lives. That includes benefits for fraud prevention, on which we are working closely with the Home Office and other Departments across Government. Properly used, AI can and does form the heart of systems that manage risk, detect suspect activity and prevent millions of scam texts from reaching potential victims. However, as my hon. Friend rightly identified, AI also brings challenges. To reap the huge social and economic benefits of AI, we must manage the risk that it presents. To do so, and thereby maintain public trust in these technologies, is key to effectively developing, deploying and adopting AI.
In the long term, AI provides the means to enhance and upscale the ability of criminals to defraud. Lone individuals could be enabled to operate like an organised crime gang, conducting sophisticated, personalised fraud operations at scale, and my hon. Friend spoke eloquently about some of the risks of AI. The Government have taken a technology-neutral approach. The Online Safety Act 2023 will provide significant protections from online fraud, including where AI has been used to perpetrate a scam. More broadly, on the services it regulates, the Act will regulate AI-generated content in much the same way that it regulates content created by humans.
Under the Online Safety Act, all regulated services will be required to take proactive action to tackle fraud facilitated through user-generated content. I am conscious that my hon. Friend may have introduced a new phrase into the lexicon when he spoke of AI-assisted criminals. I am confident that the Online Safety Act will be key to tackling fraud when users share AI-generated content with other users. In addition, the Act will mandate an additional duty for the largest and most popular platforms to prevent fraudulent paid-for advertising appearing on their services. This represents a major step forward in ensuring that internet users are protected from scams.
The Government are taking broader action on fraud, beyond the Online Safety Act. In May 2023, the Home Office published a fraud strategy to address the threat of fraud. The strategy sets out an ambitious and radical plan for how the Government, law enforcement, regulators, industry and charities will work together to tackle fraud.
On the points raised by the hon. Member for Strangford (Jim Shannon), the Government are working with industry to remove the vulnerabilities that fraudsters exploit, with intelligence agencies to shut down fraudulent infrastructure, and with law enforcement to identify and bring the most harmful offenders to justice. We are also working with all our partners to ensure that the public have the advice and support that they need.
The fraud strategy set an ambitious target to cut fraud by 10% from 2019 levels, down to 3.3 million fraud incidents by the end of this Parliament. Crime survey data shows that we are currently at this target level, but we are not complacent and we continue to take action to drive down fraud. Our £100 million investment in law enforcement and the launch of a new national fraud squad will help to catch more fraudsters. We are working with industry to block fraud, including by stopping fraudsters exploiting calls and texts to target victims. We have already blocked more than 870 million scam texts from reaching the public, and the strategy will enable us to go much further.
Social media companies should carefully consider the legality of different types of data scraping and implement measures to protect against unlawful data scraping. They also have data protection obligations concerning third-party scraping from their websites, which we are strengthening in the Data Protection and Digital Information Bill. That Bill will hit rogue firms that hound people with nuisance calls with tougher fines. The maximum fine is currently £500,000; under the Bill, it will rise to 4% of global turnover or £17.5 million, whichever is greater, to better tackle rogue activities and punish those who pester people with unwanted calls and messages.
I thank the Minister for a comprehensive and detailed response to the hon. Member for Watford; it is quite encouraging. My intervention focused on the elderly and vulnerable—what can be done for those who fall specifically into that category?
It is a great honour to be intervened on by the hon. Gentleman, who makes an important point. The Government will be doing more awareness raising, which will be key. I am willing to work with the hon. Gentleman to ensure that we make progress, because it is a key target that we must achieve.
Consumers are further protected by the Privacy and Electronic Communications (EC Directive) Regulations 2003, which govern the rules for direct marketing by electronic means. Under these regulations, organisations must not send marketing texts, phone calls or emails to individuals without their specific prior consent. We are also strengthening these regulations, so that anyone making unwanted marketing calls can be fined if the calls could cause harm or disturbance to individuals, even if the calls go unanswered by victims.
Beyond legislation, the Home Office and the Prime Minister’s anti-fraud champion worked with leading online service providers to create an online fraud charter. The charter, which was launched in November last year, sets out voluntary commitments from some of the largest tech firms in the world to reduce fraud on their platforms and services and to raise best practice across the sector.
This includes commitments to improve the blocking of fraud at source, to make reporting fraud easier for users and to be more responsive in removing content and ads found to be fraudulent. The charter will also improve intelligence sharing and better educate users about the risks on platforms and services, in response to the point made by the hon. Member for Strangford.
Public awareness is a key defence against all fraud, whether or not it is AI-enabled. As set out in the fraud strategy, we have been working with leading counter-fraud experts and wider industry to develop an eye-catching public communications campaign, which we anticipate going live next month. This will streamline fraud communications and help people to spot fraud and take action to avoid it.
None the less, it is important to remember that this work is taking place in a wider context. The UK is leading the way in ensuring that AI is developed in a responsible and safe way, allowing UK citizens to reap the benefits of this new technology while being protected from its harms. In March last year, we published the AI regulation White Paper, which sets out principles for the responsible development of AI in the UK. These principles, such as safety and accountability, are at the heart of our approach to ensuring the responsible development and use of AI.
The UK Government showed international leadership in this space when we hosted the world’s first major AI safety summit last year at Bletchley Park. This was a landmark event where we brought together a globally representative group of world leaders, businesses, academia and civil society to unite for crucial talks to explore and build consensus on collective international action, which would promote safety at the frontier of AI.
We recognise the concerns around AI models generating large volumes of content that is indistinguishable from human-generated pictures, voice recordings or videos. Enabling users and institutions to determine what media is real is a key part of tackling a wide range of AI risks, including fraud. My hon. Friend has brought forward the idea of labelling to make it clear when AI is used. The Government have a strong track record of supporting private sector innovation, including in this field. Innovations from the safety tech sector will play a central role in providing the technologies that online companies need to protect their users from harm and to shape a safer internet.
Beyond that, Government support measures provide a blueprint for supporting other solutions to keep users safe, such as championing research into the art of the possible, including via the annual UK Safety Tech sectoral analysis report, and driving innovative solutions via challenge funds in partnership with GCHQ and the Home Office.
DSIT has already published best practices relating to AI identifiers, which can aid the identification of AI-generated content, in the “Emerging processes for frontier AI safety” document, which was published ahead of the AI safety summit. In the light of that, DSIT continues to investigate the potential for detecting and labelling AI-generated content. That includes assessing both the technical evidence on the feasibility of such detection and the levers that we have as policymakers to ensure that it is deployed in a beneficial way. More broadly, last year the Government announced £100 million to set up an expert taskforce to help the UK to adopt the next generation of safe AI—the very first of its kind. The taskforce has now become the AI Safety Institute, which is convening a new global network and facilitating collaboration across international partners, industry and civil society. The AI Safety Institute is engaging with leading AI companies that are collaboratively sharing access to their AI models for vital safety research.
We are making the UK the global centre of AI safety—a place where companies at the frontier know that the guardrails are in place for them to seize all the benefits of AI while mitigating the risks. As a result, the UK remains at the forefront of developing cutting-edge technologies to detect and mitigate online harms. UK firms already have a 25% market share in global safety tech sectors. AI creates new risks, but as I have set out it also has the potential to super-charge our response to tackling fraud and to make our everyday lives better. The Government are taking action across a range of areas to ensure that we manage the risks and capitalise on the benefits of these new technologies. I thank all Members who have spoken in the debate, and I again thank my hon. Friend the Member for Watford for introducing this debate on AI scams, which I assure him, and the House, are a Government priority.
Question put and agreed to.
(10 months, 1 week ago)
Westminster Hall
I am conscious of time and of the broad range of this debate, but I will try to address as many issues as possible. I commend my hon. Friend the Member for Weston-super-Mare (John Penrose) for securing this important debate on preventing misinformation and disinformation in online filter bubbles, and for all his campaigning on the subject throughout the passage of the Online Safety Act. He has engaged with me particularly in the run-up to today’s well-informed debate, for which I thank hon. Members across the Chamber.
May I echo the sentiments expressed towards my hon. Friend the Member for Brigg and Goole (Andrew Percy)? I thank him for sharing his reflections. I was not going to say this today, but after the ceasefire vote I myself have faced a number of threats and a lot of abuse, so I have some personal reflections on the issue as well. I put on the record my invitation to Members across the House to share their experiences. I certainly will not hesitate to deal with social media companies where I see that they must do more. I know anecdotally, from speaking to colleagues, that it is so much worse for female Members. Across the House, we will not be intimidated in how we vote and how we behave, but clearly we are ever vigilant of the risk.
Since the crisis began, the Technology Secretary and I have already met with the large social media platforms X, TikTok, Meta, Snap and YouTube. My predecessor—my hon. Friend the Member for Sutton and Cheam (Paul Scully)—and the Technology Secretary also held a roundtable with groups from the Jewish community such as the Antisemitism Policy Trust. They also met Tell MAMA to discuss anti-Muslim hatred, which has been on the rise. I will not hesitate to reconvene those groups; I want to put that clearly on the record.
It is evident that more and more people are getting their news through social media platforms, which use algorithms. Through that technology, platform services can automatically select and promote content for many millions of users, tailored to them individually following automated analysis of their viewing habits. Many contributors to the debate have argued that the practice creates filter bubbles, where social media users’ initial biases are constantly reaffirmed with no counterbalance.
The practice can drive people to adopt extreme and divisive political viewpoints. This is a hugely complex area, not least because the creation of nudge factors in these echo chambers raises not so much the question of truth as the question of how we can protect the free exchange of ideas and the democratisation of speech, of which the internet and social media have often been great drivers. There is obviously a balance to be achieved.
I did not know that you are a Man City fan, Sir Mark. I am a Manchester United fan. My hon. Friend the Member for Weston-super-Mare talked about fish tackle videos; as a tortured Manchester United fan, I get lots of videos from when times were good. I certainly hope that they return.
The Government are committed to preserving freedom of expression, both online and offline. It is vital that users are able to choose what content they want to view or engage with. At the same time, we agree that online platforms must take responsibility for the harmful effects of the design of their services and business models. Platforms need to prioritise user safety when designing their services to ensure that they are not being used for illegal activity and ensure that children are protected. That is the approach that drove our groundbreaking Online Safety Act.
I will move on to radicalisation, a subject that has come up quite a bit today. I commend my hon. Friend the Member for Folkestone and Hythe (Damian Collins) for his eloquent speech and his description of the journey of the Online Safety Act. Open engagement-driven algorithms have been designed by tech companies to maximise revenue by serving content that will best elicit user engagement. There is increasing evidence that the recommender algorithms amplify extreme material to increase user engagement and de-amplify more moderate speech.
Algorithmic promotion, another piece of online architecture, automatically nudges the user towards certain online choices. Many popular social media platforms use recommender algorithms, such as YouTube’s filter bubble. Critics argue that they present the user with overly homogeneous content based on interests, ideas and beliefs, creating extremist and terrorist echo chambers or rabbit holes. There are a multitude of features online that intensify and support the creation of those echo chambers, from closed or selective chat groups to unmoderated forums.
Research shows that individuals convicted of terrorist offences rarely seek opposing information that challenges their beliefs. Without diverse views, online discussion groups grow increasingly partisan, personalised and compartmentalised. The polarisation of online debates can lead to an environment that is much more permissive of extremist views. That is why the Online Safety Act, which received Royal Assent at the end of October, focuses on safety by design. We are in the implementation phase, which comes under my remit; we await further evidence from the data that implementation will produce.
Under the new regulation, social media platforms will need to assess the risk of their services facilitating illegal content and activity such as illegal abuse, harassment or stirring up hatred. They will also need to assess the risk of children being harmed on their services by content that does not cross the threshold of illegality but is harmful to them, such as content that promotes suicide, self-harm or eating disorders.
Platforms will then need to take steps to mitigate the identified risks. Ofcom, the new online safety regulator, will set out in codes of practice the steps that providers can take to mitigate particular risks. The new safety duties apply across all areas of a service, including the way in which it is designed, used and operated. If aspects of a service’s design, such as the use of algorithms, exacerbate the risk that users will carry out illegal activity such as illegal abuse or harassment, the new duties could apply. Ofcom will set out the steps that providers can take to make their algorithms safer.
I am conscious of time, so I will move on to the responsibility around extremism. Beyond the duties to make their services safe by design and reduce risk in that way, the new regulation gives providers duties to implement systems and processes for filtering out and moderating content that could drive extremism. For example, under their illegal content duty, social media providers will need to put systems in place to seek out and remove content that encourages terrorism. They will need to do the same for abusive content that could incite hatred on the basis of characteristics such as race, religion or sexual orientation. They will also need to remove content in the form of state-sponsored or state-linked disinformation aimed at interfering with matters such as UK elections and political decision making, or other false information that is intended to cause harm.
Elections have come up quite a bit in this debate. The defending democracy taskforce, which has been instituted to protect our democracy, is meeting regularly and regular discussions are going on; it is cross-nation and cross-Government, and we certainly hope to share more information in the coming months. We absolutely recognise the responsibilities of Government to deal with the issue and the risks that arise from misinformation around elections. We are not shying away from this; we are leading on it across Government.
The idea put forward by my hon. Friend the Member for Weston-super-Mare has certainly been debated. He has spoken to me about it before, and I welcome the opportunity to have this debate. He was right to say that this is the start of the conversation—I accept that—and right to say that he may not yet have the right answer, but I am certainly open to further discussions with him to see whether there are avenues that we could look at.
I am very confident that the Online Safety Act, through its insistence on social media companies dealing with the issue and on holding social media companies to account on their terms and conditions, will be a vital factor. My focus will absolutely be on the implementation of the Act, because we know that that will go quite a long way.
We have given Ofcom, the new independent regulator, the power to require providers to change their algorithms and their service design where necessary to reduce the risk of users carrying out illegal activity or the risk of children being harmed. In overseeing the new framework, Ofcom will need to carry out its duties in a way that protects freedom of expression. We have also created a range of new transparency and freedom-of-expression duties for the major social media platforms; these will safeguard pluralism in public debate and give users more certainty about what they can expect online. As I have said, the Government take the issue incredibly seriously and will not hesitate to hold social media companies to account.
(10 months, 2 weeks ago)
General CommitteesI beg to move,
That the Committee has considered the draft Online Safety (List of Overseas Regulators) Regulations 2024.
It is a pleasure to serve under your chairmanship, Mr Betts. I put on the record my gratitude to hon. Members for their campaigning and collaboration throughout the passage of the Online Safety Act 2023 and their contribution to making the UK the safest place in the world to be online. The Government are working at pace to ensure that the Act is fully operational as quickly as possible. I am therefore pleased to debate this statutory instrument, which was laid before the House in draft on 28 November last year.
The draft instrument is one of several that will enable Ofcom’s implementation of the Act. It concerns Ofcom’s co-operation with and disclosure of information to overseas online safety regulators under section 114 of the Act. Given the global nature of the regulated service providers, it is vital that Ofcom can co-operate and share information with its regulatory counterparts in other jurisdictions to support co-ordinated international online safety regulation.
In certain circumstances, it may be appropriate for Ofcom to support overseas regulators in carrying out their regulatory functions. For example, it may be beneficial for Ofcom to share information that it holds to inform supervisory activity or an investigation being carried out by an overseas regulator. That could support successful enforcement action, which in turn could have direct or indirect benefits for UK users such as preventing malign actors from disseminating illegal content on regulated services.
International collaboration will also make online safety regulation more efficient. In carrying out regulatory oversight activity, Ofcom and its international counterparts will be able to gather extensive information about regulated service providers. In some instances, it may be more efficient for regulators to share information directly, where that information has already been collected by a counterpart regulator. International regulatory co-operation and co-ordination are likely to reduce the regulatory burden on both international regulators and regulated service providers.
Section 114 of the Act builds on the existing information gateways available to Ofcom under the Communications Act 2003 by permitting Ofcom to co-operate with an overseas regulator for specified purposes. It includes powers to disclose online safety information to a regulator
“for the purposes of…facilitating the exercise by the overseas regulator of any of that regulator’s online regulatory functions, or…criminal investigations or proceedings relating to a matter to which the overseas regulator’s online regulatory functions relate.”
The information gateway addresses a small legislative gap, because in the absence of section 114, Ofcom could not share information for those specified purposes. Under section 1(3) of the Communications Act, Ofcom can share information only where it is
“incidental or conducive to the carrying out”
of its functions, subject to the general restrictions on the disclosure of information under section 393 of that Act.
The draft regulations designate the overseas regulators with which Ofcom can co-operate and share information under section 114 of the Online Safety Act. It is important to note that Ofcom will retain discretion over whether to co-operate and share information with the overseas regulators specified. The regulations designate the following overseas regulators: Arcom in France, the Netherlands Authority for Consumers and Markets, the Federal Network Agency in Germany, the Media Commission in Ireland, the eSafety Commissioner in Australia, and the European Commission.
In compiling the list of specified overseas regulators, the Department has consulted Ofcom and carefully considered its operational needs and existing relationships with overseas regulators. That will mean that the designated regulators are those with which Ofcom will be able to share information in an efficient and mutually beneficial manner. We have also considered whether the overseas regulator is a designated regulator of a bespoke online safety regulatory framework, ensuring that any information sharing is proportionate.
Another important consideration is the protection of fundamental freedoms online. For that reason, we have considered whether the autonomy of the regulator is protected in law and whether the overseas regulator and the jurisdiction that empowers it uphold international human rights.
Ofcom is an organisation experienced in handling confidential and sensitive information obtained from the services that it regulates, and there are strong legislative safeguards and limitations on the disclosure of such material. Overseas regulators that receive any information from Ofcom may use it only for the purpose for which it is disclosed. They may not use it for another purpose, or further disclose it, without express permission from Ofcom, unless ordered by a court or tribunal. Ofcom must also comply with UK data protection law, and would need to show that the processing of any personal data was necessary for a lawful purpose.
There are six bodies on the list. Is it likely that the bodies listed will change, given that the world is rather a dynamic place? It seems quite a short list at the moment.
I can confirm that we will continually review the list and update it as appropriate, in consultation with Ofcom.
As a public body, Ofcom is required to act compatibly with the right to privacy under article 8 of the European convention on human rights. As I said to my hon. Friend, we will continue to review the list of designated regulators, particularly as new online safety regimes are developed and operationalised around the world. I commend the draft regulations to the Committee and open the matter for debate.
I thank hon. Members across the Committee for their contributions. I am grateful for this opportunity to debate the list of overseas regulators under the Online Safety Act. It is vital that we recognise the global nature of regulated services and regulated service providers under the Act. The draft regulations will ensure that Ofcom can co-operate and share online safety information with specified overseas regulators where it is appropriate to do so. As I have set out, we will review whether it is desirable and appropriate to add further overseas regulators to the list on an ongoing basis, particularly as new online safety regimes are developed and operationalised around the world.
May I put on the record a special thank you to the hon. Member for Rotherham for her contribution? I have followed her work since I have been in Parliament, and I know she is a champion in protecting children, especially in the online sphere. I would welcome the opportunity to work with her, and she raised a very interesting point. As I say, we will continue to review the list of regulators. I am certainly happy to have that conversation.
I also give special thanks to the hon. Member for Leeds East for sharing his constituent’s story. The intention has always been for this legislation to make the online world the safest place possible, especially in the UK, and international collaboration is key to that. My door remains open if there is anything further that he would like to discuss. Once again, I commend the draft regulations to the Committee.
Question put and agreed to.
(1 year ago)
Commons ChamberI beg to move, That the clause be read a Second time.
With this it will be convenient to discuss:
Government new clause 6.
New clause 23—Digital Markets Unit and CMA: annual statement to House of Commons—
“(1) The Secretary of State must, once a year, make a written statement to the House of Commons giving the Secretary of State’s assessment of the conduct and operation of—
(a) the Digital Markets Unit, and
(b) the CMA as a whole.
(2) The first statement must be made by 1 February 2024.
(3) A further statement must be made by 1 February each subsequent year.”
This new clause would require the Secretary of State to make a written statement about the conduct and operation of the DMU and CMA.
New clause 27—Appointment of senior director of the DMU—
“The senior director of the Digital Markets Unit must be appointed by the Secretary of State.”
This new clause provides that the senior director of the DMU must be appointed by the Secretary of State.
New clause 28—Duty of the CMA: Citizens interest provisions—
“(1) The Enterprise and Regulatory Reform Act 2013 is amended as follows.
(2) After section 25(3) insert—
“(3A) When carrying out its functions in relation to the regulation of competition in digital markets under Part 1 of the Digital Markets, Competition and Consumers Act 2024, the CMA must seek to promote competition, both within and outside the United Kingdom, for the benefit of consumers and citizens.””
This new clause would give the CMA a duty to further the interests of citizens – as well as consumers – when carrying out its digital markets functions under Part 1 of the Bill.
Amendment 176, in clause 2, page 2, leave out lines 20 and 21 and insert—
“(b) distinctive digital characteristics giving rise to competition law concerns such that the undertaking has a position of strategic significance (see section 6).”
This amendment is linked to Amendment 182.
Amendment 206, page 2, line 25, after “Chapter” insert “, taking account of analysis undertaken by the CMA on similar issues that have been the subject of public consultation.”
This amendment aims to ensure that the CMA are able to draw on previous analysis on issues relevant to the regulatory regime.
Amendment 177, page 2, line 25, at end insert—
“(5) The CMA must publish terms of reference setting out a summary of the evidence base for making a finding of substantial and entrenched market power or of a position of strategic significance.
(6) The terms of reference must include a detailed statement of the competition law concerns arising from these characteristics and the relationship between the designated digital activity and other activities.
(7) Activities with no reasonable prospect of adverse competitive effects linked to digital activity must be referred to as unrelated activities and the terms of reference must expressly state that unrelated activities are not covered by the designation.”
This amendment would require the CMA to publish terms of reference summarising the evidence base for a finding of substantial and entrenched market power or a finding of strategic significance.
Amendment 178, in clause 3, page 2, line 28, after “service” insert “predominantly”
This amendment clarifies that the provision of a service predominantly by means of the internet would be a digital activity.
Amendment 179, page 2, line 34, leave out subsection (2)
This amendment is linked to Amendment 178.
Amendment 180, in clause 5, page 3, line 28, at end insert—
“(c) are not assuaged by evidence of competition arising beyond the activities of the undertaking, and
(d) demonstrate that the perceived market power will be improved compared with the scenario in which the designation does not occur.”
This amendment makes additions to the definition of substantial and entrenched market power.
Amendment 181, in clause 6, page 3, line 31, leave out “one or more of” and insert “both”
This amendment is linked to Amendment 182.
Amendment 182, page 3, line 33, leave out paragraphs (a) to (d) and insert—
“(a) significant network effects are present;
(b) the undertaking’s position in respect of the digital activity would allow it to extend its market power.”
This amendment changes the definition of the term “position of strategic significance”.
Amendment 183, in clause 7, page 4, line 17, at end insert “arising from the designated activities”
This amendment limits the turnover condition in relation to UK turnover to turnover arising from designated activities.
Amendment 184, page 4, line 19, at end insert “to account for inflation on the CPI measure”
This amendment ensures that the sums used to determine whether the turnover condition has been met can only be amended to account for inflation on the CPI measure.
Amendment 194, in clause 11, page 6, line 36, at end insert—
“(c) give a copy of the statement to those undertakings that have not been designated as having SMS that are most directly affected.”
This amendment ensures that challenger firms are able to access information about the regulatory framework on an equal basis to designated firms.
Amendment 195, in clause 12, page 7, line 9, at end insert—
“(5) As soon as reasonably practicable after giving a notice under subsection (2), the CMA must give a copy of the notice to those undertakings that have not been designated as having SMS that are most directly affected.”
See the explanatory statement to Amendment 194.
Amendment 196, in clause 14, page 7, line 36, at end insert—
“(5A) As soon as reasonably practicable after giving an SMS decision notice, the CMA must give a copy of the notice to those undertakings that have not been designated as having SMS that are most directly affected.”
See the explanatory statement to Amendment 194.
Government amendments 2 and 3.
Amendment 197, in clause 15, page 8, line 41, at end insert—
“(6) As soon as reasonably practicable after giving a revised SMS decision notice, the CMA must give a copy of the revised notice to those undertakings that have not been designated as having SMS that are most directly affected.”
See the explanatory statement to Amendment 194.
Government amendments 4 to 7.
Amendment 193, in clause 19, page 11, line 15, at end insert—
“(9A) A conduct requirement must be imposed within 3 months of an undertaking being designated as having SMS under section 2.”
This amendment ensures that a time frame of three months is imposed for the CMA to enforce conduct requirements on designated SMS firms.
Government amendment 8.
Amendment 190, in clause 20, page 12, line 9, after “to”, insert “harm competition in the relevant digital activity or the other activity,”
This amendment would ensure that the CMA can tackle anti-competitive conduct in a non-designated activity, provided that the anti-competitive conduct is related to a designated activity.
Amendment 191, page 12, line 11, after “activity”, insert “, provided that the conduct is related to the relevant digital activity”
See the explanatory statement to Amendment 190.
Government amendments 9 and 10.
Amendment 192, in clause 25, page 14, line 7, at end insert—
“(e) whether to take action in accordance with Chapter 4 (Pro-competitive interventions) in respect of the extent to which it is complying with each conduct requirement to which it is subject and the effectiveness of each conduct requirement to which it is subject.”
This amendment would ensure that the CMA considers the efficacy of existing Conduct Requirements when considering whether to make Pro-Competitive Interventions.
Government amendments 11 and 12.
Amendment 198, in clause 26, page 15, line 3, at end insert—
“(7) As soon as reasonably practicable after giving a conduct investigation notice, the CMA must give a copy of the conduct investigation notice to those undertakings that have not been designated as having SMS that are most directly affected.”
See the explanatory statement to Amendment 194.
Amendment 187, in clause 27, page 15, line 8, at end insert—
“(2) The CMA may have regard to any significant benefits to users or potential users that the CMA considers have resulted, or may be expected to result, from a factor or combination of factors resulting from a breach of a conduct requirement.”
This amendment would ensure that the CMA considers any significant benefits to users resulting from the breach of a Conduct Requirement when it is considering representations from designated undertakings as part of a Conduct Investigation.
Amendment 199, in clause 28, page 15, line 20, at end insert—
“(5) As soon as reasonably practicable after giving a notice under subsection (2), the CMA must give a copy of the notice to those undertakings that have not been designated as having SMS that are most directly affected.”
See the explanatory statement to Amendment 194.
Amendment 188, page 15, line 21, leave out Clause 29.
This Amendment is consequential to Amendment 187.
Government amendment 13.
Amendment 186, in clause 29, page 15, line 31, leave out subsection (c) and insert—
“(c) the conduct is necessary for the realisation of those benefits based on the best available evidence reasonably obtainable, and”
This amendment would change the circumstances in which the countervailing benefits exemption would apply.
Government amendment 14.
Amendment 209, page 15, line 37, at end insert—
“(4) The CMA may only consider that the countervailing benefits exemption applies if it has reached such a consideration within six months of the day on which the conduct investigation notice is given to the undertaking.
(5) In subsection (2), a “benefit” means any benefit of a type set out in regulations made by the Secretary of State in accordance with the procedure under subsections (6) to (9).
“(6) The Secretary of State must, within six months of this section coming into force, lay before Parliament draft regulations setting out the types of benefit that apply for the purposes of subsection (2).
(7) A Minister of the Crown must make a motion in each House of Parliament to approve the draft regulations within 14 days of the date on which they were laid.
(8) Subject to subsection (9), if the draft regulations are approved by both Houses of Parliament, the Secretary of State must make them in the form of the draft which has been approved.
(9) If any amendments to the draft regulations are agreed to by both Houses of Parliament, the Secretary of State must make the regulations in the form of the draft as so amended.”
This amendment would introduce a 6 month time limit on the duration of investigations into countervailing benefits claims, and specifies that the Secretary of State shall introduce further legislation for Parliamentary debate providing an exhaustive list of the types of countervailing benefits SMS firms are able to claim.
Amendment 200, in clause 30, page 16, line 13, at end insert—
“(4A) As soon as reasonably practicable after giving the notice, the CMA must give a copy of the notice to those undertakings that have not been designated as having SMS that are most directly affected.”
See the explanatory statement to Amendment 194.
Government amendments 15 and 16.
Amendment 201, in clause 31, page 17, line 3, at end insert—
“(7A) As soon as reasonably practicable after making an enforcement order (including a revised version of an order), the CMA must give a copy of the order to those undertakings that have not been designated as having SMS that are most directly affected.”
See the explanatory statement to Amendment 194.
Amendment 202, in clause 32, page 17, line 35, at end insert—
“(6A) As soon as reasonably practicable after giving a notice under subsection (5), the CMA must give a copy of the notice to those undertakings that have not been designated as having SMS that are most directly affected.”
See the explanatory statement to Amendment 194.
Amendment 203, in clause 34, page 18, line 36, at end insert—
“(4A) As soon as reasonably practicable after revoking an enforcement order, the CMA must give a copy of the notice to those undertakings that have not been designated as having SMS that are most directly affected.”
See the explanatory statement to Amendment 194.
Government amendments 17 and 18.
Amendment 189, in clause 38, page 21, line 7, leave out “breached an enforcement order, other than an interim enforcement order” and insert “breached a conduct requirement”
This amendment would allow the CMA to initiate the Final Offer Mechanism after a Conduct Requirement of the type permitted by clause 20(2)(a) has first been breached, provided that the other conditions in clause 38 are met.
Government amendments 19 to 30.
Amendment 204, in clause 47, page 26, line 8, at end insert—
“(4A) As soon as reasonably practicable after giving a PCI investigation notice or a revised version of the PCI investigation notice, the CMA must give a copy of the notice to those undertakings that have not been designated as having SMS that are most directly affected.”
See the explanatory statement to Amendment 194.
Amendment 205, in clause 50, page 27, line 28, at end insert—
“(6A) As soon as reasonably practicable after making a pro-competition order, the CMA must give a copy of the order to those undertakings that have not been designated as having SMS that are most directly affected.”
See the explanatory statement to Amendment 194.
Government amendments 31 to 56.
Amendment 185, in clause 102, page 61, line 10, leave out subsections (6) and (7) and insert—
“(6) In determining an application under this section—
(a) for any application made within a period of three years beginning on the day on which this Act is passed, the Tribunal must determine the application on the merits by reference to the grounds set out in the application;
(b) for any application made thereafter, the Tribunal must apply the same principles as would be applied—
(i) in the case of proceedings in England and Wales and Northern Ireland, by the High Court in determining proceedings on judicial review; and
(ii) in the case of proceedings in Scotland, by the Court of Session on an application to the supervisory jurisdiction of the court.
(7) The Tribunal may—
(a) for any application made within a period of three years beginning on the day on which this Act is passed, confirm or set aside the decision which is the subject of the application, or any part of it, and may—
(i) remit the matter to the CMA,
(ii) take other such steps as the CMA could itself have given or taken, or
(iii) make any other decision which the CMA could itself have made;
(b) for any application made thereafter—
“(i) dismiss the application or quash the whole or part of the decision to which it relates, and
(ii) where it quashes the whole or part of that decision, refer the matter back to the CMA with a direction to reconsider and make a new decision in accordance with a ruling of the Tribunal.”
This amendment changes for a three-year period the mechanism by which the Tribunal would determine applications for review.
Government amendments 57 to 67, 83 and 84, 106, 108, 111, 148 and 149.
I am honoured to have been appointed as the Minister with responsibility for tech and the digital economy, and as one of the Ministers with responsibility for the Digital Markets, Competition and Consumers Bill. When I was appointed last Tuesday, many helpful colleagues came up to me to say, “You have been thrown in at the deep end,” but it is a blessing to have responsibility for taking this legislation through the House.
In that vein, I thank my hon. Friend the Member for Sutton and Cheam (Paul Scully) for his tireless work to get the Bill to this stage.
I am aware of the importance of this legislation and the sentiment across the House to deliver the Bill quickly. The benefits of the digital market measures in part 1 of the Bill are clear to see. They will bring about a more dynamic digital economy, which prioritises innovation, growth and the delivery of better outcomes for consumers and small businesses. The rise of digital technologies has been transformative, delivering huge value to consumers and businesses. However, a small number of firms exert immense control across strategically critical services online because the unique characteristics of digital markets, such as network effects and data consolidation, make them prone to tip in favour of a few firms. The new digital markets regime will remove obstacles to competition and drive growth in digital markets, by proactively driving more dynamic markets and by preventing harmful practices such as making it difficult to switch between operating systems.
I turn now to the Government amendments. When the Under-Secretary of State for Business and Trade, my hon. Friend the Member for Thirsk and Malton (Kevin Hollinrake), first stood in the House, he stated that the legislation would unleash the full opportunities of digital markets for the UK. That intention has not changed, and our amendments fully support that. The Government’s amendments to part 1 will provide greater clarity to parties interacting with the regime, enhance the accountability of the regulator and make sure that the legislation is drafted effectively and meets its aims. I will address each of those themes in order.
This new regime is novel. To maximise certainty, it is critical that its parameters—the scope of the regulator’s functions and the rights and obligations set out in the legislation—are clear. Therefore, the Government have tabled a series of amendments to further clarify how the digital markets regime will work in practice. The amendments relate to how legally binding commitments provided by firms within the scope of the regime will work in practice, the Digital Markets Unit’s ability to amend certain decision notices, and how in certain circumstances the DMU may use its investigatory and enforcement powers after a firm is no longer designated.
Two important sets of clarifying amendments are worth covering in more detail. The first relates to conduct requirements. Consumer benefit is a central focus of the digital markets regime. The DMU must consider consumer benefit when shaping the design of its interventions. To reinforce that central focus, we are clarifying how the DMU will consider consumer benefits when imposing and enforcing conduct requirements. Amendment 7 requires the DMU to explain the consumer benefits that it expects to result from a conduct requirement, ensuring transparent, well-evidenced decisions. Amendments 13 and 14 simplify the wording of the countervailing benefits exemption, while critically maintaining the same high threshold.
I draw the House’s attention to my entry in the Register of Members’ Financial Interests. Let me take the opportunity to congratulate my hon. Friend the Member for Meriden (Saqib Bhatti) on his appointment. Does he recognise that it is important to be clear—and for the CMA and the DMU to be clear—that there could be a conflict between the interests of current consumers and those of future consumers? Therefore, it is important that the interests of both are balanced in what the CMA and the DMU eventually decide to do.
My right hon. Friend makes an important point. As I make progress, I hope he will be reassured that the regime will take both those things into account.
Together, amendments 13 and 14 will make sure that consumers get the best outcomes. Amendment 14 makes an important clarification on the role of third parties in the final offer mechanism process. New clause 5 and related amendments will clarify when and how third parties may make collective submissions in relation to the final offer mechanism. That is vital, as collective bargaining can help to address power imbalances during negotiations. We expect that third parties, especially smaller organisations, may seek to work together when negotiating payment terms and conditions.
My second theme is the accountability of the regulator. The discretion afforded to the CMA and its accountability to Government and Parliament have formed a large part of the debate—quite rightly—during the passage of the Bill. I will take time to address that.
The digital markets regime is flexible in its design, with the CMA requiring a level of discretion to deliver effective outcomes. While that is common for ex ante regulation, that does not negate the importance of taking steps to maximise the predictability and proportionality of the regulator’s actions. For that reason, the Government are introducing an explicit requirement for the CMA to impose conduct requirements and pro-competition interventions only where it considers that it is proportionate to do so.
That will make it clear to firms in scope of the regime that they will not be subject to undue regulatory burdens. Firms will be able to challenge disproportionate obligations, and the Competition Appeal Tribunal will, in its consideration of any appeals, apply the principle of proportionality in a reasonable way, as it always does. To complement that, and to ensure consistent senior oversight and accountability of the regime, amendments 57 to 60 require enforcement decisions, including the imposition of penalties, to be reserved to the CMA board or its committee.
I welcome my hon. Friend to his position, and congratulate him on his role. The Government amendments relate to the proportionality test for conduct requirements. Why did the Government feel that there was a need for those additional tests? Was there a concern that the CMA would use the power disproportionately, and if so, what might such a use have been?
I thank my hon. Friend for his contribution to the House on these matters, and for that question. The aim of the amendments is to provide clarity and give certainty—clarity that we will always ensure that the consumer is at the heart of what we do, and certainty because that is what business always needs. I will happily give further clarity in my closing remarks. To ensure robust oversight of the DMU’s implementation of the regime, we are also requiring that the Secretary of State approve the publication of guidance relating to part 1 of the Bill.
On the issue of clarity, the Minister knows that the final offer mechanism should be an issue of last resort, and before that there should be a mechanism by which negotiations can take place. Can he assure the House that there will be a mechanism to ensure that big tech firms do not drag out negotiations unnecessarily, because it is not clear so far?
The whole mechanism is designed to ensure that smaller firms have a say in this. That is why the final offer mechanism is there. I hope that that gives the hon. Member some reassurance.
Finally, the regime has the potential for significant financial penalties to be imposed, so we have tabled amendments to allow any party subject to a penalty to appeal decisions about the penalty on the merits, rather than on judicial review principles. An appeal on the merits allows the Competition Appeal Tribunal to consider whether it was right to impose the penalty, and to consider the penalty amount. Where appropriate, it also allows the Competition Appeal Tribunal to decide a different penalty amount.
I join the queue of people congratulating the Minister on his new role, which is well deserved. I think that I am right in saying that any appeal against a fine from another economic regulator, such as Ofwat or Ofgem, is made to the CMA on the basis of the JR standard, yet we seem to be creating a different, and arguably more complicated, special deal for large tech platforms. Can he explain the Government’s thinking behind that?
I do not think that there is, as my hon. Friend puts it, a special deal; it is about taking a balanced approach to ensure that firms subject to penalty decisions, which have less direct impact on third parties, have the opportunity to challenge them and take a view on them according to the regime.
The Minister is being very generous. I just want to understand why the approach differs from that taken in identical appeals by other companies against other economic regulators.
Given the huge size of the fines, it is only right that that approach is put in place to ensure the penalties are applied appropriately, but it does not apply to decisions that are not made by the CMA.
The DMU’s other decisions, including the decision as to whether a breach of the regime occurred, would remain subject to an appeal on judicial review principles.
I join in congratulating my hon. Friend on his appointment and on this very wise amendment. It is fundamental to the rule of law that people who are fined large amounts of money have some proper form of appeal; we must not put too much trust in unaccountable and unelected regulators.
My right hon. Friend is always a thoughtful contributor to debates in this House. We believe that the amendments ensure consumer benefit is at the heart of what we are doing and any appeals will be carried out appropriately. Adopting these amendments would bring the digital markets regime into closer alignment with existing CMA mergers and markets regimes, where penalty decisions can be appealed on the merits. As in those regimes, all other decisions are appealable on judicial review principles.
I thank my hon. Friend for giving way again. He will appreciate that we are all trying to get clarity, so we understand what the proposals really mean. In relation to the appeal standard that he describes, for cases that are not specifically related to fines, he mentioned the proportionality addition earlier in his remarks. When it comes to an appeal, are we right to understand that the question of proportionality applies when the CMA originally makes its decision to require an intervention and does not apply to the JR standard that is used to determine an appeal?
It is important to be specific about that, because there are those who would argue that proportionality should be a part of the appeal process. I think the Government amendments say that proportionality applies at an earlier stage and that when it comes to considering whether the CMA has behaved in a proportionate way in making its decisions, the assessment will be made by the Competition Appeal Tribunal on JR principles. Am I right about that?
I agree that that is exactly what we are saying. I am happy to provide further clarity in my closing remarks.
Critical to accountability is, of course, transparency. The Government are committed to transparency and bringing forward amendments that will require the CMA to set out its reasons for imposing or varying a conduct requirement. That will improve transparency around CMA decision making and increase consistency with other powers in the Bill where similar justification is required. It also reinforces the CMA’s existing responsibility to consider likely impacts on consumers when deciding whether and how to intervene.
The third theme is to ensure the legislation is drafted effectively. Therefore, we have tabled further technical amendments to ensure that the Bill’s text meets the Government’s original intended aim. They relate to the scope of conduct requirements, specifically the application of the materiality threshold contained in clause 20(3)(c), the maximum penalty limits imposed on individuals, the mergers reporting duty and the service of notices on undertakings overseas in certain circumstances.
It is worth noting that there are a small number of cross-cutting amendments contained in parts 5 and 6 of the Bill that will also impact the digital markets regime. I want to ensure that there is plenty of time for hon. Members to debate the Bill at this important stage in its passage. I appreciate a collaborative approach from across the House. I am sure that there will be many different views on some of the amendments, but I look forward to a constructive and collaborative discussion.
The Liberal Democrats welcome many aspects of this Bill. We are pleased that the Government are finally acting on the Competition and Markets Authority’s recommendations in bringing forward measures to prevent the tech giants from putting our digital sector in a stranglehold. We want to see a thriving British tech sector in which start-ups can innovate, create good jobs and launch innovative products that will benefit consumers. A strong competition framework that pushes back on the tech giants’ dominance is essential for that.
For too long a small number of big tech firms have been allowed to dominate the market, while smaller, dynamic start-up companies are too often driven out of the market or swallowed up by the tech giants. New rules designed by the CMA will ensure that these large companies will have to refrain from some of their unfair practices, and they give the regulator a power to ensure that the market is open to smaller challenger companies. The Liberal Democrats are pleased to see changes to the competition framework, which will allow the CMA to investigate the takeover of small but promising start-ups that do not meet the usual merger control thresholds. This change is particularly important for sectors such as artificial intelligence and virtual reality while they are in their infancy. The benefits of these changes will filter down to the end users, the consumers, in the form of more choice over products and services, better prices and more innovative start-ups coming to the fore.
While we are glad that most of the CMA’s recommendations are in this Bill, we have concerns about certain aspects, such as the forward-looking designation of SMS firms and the definition of countervailing benefits that SMS firms are able to claim. The countervailing benefits exemption allows the CMA to close an investigation into a conduct breach if an SMS firm can demonstrate that its anti-competitive practices produce benefits for users that outweigh the harms. There is some concern that big tech may seek to exploit this exemption to evade compliance with conduct requirements and continue with unfair, anti-competitive practices. It could also create scope for tech firms to inundate the CMA with an excessive number of claims of countervailing benefits, diverting the CMA’s limited resources away from essential tasks. Amendment 209, tabled in my name, seeks to strengthen the Bill and to curtail the power of large tech firms to evade compliance by tightening the definition in the Bill of what kind of benefits are valid.
The Liberal Democrats also have concerns about several of the Government amendments, particularly those relating to the appeals standard, as they risk watering down some of the CMA’s most powerful tools. There is now a real danger that powerful incumbents will use their vast resources to bog down and delay the process, leaving smaller competitors at a disadvantage. These amendments show that the Government are taking the side of these established firms at the expense of smaller, growing firms, and at the expense of economic growth and innovation as a whole.
The Liberal Democrats are keen to ensure that big tech is prevented from putting the British tech sector in a stranglehold. We hope that the Government will be robust on the defensive measures in the Bill. It is important that they reject any attempt to water down or weaken this Bill with loopholes, and that they ensure there is no ambiguity that could be exploited. Although competition is crucial for Britain’s tech sector, we hope the Government also move to tackle some of the fundamental issues holding it back, such as the skills gap, the shortage of skilled workers and weak investment.
With the leave of the House, I would like to address some of the points that have been made today.
I am grateful to Members across the House for their contributions to this debate and, of course, throughout the development of this legislation. I am similarly grateful for the cross-party support commanded by the digital markets measures. Members will find that I agree with points raised on both sides of the House, and I am confident that this Bill addresses those points.
I thank the hon. Member for Pontypridd (Alex Davies-Jones) for kindly welcoming me to the Treasury Bench, for her amendments and for her commitment to getting this legislation right. She asked about the countervailing benefits exemption, and I reassure her that the wording change maintains the same high threshold. SMS firms must still prove that there is no other reasonable, practical way to achieve the same benefits for consumers with less anti-competitive effect. This makes sure consumers get the best outcomes, whether through the benefits provided or through more competitive markets.
The hon. Lady also asked about appeals, and it is important that decisions made by the CMA can be properly and appropriately reviewed to ensure that they are fair, rigorous and evidence-based. We have considered strong and differing views about appeals from a range of stakeholders, and judicial review principles are the appropriate standard for the majority of decisions under the regime, as we have maintained with the additional clarification on the DMU’s requirement to act proportionately. We have, however, aligned the appeal of penalty decisions with appeals under the Enterprise Act 2002, so that parties can challenge these decisions on their merits to ensure that the value of a penalty is suitable. Penalty decisions have less direct impact on third parties, and the amendment will provide additional reassurance without affecting the regime’s effectiveness.
The significant changes we are making will provide more clarity and assurance to firms on the need for the DMU to act proportionately. They also bring the regime in line with the relevant CMA precedent. Parties will have greater scope to challenge whether the interventions imposed on them are proportionate or could have been achieved in a less burdensome way. When financial penalties are imposed, parties will have access to a full merits review to provide reassurance that the value of the fine is appropriate.
The hon. Lady also asked about the implementation of guidance, and I can assure her that we are working at pace to ensure the regime is operational as soon as possible after Royal Assent. Guidance must be in place for the regime to go live, and the Government will be working with the CMA to ensure timely implementation. The Secretary of State will, of course, review all guidance for all future iterations.
The hon. Lady also talked about amendments 187 and 188, which seek to replace the countervailing benefits exemption with a power for the CMA to consider benefits to users before finding a breach of a conduct requirement. The exemption will ensure that there is a rigorous process to secure the best outcomes for consumers, and removing it would jeopardise clear regulatory expectations and predictable outcomes. In turn, this would make it more likely that consumers lose out on the innovations developed by SMS firms, such as privacy or security benefits. Government amendments 13 and 14 clarify the exemption while, crucially, maintaining the same high threshold and clear process.
The hon. Lady also mentioned amendments 194 and 196, and the Government agree that it is important that the DMU’s regulatory decisions are transparent and that the right information is available to the public. We understand that these amendments would require the DMU to send decision notices to third parties that it assesses to be most affected by those decisions. However, under the current drafting, the DMU is already required to publish the summaries of key decisions. Requiring the DMU to identify appropriate third parties and send them notices would introduce a significant burden on the DMU, to limited benefit, and I argue that it would undermine the flexibility and quick pace that we expect from the DMU. We believe the current drafting strikes the right balance, providing transparency and public accountability on DMU decisions.
I warmly welcome my hon. Friend to his place, as this is my first chance to do so. Are we now to understand that, with regard to the judicial review standard, proportionality will, in effect, be built in, and that we are going beyond the principles of plain, vanilla JR into the more widely understood term? Am I right?
I suggest that I write to my right hon. and learned Friend, and to all right hon. and hon. Members who have raised the important question of proportionality, to clarify the position. We want this legislation to have clarity for consumers and certainty for businesses because, as my right hon. Friend the Member for North East Somerset (Sir Jacob Rees-Mogg) said, this is an ever-changing market, so it is essential that we have clarity and certainty.
The point about proportionality extends into clause 29, where the Government have now removed the indispensability test, leaving bare proportionality. My amendment asks for a necessity test. What assessment has my hon. Friend made of the removal of “indispensability”? Does he still think that the threshold for countervailing benefit will be sufficiently high to ensure that the CMA does not disapply or discontinue investigations inappropriately?
That is an important point, and I appreciate my right hon. and learned Friend giving me the opportunity to clarify it. I want to be unequivocal that, from my perspective, the threshold is still high and we have provided clarity. If he requires even further clarity, I am happy to write to him to be completely clear.
I am grateful for what my hon. Friend has said so far about the application of the proportionality test, but if he is to follow up with Members in writing with some clarity, can he set out what he believes the grounds for challenge would be on the basis of proportionality? The interventions that the CMA may make and the rulings it may give are at the end of quite a lengthy process of market analysis, demonstration of abuse of market power and breach of conduct requirements. If those are challenged routinely and at a late stage, on the basis that there are grounds to say that it is disproportionate, it could have the unintended consequence of delaying systems in a way that they should not be delayed.
If I heard my hon. Friend correctly, he wanted a letter on that. This legislation is designed to make sure that big companies cannot litigate heavily to stop smaller challengers from emerging and becoming the big companies and employers of tomorrow. Let me write to him to clarify the point further.
My right hon. and learned Friend the Member for South Swindon has spoken about accountability in my numerous conversations with him over the past few days, and again today. I take his point. He will know that I want independent, versatile, flexible and adaptable regulators. That is only right for an ever-changing digital market that is always innovating and changing the way it operates. We do not know the unicorns of tomorrow, or the benefits that we can get for consumers. The Competition and Markets Authority and the DMU have a responsibility to be accountable, to maintain that flexibility and to have adaptability to new technology and new entrants in the market. As I am sure he knows and respects, that is why independent regulators are a central part of our internationally recognised business environment. We should not forget that point.
I take the points about overreach by regulators, but they are a core part of what international partners and investors look at when it comes to the competition regime, because they know that will be innovative and will encourage further innovation in technology. The CMA is operationally independent from Government, and Government will not intervene in its regulatory decisions. The DMU will have discretion in how it designs its interventions under the regime. That discretion is matched with robust accountability, from initial decision making to appeals.
There is a range of checks and balances throughout the regime that provide assurance. I hope that reassures my right hon. Friend. There are opportunities for Government, Parliament and stakeholders to hold the CMA to account, but I welcome his challenges and interventions on this point, because it is important. I am sure that this will be looked at again in the other place. Government should always be sensitive to those challenges. The digital markets regime will be overseen by the CMA’s board, which is accountable to Parliament for all key decisions. Key decisions will be taken by a committee, at least half of whose members will offer an independent perspective. I am sure that he will welcome that because, as new technologies and innovations emerge in the market, we will need new expertise.
My right hon. and learned Friend the Member for South Swindon (Sir Robert Buckland) made the important point that the growth and expansion of regulation in digital markets is necessary but substantial. The ability of this place to keep track of how the regulators use their powers is increasingly important. That may be beyond the work of any departmental Select Committee, but instead requires something like the Public Accounts Committee, as he suggested—a separate committee whose job is to focus on and scrutinise such work. That was recommended by the House of Lords Communications and Digital Committee, and also by the Joint Committee on the Online Safety Bill. I do not expect the Minister to give us an answer right now, but if he could reflect on that need and give some guidance to the House, that would be welcome.
My hon. Friend makes an important point that is a matter for wider discussions on accountability. I am happy to have that discussion with him in future. As things currently stand, there are sufficient checks and balances in place, but I am always open to having further discussions with him.
Could the Minister give some clarification on my point about fair reimbursement to the journalists and publishing houses that produce original content? As the new Minister, is he prepared to meet the National Union of Journalists to hear its concerns directly?
If the hon. Member will be ever so patient, I will address that point, because it is important.
My right hon. and learned Friend the Member for South Swindon talked about the DMU’s ex-ante powers, which I want to address because it is an important measure. We proposed to give the DMU ex-ante powers to impose obligations on designated firms because of the characteristics of digital markets, which make them particularly fast-moving and likely to tip in favour of new, powerful winners. We do not think that approach is appropriate for firms in other markets that do not exhibit the same qualities. Even if a firm meets the turnover conditions and carries out a digital activity, the DMU will still need to find evidence that the firm has substantial and entrenched market power, as well as a position of strategic significance in the activity, to designate the firm. The DMU will prioritise the areas where there will be greatest benefits for markets and consumers, and will reflect the CMA’s strategic steer provided by the Government, which is designed to reflect the policy as intended.
I think that everyone wishes to achieve the same objective, so I do not quite understand why His Majesty’s Government do not accept the amendment of my right hon. and learned Friend the Member for South Swindon (Sir Robert Buckland), which will make that clear beyond doubt, will safeguard it and will tidy up the legislation.
I will address my right hon. Friend’s point. We have listened to the concerns and discussed them in great detail, but I believe the Government’s amendments strike the right balance between prioritising the benefit to the consumer while helping the digital market to remain flexible and innovative, allowing for the future tech of tomorrow to be a big challenger.
One of the great strengths of the Bill lies in the speed and flexibility of the toolkit to better equip the regulator to tackle fast-moving and dynamic digital markets. The amendments will maintain an effective, agile and robust process, and will not undermine the Digital Markets Unit’s ability to intervene in a timely and impactful way. They will ensure that the DMU’s approach is proportionate and beneficial to consumers. I hope that we have reached a good position with the Members I have spoken about, but I want to turn to the points raised by my hon. Friend the Member for Folkestone and Hythe (Damian Collins), who was ever so eloquent about the challenge that the legislation is looking to overcome and the balance that it seeks. I was greatly appreciative of his support and the challenge he has put down.
Turning to the hon. Member for Easington (Grahame Morris): the final offer mechanism, which strengthens the hand of smaller businesses when they challenge those bigger businesses, is designed with the challenges he has put forward in mind. I hope that he appreciates that we recognise that the traditional business model of news media, particularly print media, has been substantially disrupted by the growth of digital. The regime is designed to help rebalance the relationship between major platforms and those who rely on them, including news publishers. That could include creating an obligation to offer fair and reasonable payment terms for the use or acquisition of digital content, including news content. I will absolutely take up the offer to meet the NUJ and hear its concerns. I hope that this measure goes a long way towards addressing those concerns by rebalancing the market and ensuring that firms with strategic market significance know that they must present a much fairer deal for regional print media.
Perhaps the Minister will forgive me for juxtaposing his reluctance to make things clear in primary legislation when discussing this clause and what the Government seek to do in part 4 on subscriptions. It seems to me very odd to conduct a subscription regulation mechanism by using primary legislation. There is a conflict in the logic being applied here, and I am sorry that I have to point that out to him.
I am sure that the Under-Secretary of State for Business and Trade, my hon. Friend the Member for Thirsk and Malton (Kevin Hollinrake) will appreciate the pass that I am just about to give him; I am sure that he will address that issue in his speech.
I reiterate my gratitude to the Opposition for their co-operation, of which my predecessor has informed me, and to right hon. and hon. Members across the House for the challenge that they have put forward today. I am grateful to Members across the House for their contributions, and I hope that they continue to work with the Government. We will continue to work with Members as the Bill progresses through Parliament to ensure that it drives innovation, grows the economy and delivers better outcomes for consumers. That is what the Government care about. We want a highly competitive market that innovates and nurtures the technology companies of tomorrow to ensure that the digital online world serves consumers. For that reason, I respectfully ask Members not to press their amendments.
Question put and agreed to.
New clause 5 accordingly read a Second time, and added to the Bill.
New Clause 6
Protected disclosures
“In the Public Interest Disclosure (Prescribed Persons) Order 2014 (S.I. 2014/2418), in the table in the Schedule, in the entry for the Competition and Markets Authority, in the right hand column, after ‘Kingdom’ insert ‘, including matters relating to Part 1 of the Digital Markets, Competition and Consumers Act 2024 (digital markets)’.”—(Saqib Bhatti.)
This new clause (which would be inserted into Chapter 8 of Part 1 of the Bill) confirms that matters relating to Part 1 of the Bill (digital markets) are covered by the entry for the Competition and Markets Authority in the Public Interest Disclosure (Prescribed Persons) Order 2014.
Brought up, read the First and Second time, and added to the Bill.
Clause 15
Notice requirements: decisions to designate
Amendments made: 2, in clause 15, page 8, line 34, leave out from “that” to the end of line 35 and insert
“the undertaking or digital activity, as the case may be, remain substantially the same”.
This amendment clarifies how the CMA may revise its view of an undertaking or digital activity by issuing a revised SMS decision notice.
Amendment 3, in clause 15, page 8, line 37, leave out from “not” to the end of line 38 and insert
“affect—
‘(a) the day on which the designation period in relation to that designation begins, or
(b) anything done under this Part in relation to that undertaking.”—(Saqib Bhatti.)
This amendment confirms that giving a revised SMS decision notice does not affect anything done under this Part in relation to a designated undertaking.
Clause 17
Existing obligations
Amendments made: 4, in clause 17, page 9, line 23, at end insert—
“(2A) In Chapters 6 (investigatory powers and compliance reports) and 7 (enforcement and appeals), references to a ‘designated undertaking’ are to be read as including an undertaking to which an existing obligation applies by virtue of provision made in reliance on subsection (1).”
This amendment provides that references in Chapters 6 and 7 to a designated undertaking include an undertaking to which an obligation applies by virtue of provision made in reliance on clause 17(1).
Amendment 5, in clause 17, page 9, line 37, at end insert—
“(ba) commitment (see sections 36 and 55);”.—(Saqib Bhatti.)
This amendment provides for the CMA to be able to apply an existing commitment, with or without modifications, in respect of certain new designations or to make transitional, transitory or saving provision in respect of a commitment when it would otherwise cease to have effect.
Clause 19
Power to impose conduct requirements
Amendments made: 6, in clause 19, page 10, line 30, leave out from “requirement” to the end of line 35 and insert
“or a combination of conduct requirements on a designated undertaking if it considers that it would be proportionate to do so for the purposes of one or more of the following objectives—
(a) the fair dealing objective,
(b) the open choices objective, and
(c) the trust and transparency objective,
having regard to what the conduct requirement or combination of conduct requirements is intended to achieve.”
This amendment provides that the CMA may only impose a conduct requirement or combination of requirements if it considers that it would be proportionate to do so, having regard to what the requirement or combination is intended to achieve.
Amendment 7, in clause 19, page 11, line 15, at end insert—
“(9A) Before imposing a conduct requirement or a combination of conduct requirements on a designated undertaking, the CMA must have regard in particular to the benefits for consumers that the CMA considers would likely result (directly or indirectly) from the conduct requirement or combination of conduct requirements.”—(Saqib Bhatti.)
This amendment provides that the CMA must consider the likely benefits for consumers when imposing a conduct requirement or combination of conduct requirements.
Clause 20
Permitted types of conduct requirement
Amendment made: 8, in clause 20, page 12, line 9, leave out from “to” to “in” on line 10 and insert
“materially increase the undertaking’s market power, or materially strengthen its position of strategic significance,”.—(Saqib Bhatti.)
This amendment clarifies that a conduct requirement is permitted if it is for the purpose of preventing an undertaking from carrying on activities other than the relevant digital activity in a way that is likely to materially strengthen its position of strategic significance in relation to the relevant digital activity.
Clause 21
Content of notice imposing a conduct requirement
Amendments made: 9, in clause 21, page 12, line 28, after “requirement” insert
“or, as the case may be, each conduct requirement as varied,”.
This amendment clarifies how the notice requirements in clause 21 apply in relation to the variation of a conduct requirement.
Amendment 10, in clause 21, page 12, line 31, leave out paragraphs (b) and (c) and insert—
“(b) the CMA’s reasons for imposing the conduct requirement, including—
(i) the objective for the purposes of which the CMA considers it is proportionate to impose the conduct requirement (see section 19),
(ii) the benefits that the CMA considers would likely result from the conduct requirement (see section 19(9A)), and
(iii) the permitted type of requirement to which the CMA considers the conduct requirement belongs (see section 20);”.—(Saqib Bhatti.)
This amendment requires the CMA to give reasons for imposing conduct requirements on a designated undertaking. Sub-paragraph (ii) is consequential on Amendment 7.
Clause 26
Power to begin a conduct investigation
Amendments made: 11, in clause 26, page 14, line 11, leave out “a designated” and insert “an”.
This amendment, together with Amendments 12, 16, 29, 37, 38, 40, 42, 43 and 65, ensures that enforcement action can be taken in respect of an undertaking that has ceased to be a designated undertaking in relation to its conduct while it was a designated undertaking.
Amendment 12, in clause 26, page 14, line 18, leave out “designated”.—(Saqib Bhatti.)
See the explanatory statement for Amendment 11.
Clause 27
Consideration of representations
Amendment proposed: 187, in clause 27, page 15, line 8, at end insert—
“(2) The CMA may have regard to any significant benefits to users or potential users that the CMA considers have resulted, or may be expected to result, from a factor or combination of factors resulting from a breach of a conduct requirement.”—(Alex Davies-Jones.)
This amendment would ensure that the CMA considers any significant benefits to users resulting from the breach of a Conduct Requirement when it is considering representations from designated undertakings as part of a Conduct Investigation.
Question put, That the amendment be made.
(1 year, 6 months ago)
Commons ChamberThis country has a proud history of welcoming almost half a million refugees over the past several years, and we will always continue to do so, but our ability to do that is absolutely hampered when we have tens of thousands of people illegally crossing the channel every year. It is precisely because we want to help the most vulnerable people, whether they be in Syria, Afghanistan, Sudan or elsewhere, that we must get a grip of the problem, break the cycle of the criminal gangs, and target our resources and compassion on those who most need them.
I agree with my hon. Friend and I am so glad to see the local Conservatives delivering for the people of Solihull, with dozens of new family homes, new flexible commercial space and a new integrated health, social care and community hub. As he says, it is clear that for his local area, only the Conservatives can deliver.