(2 weeks ago)
Westminster Hall
Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.
Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.
This information is provided by Parallel Parliament and does not comprise part of the official record
Peter Fortune (Bromley and Biggin Hill) (Con)
I beg to move,
That this House has considered Government support for UK-based tech companies.
It is a pleasure to serve under your chairmanship, Mr Betts. I am pleased to have secured this debate.
It is hard to measure the true economic value of the technology sector in the UK, but I think we can all agree on the sector’s huge importance for economic growth, productivity and society as a whole. That importance will only grow in the future, so nurturing and supporting our domestic technology sector is vital. To be clear, as a Conservative, I believe in the importance of competition as a driver for innovation and economic growth. To have true competition, we need to challenge monopolies. If our tech sector is to thrive in the future, competition is vital; otherwise, we will see innovative firms leave the UK.
Today I will focus particularly on our domestic app ecosystem. The UK’s mobile app ecosystem generates £28 billion annually in gross value added—equivalent to nearly 1% of GDP. It also supports around 400,000 jobs: the highest number in any country in Europe. However, despite that huge contribution to our economy, app developers face significant challenges.
Apple and Google between them control the operating systems on 95% of mobile devices in the UK, and the Competition and Markets Authority formally designated them with strategic market status in October 2025. That does not mean that Apple and Google just run the app stores; they control far more than that. Those companies can control what developers may say within apps, block developer communications with consumers, hide customer details from developers and prevent them from telling users when something is new, better or cheaper—all the while taking up to 30% of every transaction. That not only stifles the sector domestically, but pushes up prices for ordinary consumers and drives British innovation overseas. We simply cannot afford to allow such a growing industry to be lost.
I congratulate the hon. Member on securing this important debate on Government support for UK-based tech companies. My Slough constituency is a huge tech and data hub; indeed, it has the second largest concentration of data centres anywhere in the world. Does he agree that it would be an act of folly for the Government not to designate Slough as an artificial intelligence growth zone, given that £1 spent there provides a much greater return for the UK economy? We as a nation would not want to lose that.
Peter Fortune
I have a list of Government follies here, if the hon. Member would like me to pass them on. In all seriousness, I completely agree with him on the importance of the industry and those jobs, and I am sure that the Minister will pick that up when he responds.
To give an example of the issues with these monopolies, Amazon was forced to remove the “Buy book” button from its Kindle app on iPhones because Apple demanded a 30% cut of every e-book sale. Authors simply cannot afford to forgo that 30%. Instead, readers had to—this is absurd—close the Kindle app, log on to the Amazon site separately, complete their purchase and then reopen the Kindle app. It was only a court ruling in the United States, which forced Apple’s hand, that brought the “Buy book” button back.
Spotify cannot include a “Subscribe” button in its iOS app, nor can it tell users in the app what a subscription costs or that a cheaper option exists outside the app. UK Spotify Premium subscribers have faced three price rises in two years, partly because Apple’s 30% cut has to be absorbed somewhere. Every Spotify user in the UK is paying more, and Apple’s rules are a direct reason why.
There are many similar cases in which Apple and Google are inserting themselves directly into the relationship between developers and consumers by forcing developers to use their payment systems. That takes away a consumer’s ability to choose their preferred payment method, causes greater friction when there are issues such as refunds and cancellations, and prevents consumers from properly benefiting from lower prices or discounts.
The UK’s Competition Appeal Tribunal ruled in October 2025 that Apple’s payment restrictions were neither necessary nor proportionate for security or privacy purposes. They were designed to eliminate competition. It is as simple as that. It is estimated that removing the restrictions would release £1.75 billion a year that is currently taken from UK developers and consumers, rising to over £4 billion annually by 2029. That money could go back into British engineering, creative content and the next generation of app businesses built and scaled here. We could unleash the true potential of these industries.
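The scale of that commission gap can be sketched with a simple illustrative calculation. The prices and the 3% card-processing rate below are hypothetical examples for comparison, not figures cited in the debate:

```python
# Illustrative comparison of what a developer keeps from one sale under a
# 30% app-store commission versus a typical ~3% card-processing fee.
# All figures here are hypothetical examples, not data from the debate.

def developer_proceeds(price: float, commission_rate: float) -> float:
    """Return what the developer keeps from a single sale."""
    return price * (1 - commission_rate)

price = 9.99  # hypothetical monthly subscription price, in pounds

store_cut = developer_proceeds(price, 0.30)  # in-app purchase route
card_cut = developer_proceeds(price, 0.03)   # direct card payment route

print(f"Developer keeps via app store:      £{store_cut:.2f}")
print(f"Developer keeps via direct payment: £{card_cut:.2f}")
print(f"Difference per sale:                £{card_cut - store_cut:.2f}")
```

On these illustrative numbers the developer loses roughly £2.70 of every £9.99 sale to the commission gap, which, multiplied across millions of UK transactions, is how totals in the billions arise.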
The ability to remove the restrictions and hand UK app developers back their rights already exists in legislation. The Digital Markets, Competition and Consumers Act 2024 gave the CMA conduct-requirement powers—the ability not just to levy fines, but to mandate specific behaviours. The CMA can end Apple’s and Google’s control over in-app communication, ensuring that developers are free to know their own customers and tell their own customers what their own product costs and where to buy it at the best price. Could the Minister outline the Government’s view on pressing the CMA to issue conduct requirements that protect competition?
Another area that we must look at is cloud computing. The UK’s digital economy is underpinned by cloud computing, but cloud has been increasingly monopolised. The CMA’s cloud services market investigation estimated that Amazon Web Services and Microsoft control 70% to 90% of the UK’s cloud computing market. That concentration poses a number of dependency risks, including operational, financial and security vulnerabilities, and restricts market innovation and customer choice.
Just months after the Government published their “Chronic risks analysis”, there were three global cloud outages within a matter of weeks. In two of those, Amazon Web Services and Microsoft were directly impacted, highlighting the risks of over-reliance on a limited number of cloud hosts. Governments, businesses, digital platforms, AI services and individuals were materially impacted by the outages, with US companies alone suffering losses of between $500 million and $650 million. Indeed, the recent CrowdStrike outage is estimated to have cost the UK economy between £1.7 billion and £2.3 billion.
Competition can be the key mitigation for the UK’s digital dependency and, again, it is the CMA that holds the levers to tackle anti-competitive conduct and address the risks of cloud concentration. I am not calling for more legislation or regulation. We do not need it. With the Digital Markets, Competition and Consumers Act, brought in by the last Conservative Government, we have already legislated for stronger digital competition, but slow implementation and weak early enforcement risk squandering a rare pro-growth and pro-SME opportunity.
Only a small number of designations have been made so far. For Google’s and Apple’s mobile ecosystems, the CMA has relied on non-binding “commitments” rather than imposing binding conduct requirements. These non-binding commitments have no clear statutory basis under the 2024 Act, carry no legal consequences if breached and are not contemplated anywhere in the CMA’s published guidance. Their use risks weakening the regime and forcing the CMA to restart enforcement if firms fail to comply, which is precisely the outcome that the last Government sought to avoid. It is also concerning that a requirement for Google to negotiate fair terms with news publishers has been pushed back by at least 12 months, despite the CMA having previously committed to use that power in the first half of this year.
The Government must reaffirm that robust digital competition enforcement is pro-growth and central to the UK’s industrial strategy. Moreover, the CMA must ensure that there is robust competition enforcement. The levers to achieve that were put there by the last Government; it just requires some political will. Fundamentally, the UK cannot build globally competitive tech firms while a handful of dominant platforms control the routes to market, search, app stores, mobile ecosystems, cloud and key AI infrastructure.
The potential for huge economic growth from our tech sector is there, but competition is key. If competition flourishes, we will see more innovation, improved services and lower costs for consumers.
(11 months ago)
I thank my hon. Friend for securing this debate, because many of my Slough constituents have contacted me with concerns about the impact of AI on the intellectual property of creatives. We have a thriving UK creative industry, which contributed more than £120 billion to our economy in 2022. Just down the road from Slough, we have Pinewood studios and Shinfield studios, which have given us global hits over the years. Does my hon. Friend agree that those industries must be listened to properly before any legislative changes, to ensure that the film, book, media and other creative industries can continue to thrive?
My hon. Friend makes an excellent point and demonstrates both the economic might of these industries—the sheer size of their contribution—and the fact that this is Britain’s best industry, giving some of the best life experiences. I know how well my hon. Friend is thought of.
Creative industries must not be expected to forfeit their legal rights for uncertainty. There is no doubt that AI will unlock huge gains in our society. The story some tell is selective, though. We are told that only by deregulating will we unlock AI-driven economic growth, that the UK must hurry up or fall behind and that regulation will only slow us down, but the urgency to get the deal done is theirs. It is no coincidence that this hurrying up has intensified since the first US judgment found that AI training is not fair use.
I thank the hon. Lady for her intervention; I will expand on her point about transparency.
We must have transparency, and it needs to be granular, enforceable and practical. AI developers must be required to disclose which copyrighted works they used to train or fine-tune their models. TollBit’s “State of the Bots” report confirms:
“Whilst every AI developer with a published policy claims its crawlers respect the robots exclusion protocol, TollBit data finds that in many instances bots continue scraping despite explicit disallow requests for those user agents in publishers’ robots.txt files”.
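The robots exclusion protocol referred to in that quotation is a purely advisory mechanism: a publisher lists which crawlers may fetch which paths, and a well-behaved bot checks the file before scraping. A minimal sketch using Python's standard library shows the check a compliant crawler is supposed to perform (the bot name, rules and URL below are hypothetical examples):

```python
# A well-behaved crawler consults the publisher's robots.txt before fetching.
# Python's standard library ships a parser for the robots exclusion protocol.
# The user-agent name, rules and URL below are hypothetical examples.
from urllib.robotparser import RobotFileParser

# A publisher's explicit disallow rule for a named AI crawler.
rules = """\
User-agent: ExampleAIBot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The protocol is advisory only: nothing technically stops a crawler from
# ignoring this answer, which is the behaviour the report describes.
print(parser.can_fetch("ExampleAIBot", "https://publisher.example/article"))
print(parser.can_fetch("OtherBot", "https://publisher.example/article"))
```

The first check returns False (the named bot is disallowed everywhere) and the second returns True (no rule applies to other agents), which is precisely why publishers report that a disallow entry offers no enforceable protection.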
Many AI companies say that this need not hamper AI, and it is their voices that I wish to amplify today. This is about creating a fair, functioning market for training data that benefits all sectors. Last month’s YouGov survey of MPs and the general public agrees: 92% of MPs believe that AI companies should declare the data used to train their models, 85% say that using creative work without pay undermines intellectual property rights, and 79% support payments to creators whose work is used in training. The public expect us to do our best for our UK industries, and that is why we are squarely behind the Government’s instincts on British Steel. Let us apply the Government’s instinct here, too, as well as their strong record and rhetoric on digital images, deepfakes, online harms and the principle that if it is illegal offline, it is illegal online.
Big tech always begins on the fringes before being regulated to the centre. We saw that most recently with age verification on app stores. Will the Minister commit to table Government amendments to the Data (Use and Access) Bill, in recognition of these supermassive concerns, to introduce a power to regulate for transparency, consent, copyright and compensation?
My hon. Friend is making a very hard-hitting speech. Any reforms we introduce should ensure that artificial intelligence companies are more transparent about the materials they use to train their AI tools. Does my hon. Friend agree that that would benefit the development of AI while ensuring adequate protections for the creative industries and individuals’ intellectual property?
It is absolutely imperative that we strike the right balance. This is not about pitting one side against the other; it is about coexistence and mutual interdependence.
Will the Minister consider introducing a stronger framework for personality rights under the data Bill, as proposed by Equity and the wider creative sector, to improve protections against the illegal exploitation of artistic works by generative AI companies? Will he also explain why the position is still to bundle transparency in with copyright?
The stories are increasingly familiar: entire creations and careers are copied and remixed into data, while the original human creator and rights owner is left out of the conversation entirely. One visitor I met from big tech likened the training of an AI on copyrighted work to the use of a library, but that argument dries up when we remember that libraries pay for their books, and it wilfully ignores the scalability differences between machine learning and human inspiration. If a warehouse traded in stolen guitars or paintings, we would expect action, but in digital form online, theft is somehow not just tolerated but to be expected. The logic that the creative industry feels it is being asked to swallow is that because tech saw it, read it or heard it, it can have it, own it and resell it.
Copyright is not an obstacle but infrastructure—a cornerstone of the British economy. As colleagues have said, it makes possible the creative industries’ £124 billion contribution to UK plc. We should not be weakening it with vague exceptions or opt-out regimes. The Berne convention, signed by more than 180 countries, makes it clear that creators must not have to assert their rights for them to exist, and they should not have to sue to keep them. This violation of international copyright norms will undermine international relations and investment in our country. It will see capital take flight and cause economic damage.
I would like to introduce a new thought to the Minister: please consider the relationship between managed risk and freedom of expression in the regulated model that creativity needs and relies on. Creatives take risks and express themselves freely. For growth, we need an economy of risk takers with the freedom to express themselves without fear of being ripped off. Without the legal frameworks that protect copyright, risk will not be embraced and creativity will dry up or move away. That is why there is so little faith in an opt-out model, or in a strategic direction that places the burden on individual creators to prevent their work from being taken by tech. A technical solution does not yet exist to make such opt-outs meaningful; as solutions emerge, the same concerns and questions will need answering.
Worldwide, there is no comparable territory where this matter is settled. No functioning rights reservation has emerged in the EU, and in the US, further litigation is rife. California, the home of Silicon Valley, has a carve-out—they do not get high on their own supply. The Chancellor and the leadership of our Government are right to propose on the world stage that the UK is a safe and certain bet for investment, but being the UK branch for US tech demands will not deliver that. New growth must ensure net growth. Industrial-scale unregulated scraping is not innovation; it is infringement. It erodes the commercial certainty that serious investors need.
In the regulation space, we have exciting growth opportunities in the emerging AI licensing sector. These platforms demonstrate that copyright is well understood, and that there are scalable possibilities for licensing and ethical AI here in the UK. The emergence of platforms such as Created by Humans, Narrativ, ProRata, Getty Images, Musiio, Adobe and We Are Human, and relationships between Sony Music and Vermillio, Universal Music and SoundLabs, Lionsgate and Runway, and news publishers and Microsoft, all point to new licensing with a duty of candour. With these examples in mind, will the Minister confirm whether the Government are considering building new foundational models led and built by UK AI firms?
In conclusion, this rapid technological change demands that we confront the decisions in front of us as a creative powerhouse. In this defining moment, we must stand with those whose creativity shapes our culture, economy and shared human story. Algorithms may calculate, but it is the human creativity behind it all that pours heart, history, hope and the human into and out of every note, frame and story. AI cannot be allowed to redefine the soul of creation. If we erode the rights that protect our creators, we risk not only economic loss but strangulation of those voices that tell our stories, reflect our struggles and inspire our futures. Let us affirm that in Britain we value not just innovation but the irreplaceable human spirit behind creativity, and protect the rights that mean that we thrive, for now and for generations yet to imagine, to dream and to create.
(11 months, 3 weeks ago)
Commons Chamber
We will set out the details just as soon as we can, on the basis of the principles that I set out earlier. The welfare scheme overall is not defensible on its current terms, but it must be one that supports those who need it. The details will be set out.
I thank my hon. Friend for his question; he does great work to bring our communities together, especially in Slough. Any form of racial or religious-based hatred is abhorrent and has no place in society. We have set aside over £50 million to protect faith communities and freedom of worship. That is the right thing to do; it is a shame that we have to do it. Our £15 million community recovery fund has been supporting communities affected by the disorder last summer—again, that is the right thing to do, but it is a shame that we have to do it.
(1 year, 3 months ago)
Lola McEvoy
I will—and I thank the hon. Lady for her intervention.
It is a pleasure to serve under your chairmanship, Mr Dowd. It is my great honour to open this debate on online safety for our children. I welcome the Minister answering for the Department for Science, Innovation and Technology, and the shadow Minister, the hon. Member for Runnymede and Weybridge (Dr Spencer), answering for the official Opposition. I tabled this as my first debate in Westminster Hall, because I believe this issue is one of the most defining of our time. I promised parents and children in my constituency of Darlington that I would tackle it head-on, so here I am to fulfil that promise.
I would like to put on the record that I have long been inspired by the strength of the parents of Bereaved Families for Online Safety—a group of parents united by the unbearable loss of their children and by their steadfast commitment to get stronger online protections to prevent more children’s deaths. I say to Ellen, who is here with us this afternoon: thank you for your courage—you have experienced unimaginable pain, and I will do everything I can to prevent more parents from going through the same.
The consensus for action on this issue has been built, in no small part due to the incredible drive of parents to campaign for justice. It is felt in every corner of the country, and it is our job as a Government to step in and protect our children from online harm. In my constituency of Darlington, at door after door right across the town and regardless of background, income or voting intention, parents agreed with me that it is time to act to protect our children. I am taking this issue to the Government to fight for them.
I am standing up to amplify the voice of the girl who sends a picture of herself that she thought was private but arrives at school to find that it has been shared with all her peers; she is not only mortified but blamed, and the message cannot be unsent. I am standing up to amplify the voice of the boy who gets bombarded with violent, disturbing images that he does not want to see and never asked for, and who cannot sleep for thinking about them. I am standing up for the mother whose son comes home bruised and will not tell her what has happened, but who gets sent a video of him being beaten up and finds out that it was organised online. I am standing up for the father whose daughter refuses to eat anything because she has seen video after video after video criticising girls who look like her. I say to all those who have raised the alarm, to all the children who know something is wrong but do not know what to do, and to all those who have seen content that makes them feel bad about themselves, have been bullied online, have seen images they did not want to see or have been approached by strangers: we are standing up for you.
I congratulate my hon. Friend on securing this debate on online safety for children and young people. I have a keen personal interest, as a father of two young children. Earlier this year, Ofcom published 40 recommendations about how to improve children’s safety online, including through safer algorithms, and the Government rightly pointed to the role that technology companies can play in that. Does my hon. Friend agree that these companies must take their responsibilities much more seriously?
Lola McEvoy
I absolutely agree that the companies must take those responsibilities seriously, because that will be the law. I am keen that we, as legislators, make sure that the law is as tight as it possibly can be to protect as many children as possible. We will never be able to eradicate everything online, and this is not about innovation. It is about making sure that we get this absolutely right for the next generation and for those using platforms now, so I thank my hon. Friend for his intervention.
The first meeting I called when I was elected the MP for Darlington was with the headteachers of every school and college in my town. I asked them to join together to create a town-wide forum to hear the voices of children and young people on what needs to change about online safety. The first online safety forum took place a couple of weeks ago, and the situation facing young people—year 10s, specifically—is much worse than I had anticipated.
The young people said that online bullying is rife. They said it is common for their peers to send and doctor images and videos of each other without consent, to spread rumours through apps, to track the locations of people in order to bully them through apps, to organise and film fights through apps, to be blackmailed on apps, to speak on games and apps to people they do not know, and to see disturbing or explicit images unprompted and without searching for them. They also said it is common to see content that makes them feel bad about themselves. This has to stop.
The last Government’s Online Safety Act 2023 comes into force in April 2025. The regulator, Ofcom, will publish the children’s access assessments guidance in January 2025. This will give online services that host user-generated content, search services and pornography services in the UK three months to assess whether their services are likely to be accessed by children. From April 2025, when the children’s codes of practice are to be published, those platforms and apps will have a further three months to complete a children’s risk assessment. From 31 July 2025, specific services will have to disclose their risk assessments to Ofcom. Once the codes are approved by Parliament, providers will have to take steps to protect users. There is to be a consultation on the codes in spring 2025, and I urge everybody interested in the topic—no matter their area of expertise or feelings on it—to feed into that consultation. The mechanism for change is in front of us, but my concern is that the children’s codes are not strong enough.
Lola McEvoy
I am loath to tell Ofcom that it does not have enough power. As I understand it, the powers are there, but we need to be explicit, and they need to be strengthened. How do we do that? The reason I outlined the timelines is that the time to act is now. We have to explicitly strengthen the children’s codes.
There are many ways to skin a cat, as they say, but one of the simpler ways to do this would be to outline the audience that the apps want to market to. Who is the base audience that the apps and platforms are trying to make money from? If that is explicitly outlined, the codes could be applied accordingly, and strengthened. If children are the target audience, we can question some of the things on those apps and whether the apps are safe for children to use in and of themselves.
With children able to access online content a lot more easily nowadays, many of my Slough constituents feel that it is critical that the content itself is appropriate and safe. Does my hon. Friend share my concerns about the rise of extreme misogynistic content and its impact on young people, especially considering that research has shown that it is actually amplified to teens?
Lola McEvoy
I thank my hon. Friend for raising the really important—indeed, deeply concerning—issue of the rise of anti-women hate, with the perpetrators marketing themselves as successful men.
What we are seeing is that boys look at such videos and do not agree with everything that is said, but little nuggets make sense to them. For me, it is about the relentless bombardment: if someone sees one video like that, they might think, “Oh right,” and not look at it properly, but they are relentlessly targeted by the same messaging over and over again.
That is true not just for misogynistic hate speech, but for body image material. Girls and boys are seeing unrealistic expectations of body image, which are often completely fake and contain fake messaging, but which make them reflect on their own bodies in a negative way, when they may not have had those thoughts before.
I want to drive home that being 14 years old is tough. I am really old now compared with being 14, but I can truly say to anybody who is aged 14 watching this: “It gets better!” It is hard to be a 14-year-old: they are exploring their body and exploring new challenges. Their hormones are going wild and their peers are going through exactly the same thing. It is tough, and school is tough. It is natural for children and young people to question their identity, their role in the world, their sexuality, or whatever it is they might be exploring—that is normal—but I am concerned that that bombardment of unhealthy, unregulated and toxic messaging at a crucial time, when teenagers’ brains are developing, is frankly leading to a crisis.
I return to an earlier point about whether the parts of apps or platforms that children are using are actually safe for them to use. There are different parts of apps that we all use—we may not all be tech-savvy, but we do use them—but when we drill into them and take a minute to ask, “Is this safe for children?”, the answer for me is, “No.”
There are features such as the live location functionality, which comes up a lot on apps, such as when someone is using a maps app and it asks for their live location so they can see how to get from A to B. That is totally fine, but there are certain social media apps that children use that have their live location on permanently. They can toggle it to turn it off, but when I asked children in Darlington why they did not turn it off, they said there is a peer pressure to keep it on—it is seen as really uncool to turn it off. It is also about being able to see whether someone has read a message or not.
I then said to those children, “Okay, but those apps are safe because you only accept people you know,” and they said, “Oh no, I’ve got thousands and thousands of people on that app, and it takes me ages to remove each person, because I can’t remember if I know them, so I don’t do it.” They just leave their location on for thousands of people, many of whom may be void accounts, and they do not even know if they are active any more. The point is that we would not allow our children to go into a space where their location was shown to lots of strangers all the time. Those children who I spoke to also said that the live location feature on some of these apps is leading to in-person bullying and attacks. That is absolutely horrifying.
(2 years, 10 months ago)
Commons Chamber
I completely agree about the importance of motorsport in this country, and I pay tribute to my hon. Friend for his commitment and hard work in this area. We already support sustainable and synthetic fuels under the renewable transport fuel obligation scheme. Tax policy, as he knows, is a matter for the Treasury, but I will of course work with him and ensure that his ideas are shared across Government.
As the hon. Member will know, the Commissioner for Public Appointments is looking into this matter, and it would not be appropriate to comment until it has published its full report.
(3 years ago)
Commons Chamber
I would be delighted to do so. My hon. Friend is a great ambassador for his constituency, always pushing and promoting the great work that is being done.
According to Tech Nation, Slough, which is the Silicon Valley of the UK, has experienced a 536% increase in the formation of digital start-ups in the last decade. Given that artificial intelligence is of strategic importance to the UK, why have the Government cut research and development tax credits for small and medium-sized enterprises?
The hon. Gentleman will know that a review of R&D tax credits is being conducted. The Chancellor will be speaking later, but because of Tech Nation and the work that has been done over the last decade, we have a great tech ecosystem to build on.