(2 weeks, 2 days ago)
Commons Chamber
Victoria Collins (Harpenden and Berkhamsted) (LD)
Last year, I carried out a “safer screens” tour in my constituency, hearing directly from young people, because the Liberal Democrats consider children and young people to be at the heart of this issue. Teenagers shared concerns about extreme content pushed by algorithms, but also about being glued to their screens alongside their younger siblings. One said, “It’s as addictive as a drug, and I feel the negative impacts every day.” Another pleaded, “Help—I just can’t stop.” Last week, more than 1,700 parents emailed me calling for a social media ban. One mother said that the social media used by her two boys “fills me with dread.” Another highlighted the way in which
“anxiety, reduced attention, online bullying, and exposure to harmful content are becoming common topics among families.”
Parents, teachers, experts and young people themselves are crying out for action, which is why the Liberal Democrats have long raised this as a public health issue. We pushed for the digital age of data consent to be raised to 16, and for the tackling of addictive algorithms. We voted to ban phones in schools, and called for health warnings. Now the Liberal Democrats have tabled an amendment in the other place to ban harmful social media for under-16s, based on film-style age ratings extending to 18. We would reset the default age for social media to 16 now, with strong age assurance, because enough is enough.
This world-pioneering approach brings age-appropriate standards to online safety. We are learning from Australia, and preparing for today’s reality. Our risk-based approach, supported by more than 40 charities and experts including the NSPCC, the Molly Rose Foundation and the Internet Watch Foundation, will stop new platforms slipping through the net while addressing harmful games and AI chatbots, and protecting educational sites such as Wikipedia and safe family connections. Crucially, it does not let social media companies off the hook.
We have had age-appropriate safety standards offline, for toys and films, for decades. After 20 years of social media platforms clearly prioritising profit over children, building addictive algorithms that keep children and adults hooked, it is time to take action. We do not need consultation—we need that action now—but at least in this consultation we must look into how, not if, we will implement a ban on harmful social media for under-16s. I urge the Government to consider such a ban, with swift timelines, to address this growing public health crisis, and to act on our proposals now. Our children’s future is not something to be played with.
The hon. Lady explains very well the views of children, young people and parents who are grappling with these issues. I disagree with her: I think we need a short, sharp consultation because there are different views, but we definitely want to act. I am very interested in the idea of age classification, and I would be more than happy to talk to her about that. We all see how this issue affects our own children, and we need to help them cope at different ages. I am sure that many hon. Members will raise different options, and I am more than happy to discuss those with them.
(3 weeks, 3 days ago)
Commons Chamber
Victoria Collins (Harpenden and Berkhamsted) (LD)
For over a week, Grok has generated illegal sexual abuse material—non-consensual images of women and children—without restraint on X, which took the disgraceful step of putting it behind a paywall. That is abhorrent, and those images are illegal. Unlike the Conservatives, we very much welcome the action being taken and absolutely want to work together to stop this illegal, abhorrent use of AI technology. That is why the Liberal Democrats have called on the National Crime Agency to launch a criminal investigation into X and for Ofcom to restrict access immediately. We have also called on Reform MPs to donate their earnings from X to charities working with victims of sexual exploitation.
Where there are loopholes around AI creation of these horrific images, we are pleased to hear the Secretary of State announce the establishment of a criminal offence to create, or seek to create, such horrific content and the work to criminalise nudification apps. Regulatory gaps, however, are not the only problem; enforcement is failing, too. While other countries have acted decisively to ban X, Ofcom has taken over a week to start an investigation and lacks the resources to take on these tech giants. What has become clear is that with the pace of technology, the Government must look to future-proof online safety from new harms and harmful features.
The Liberal Democrats have long been raising the alarm. We tabled amendments to raise the age of data consent, proposed a doomscroll cap to curb addiction and called for public health warnings on social media. Protecting women and children from online abuse cannot wait, so will the Government support our calls on these actions? This matters in real life—to my constituent who was harmed by strangulation in a nightclub following online videos, and to the victims of sexual abuse and violence, which often starts online. Given the pace of change, does the Secretary of State have full faith in Ofcom’s ability to enforce the Online Safety Act? Will she meet me because, unlike the Conservatives, I would like us to work together on this important issue and discuss the action needed on AI chatbots and emerging technologies?
This is a moment for the House to act together. Inaction sends the message that abuse online is acceptable, and we must prove otherwise.
I thank the hon. Lady for her questions. I think I have said to the House before that patience is not my greatest virtue, but that is because the public and, most importantly, victims want to see this happen quickly. I said in my statement that I expect—because the public expects—Ofcom to do this swiftly. We do not want to wait months and months for action. I am of course happy, as is the Online Safety Minister, to meet her to discuss further steps. There are clear responsibilities here in terms of enforcement of the law on individuals and their behaviour, but the Online Safety Act, which I know her party voted for, does place some of those requirements on Ofcom. We have to see action, and I am sure that that message will be heard loud and clear today.
(1 month, 3 weeks ago)
Westminster Hall
Victoria Collins (Harpenden and Berkhamsted) (LD)
It is a pleasure to serve under your chairmanship, Ms Butler. I congratulate the hon. Member for Dewsbury and Batley (Iqbal Mohamed) on securing this incredible debate. That so many issues have been packed into 90 minutes shows clearly that we need more time to debate this subject, and I think it falls to the Government to recognise that an AI Bill, or at least further discussion, is clearly needed. The issue now pervades our lives, for the better but, in many respects, for the worse.
As the Liberal Democrat spokesperson on science, innovation and technology, I am very excited about the positive implications of AI. It can clearly help grow our economy, solve the big problems and help us improve our productivity. However, it is clear from the debate that it comes with many risks that have nothing to do with growing our economy—certainly not the kind of economy we want to grow—including the use of generative AI for child sexual abuse material, children’s growing emotional dependency on chatbots, and the provision of suicide advice.
I have said for a long time that the trust element is so important. It is two sides of the same coin: if we cannot trust this technology, we cannot develop as a society, but trust is also really important for business and our economy. I find it fascinating that so many more businesses are now talking about this and saying, “If we can’t trust this technology, we can’t use it, we can’t spend money on it and we can’t adopt it.” Trust is essential.
If the UK acts fast and gets this right, we have a unique opportunity to be the leader on this. From talking to industry, I know that we have incredible talent and are great at innovating, but we also have a fantastic system for building trust. We need to take that opportunity. It is the right thing to do, and I believe we are the only country in the world that can really do it, but we have to act now.
Sarah Russell
Does the hon. Lady agree that we should be looking hard at the EU’s regulation in this area, and considering alignment and whether there might be points on which we would like to go further?
Victoria Collins
Absolutely, and the point about global co-operation has been made clearly across the Chamber today. The hon. Member for Leicester South (Shockat Adam) talked about what is now the AI Security Institute—it was the AI Safety Institute—and that point about leading and trust is really important. Indeed, I want to talk a little more about safety, because security and safety are slightly different. I see safety as consumer facing, but security is so important. Renaming the AI Safety Institute as the AI Security Institute, as the hon. Member mentioned, undermines the importance of both.
The first point is about AI psychosis and chatbots—this has been covered a lot today, and it is incredibly worrying. My understanding is that the problem of emotional dependency on AI chatbots is not covered by the Online Safety Act. Yes, elements of chatbots are covered—search functionality and user-to-user, for example—but Ofcom itself has said that there are certain harms from AI chatbots, which we can talk about, that are not covered. We have heard that 1.2 million users a week are talking to ChatGPT about suicide—we heard the example of Adam, who took his own life in the US after talking to a chatbot—and two thirds of 23 to 34-year-olds are turning to chatbots for mental health support. These are real harms.
Of course, the misinformation that is coming through chatbots also has to be looked at seriously. The hon. Member for York Outer (Mr Charters) mentioned the facts and the advice coming through. We can achieve powerful outcomes, but we need to make sure that chatbots are built in a way that ensures that advisory element, perhaps by linking with NHS or other proper advice.
The hon. Member for Milton Keynes Central (Emily Darlington), who has been very passionate about this issue, mentioned the Molly Rose Foundation, which is doing incredible work to expose the harms coming through this black hole—harms that many do not see, that affect children in ways their parents do not understand, and that affect adults too.
The harm of deepfakes, including horrific CSAM and sexual material of all ages, has also been mentioned, and it is also impacting our economy. Just recently, a deepfake was unfortunately made of the hon. Member for Mid Norfolk (George Freeman). The Sky journalist Yalda Hakim was also the victim of a deepfake. She mentioned her worry that it was shared thousands of times, but also picked up by media in the subcontinent. These things are coming through, and no one who watches them can tell the difference. It is extremely worrying.
As the hon. Member for Congleton (Sarah Russell) said, “Rubbish in, rubbish out.” What is worrying is that, as the Internet Watch Foundation has said, because a lot of the rubbish going in is online sexual content that has been scraped, that is what is coming out.
Then there is AI slop, as the right hon. Member for Oxford East (Anneliese Dodds) mentioned. Some of that is extreme content, but what worries me is that, as many may know, our internet is now full of AI slop—images, stories and videos—where users just cannot tell the difference. I do not know about others, but I often look at something and think, “Ah, that’s really cute. Oh no—that is not real.” What is really insidious is that this is breaking down trust. We cannot tell any more what is real and what is not, and that affects trust in our institutions, our news and our democracy. What we say here today can be changed. Small changes are breaking down trust, and it is really important that that stops. What is the Minister doing about AI labelling and watermarking, to make sure we can trust what we see? That is just one small part of it.
The other thing, which my hon. Friend the Member for Newton Abbot (Martin Wrigley) mentioned, is that AI often magnifies existing threats, whether online fraud or threats to our security. I believe that AI scams cost Brits £1 billion in just the first three months of this year, and one third of UK businesses said that they had been victims of AI fraud in the first quarter. I have not even got on to what the hon. Member for Dewsbury and Batley said about the move towards AI in security and defence, and towards superintelligence. Which of the “exaggerated” threats will in fact become extremely serious? What are the Government doing to clamp down on these threats, and what are they doing on AI fraud and online safety?
Another issue is global working. One of the Liberal Democrats’ calls is for an AI safety agency, which could be headquartered in the UK; we could take the lead on it. I think that is in line with what the hon. Member for Dewsbury and Batley was talking about. We have this opportunity; we need to take it seriously, and we could be a leader on that.
I will close by reiterating the incredible work that AI could do. We all know that it could solve the biggest problems of tomorrow, and it could improve our wellbeing and productivity, but the threats and risks are there. We have to manage them now, and make sure that trust is built on both sides.
Mr Adnan Hussain (Blackburn) (Ind)
I just want to reaffirm what the hon. Member has said. Does she agree that innovation and safety are not opposites? It reminds me of when Google and online banking first came in. We need clear rules so that we can increase public trust without stifling technology.
Victoria Collins
Absolutely. What is interesting about innovation is that it often thrives with constraints. As I have said, safety is about trust, which is good for business and our economy, and not just for our society.
When will the AI Bill come to Parliament? We really need it; we need to discuss these things. What are the Government doing to reassess the Online Safety Act? Beyond that, in determining how we react to this rapid shift in technology, will they consider the Lib Dems’ call for a digital Bill of Rights to make sure that standards are set and can adapt to that? What are the Government doing about international co-operation on safety and security? As the hon. Member for Blackburn (Mr Hussain) mentioned, we can—we must—have innovation and safety, and safety by design. We can choose both, but only if we act now.
(2 months, 2 weeks ago)
General Committees
Victoria Collins (Harpenden and Berkhamsted) (LD)
It is a pleasure to serve under your chairmanship, Mr Vickers. The Liberal Democrats support this statutory instrument, which updates the Online Safety Act’s priority offences to reflect changes in intimate image abuse law. It is absolutely right to tackle the non-consensual sharing of intimate photographs and films, and to tackle self-harm.
However, this is also an important opportunity to say that the Act must go further still. The Internet Watch Foundation reminds us that it is not currently illegal to retain, re-upload or trade abusive intimate image material long after initial distribution. The Molly Rose Foundation and Samaritans have raised the issue of self-harm, and I am pleased to hear that being addressed today, but the point about AI chatbots is really important. As I mentioned in DSIT questions, the legislation on user-to-user and search seems pretty clear, but what about one-to-one chatbots when there is a single user? It is not clear who is accountable when self-harm content comes through chatbots that are not user-to-user. I appreciate that the Minister said the Department is looking into that issue with Ofcom.
The Act must also go further to address emerging online threats. The Internet Watch Foundation also reports that intimate images online are increasingly generated by deepfake AI, and that expert analysis now struggles to distinguish AI-generated content from real images or videos. At the beginning of this year alone, the IWF found 1,200 photorealistic videos of child sexual abuse material online. The Online Safety Act must do more to hold big tech companies to account, and to protect users from intimate image abuse at source, both real and AI-generated. Importantly, it must also tackle self-harm that is linked to AI chatbots, which are increasingly used by people of all ages.
Although this statutory instrument is a step forward, we need regulation that keeps pace with the rapidly evolving technology, not just changes in statute. We must ensure that Ofcom is sufficiently equipped and resourced to deal with emerging technologies. Will the Minister confirm what assessment has been done of the adequacy of Ofcom’s resourcing to ensure that this statutory instrument and the Online Safety Act can be applied and enforced in this fast-moving environment? When can we expect updates on AI chatbots and the scope of regulation? Will the Minister also confirm what the Government are doing to effectively regulate deepfake intimate content? What steps are being taken to hold tech companies to account for the continued harm facing children, vulnerable people and, given that experts can no longer differentiate between deepfake and real images, all internet users?
(3 months ago)
General Committees
Victoria Collins (Harpenden and Berkhamsted) (LD)
It is a pleasure to serve under your chairmanship, Dr Murrison. The Liberal Democrats support the statutory instrument, as it will simplify market access for manufacturers, reduce duplication in testing and certification, and facilitate UK exporters’ entry into Japan and Singapore for smart connected consumer products. It demonstrates the important principle that cutting red tape is vital to promoting economic growth and reducing compliance costs for businesses—which is why the Liberal Democrats, alongside many businesses, are also calling for a customs union with the EU. That would similarly break down the bureaucracy holding British businesses back and boost our economy.
We must, however, ensure that safeguards remain. Given the critical importance of maintaining robust cyber-security protections, can the Minister confirm what oversight mechanisms are in place to monitor ongoing alignment with these international schemes, and how these measures will be integrated into the long-awaited cyber-security and resilience Bill, which will be vital in keeping our economy safe?
(3 months, 3 weeks ago)
Commons Chamber
Victoria Collins (Harpenden and Berkhamsted) (LD)
I thank the Secretary of State for advance sight of this statement, but I am quite frankly disappointed that this is how we are starting the conversation on digital ID in Parliament. We Liberal Democrats believe that freedoms belong to citizens by right, but the Government’s plans for digital ID for every single working person risk eroding the hard-won freedom to control the way we live our lives. They risk excluding millions of vulnerable people from their own society and wasting billions in public money chasing expensive solutions that will not work. Yet again, it is a gimmick to tackle irregular migration—something I had hoped was reserved for the Conservatives. Yet again, by eroding public trust with these rushed, retrofitted policies, the Government have squandered an opportunity to use technology to improve public services by bringing people with them. In addition, the Government announced this—a scheme that will impact every single working person in the UK—weeks before it could be scrutinised by Parliament.
Any claims from this Government that this scheme will be non-compulsory and give agency are poppycock in reality. As a requirement for the right to work, it is mandatory ID in all but name—the Secretary of State said so herself just now. Where is the choice in that? Last week, the Foreign Secretary proposed issuing digital IDs for teenagers. This is clear Government mission creep, and it is dangerous.
Liberals have always stood up against concentrations of power, and for good reason. We have seen the Government’s abject failure to secure people’s data before—just ask the victims of the Legal Aid Agency data breach or the armed forces personnel who were victims of the Ministry of Defence data breach whether they have faith in the Government to keep their most personal data secure. How can the public have trust in the Government to manage a system that will manage the data of almost the entire population?
Will the Secretary of State commit to publishing an impact assessment for the 8.5 million people without foundational digital skills, such as my constituent Julie, who does not own a smartphone and is fearful of being excluded from employment, healthcare and other essential services? Will the Secretary of State come forward with a plan to reduce the risk of further marginalisation?
All these serious concerns, from privacy to exclusion, come at a staggering cost. This scheme will cost the taxpayer billions—money that will be wasted on a system doing little to tackle the Government’s stated aims of immigration enforcement. Meanwhile, our public services are crumbling. Finally, I ask the Minister how much taxpayer money the Government are prepared to waste on this—a scheme for which they have no mandate and no public support—before they admit it does not work.
I will try to keep this brief, Madam Deputy Speaker. The hon. Lady raises a number of different issues that I mentioned in my statement. On digital exclusion, we have a digital inclusion action plan and will be spending £9.5 million in local areas to help people who are currently excluded to get online. We will be publishing a full consultation on that, and I am sure she will feed in her views.
It is interesting that the Liberal Democrat leader, the right hon. Member for Kingston and Surbiton (Ed Davey), said last month that if a UK system were about giving individuals the power to access public services, he could be in favour of it. I hope the Liberal Democrats drop their partisan approach and work with us to deliver the system. I say to the hon. Lady and to other hon. Members that many, many other countries have digital ID systems. The EU is rolling out a digital ID system in all member states—
(5 months ago)
General Committees
Victoria Collins (Harpenden and Berkhamsted) (LD)
Keeping children and vulnerable people safe online is vital. For far too long, the online world has been a wild west, where children are subject to a torrent of harmful content, from pornography to suicide promotion. The call for this measure has come not only from parents, teachers and experts but from young people themselves: during my safer screens tour in Harpenden and Berkhamsted, they told me that the algorithms are pushing explicit and harmful content that they do not want to see. The topic is so important that students from Ashlyn’s and Berkhamsted schools have joined forces to lead the work themselves.
Today this Committee looks at qualifying worldwide revenue, which is important as it is linked to the level of fines. With the roll-out of the Act, the Lib Dems call on the Government to ensure a review of Ofcom fines to ensure they are enforceable and act as a true deterrent, especially given the pushback already seen from companies. We also ask Government to ringfence the fines generated by Ofcom under the Online Safety Act, for purposes including funding the provision of stand-alone education on online safety and safer screens for all school children.
Overall, it is important to highlight that the Online Safety Act contains vital safeguards against priority illegal content, requiring online platforms to tackle material depicting offences including child sexual abuse, intimate image abuse and sexual exploitation. However, we know that many have raised concerns about the implementation of the Act, including about its effectiveness in preventing online harms and about its impeding access to educational sites and important informational forums. Concern has also been voiced that age-assurance systems may pose a data protection or privacy threat to users. We therefore believe that Parliament should have the opportunity to properly scrutinise Ofcom’s implementation. We use this opportunity to call again on the Government to conduct a full and urgent parliamentary review of the Act, to ensure that it meets its stated aim of keeping children and other vulnerable groups safe online, and to determine whether the Act is fit for purpose.
(6 months, 2 weeks ago)
Commons Chamber
Victoria Collins (Harpenden and Berkhamsted) (LD)
Britain’s musicians have long been among our most beloved cultural treasures. In the crowded field of excellence in our creative sectors, our musicians are some of our proudest exports. They are part of a £124 billion industry that drives our economy, so support for our legacy and session musicians is long overdue and very welcome. The musicians covered include the Devines in Berkhamsted, up-and-coming artists like Myles Smith, and national treasures like Elton John—I agree that Adele is one of our national treasures—and, as was mentioned, all those around them: the songwriters, the producers and everyone who supports them.
Technological change means that online streaming now constitutes the vast bulk of music consumption, and 120,000 new tracks a day are uploaded to music platforms. This often leaves a hole in musicians’ income, so it is absolutely right that the Government are taking this issue seriously. We simply need to get this right, so I ask the Minister to clarify for the House how much confidence we can really have that the principles he is spelling out will finally lead to a more equitable distribution of streaming revenue. Ultimately, this is a label-led, voluntary framework; where is the independent oversight? Crucially, what guarantees are there of consistency or enforcement across the industry?
We have raised this issue many times in the past, but it remains true that if we are serious about protecting artists’ right to remuneration, we need to ensure that copyright, which has underpinned success for decades, works in our digitally evolving world. Musicians and creatives face an AI tsunami, which could pose a threat to their livelihoods; we need to tackle it seriously. I conclude by asking the Minister once more to consider swifter action from the Government on copyright and data mining, in order to support our musicians and creatives, as well as innovation across the UK.
I think the hon. Member is in danger of becoming a national treasure herself. [Interruption.] Oh, I see that I have not united the House on that, but—[Interruption.] The right hon. Member for Daventry cannot keep heckling; he is the shadow Health Secretary now.
The important point is that the hon. Member for Harpenden and Berkhamsted (Victoria Collins) asked what confidence we can have that this will be adhered to, and I am very confident that it will. I have had face-to-face conversations with all the chief executives of the major record labels, and although sometimes I have been asking them to go further, they have gone that extra mile, and I am absolutely sure that they will deliver on this. I am confident that any legacy artist who wants to renegotiate their contract will be able to do so. We will be looking at precisely how that happens.
If anybody is not happy with their renegotiation, we have included in the principles a means of appealing. That is obviously a major role of the Musicians’ Union, but if by the autumn we suddenly find that lots of musicians are saying, “Excuse me, but I haven’t managed to renegotiate with my label”, then we will be returning to this issue. The record labels are fully aware of that, but they are determined: each is going to put together a bespoke package to try to revitalise legacy work. They are also looking at wiping off unrecouped balances and making sure people can earn more into the future.
The one thing I have always been nervous about is that I do not think Governments should be writing contracts. This is really important. Julie Andrews, when she took the role of Maria in “The Sound of Music”, decided—or this is how the negotiation ended up—that she would just take an up-front fee, and she never got paid any royalties thereafter. That was probably a poor decision, or she was not given any other choice. However, I think Schwarzenegger, when he made movies, quite often decided to take the royalties and did not take any up-front fees. Different artists will enter renegotiation in different ways, but we wanted to rebalance the equation so that it is more in the interests of the musicians, and that is what we have done.
(7 months, 1 week ago)
Commons Chamber
Victoria Collins (Harpenden and Berkhamsted) (LD)
First, I echo the congratulatory comments about the hon. Member for Newcastle upon Tyne Central and West (Dame Chi Onwurah)—they are absolutely deserved.
Donald Trump’s proposals to ban US states from regulating AI for 10 years have been condemned by Microsoft’s chief scientist, showing that we cannot trust the US to provide safe and sensible AI regulation. Does the Minister agree that now is the time for the UK to lead on AI safety, and will he join me and the head of Google DeepMind in calling for an AI safety agency modelled on the International Atomic Energy Agency and headquartered here in the UK?
Both the Under-Secretary of State and I have been remiss in not congratulating my hon. Friend the Member for Newcastle upon Tyne Central and West (Dame Chi Onwurah) on her damehood. As you know, Mr Speaker, all knights love to see a dame enter the Chamber. The Under-Secretary of State and I work closely on AI and copyright, and on making sure that we have the AI safety and security that we need. The Liberal Democrat spokesperson makes a fair point and it is one of the things that we are considering at the moment.
(7 months, 3 weeks ago)
Commons Chamber
I call the Liberal Democrat spokesperson.
Victoria Collins (Harpenden and Berkhamsted) (LD)
I would like to either disappoint or reassure the House that, sadly, I do not have a story for Members today. I will dive straight into the amendments that are before us.
Just three months. After all the discussions and the cries for fairness from the creative industries, which have seen the daylight robbery of their life’s work, the Government are sending back an amendment that, in essence, changes the economic assessment from 12 months to nine months, with a progress statement and some expansion. I understand that this is the data Bill, and that this legislation contains many important elements relating to the future of our data, which we must secure. In response to the point made by the Minister, I absolutely understand the importance of securing data adequacy with the European Union. However, the creative industry is at a critical juncture with AI. Many feel that it is already too late, but they are doing what they can, fighting for transparency and fairness for a £126 billion UK industry.
The Creators’ Rights Alliance has already started to see the impact on creators. Some 58% of members of the Association of Photographers have lost commissioned work to generative AI services, with an increase of 21% in the past five months alone, totalling an average loss of £14,400 per professional photographer—approximately £43 million in total. Some 32% of illustrators report losing work to AI, with an average loss of £9,262 per affected UK creator. There is an uncomfortable truth that economic gains from AI—of which I am sure there will be many—will also be met with economic losses that must be addressed. Indeed, at Old Street tube station, there are signs everywhere at the moment saying “Stop hiring humans.” Some 77% of authors do not know whether their work has been used to train AI, 71% are concerned about AI mimicking their style without consent, and 65% of fiction writers and 57% of non-fiction writers believe that AI will negatively impact their future earnings. At this point, the creative industry feels betrayed, and is asking for solutions.
I also welcome the Secretary of State’s statements this weekend. He talked about looking comprehensively at the challenges creatives will face into the future and about bringing legislation in at the right time, but that time is now. Last week’s Lords amendment 49F highlighted that the Lords understood the need for separate legislation and asked for a draft Bill looking at copyright infringement, AI and transparency about inputs, which is something that creatives have been clear about from the start. I have always highlighted the positive impacts of technology and innovation, and I have no doubt that creatives will also use AI to aid their creativity. However, from musicians to film makers and photographers to writers and painters, the works of this massive industry have been swallowed up, and creatives are left wondering what that means for them—especially as they are already starting to see the impact.
In my constituency of Harpenden and Berkhamsted, I see the creative spirit everywhere. There is Open Door, a caring oasis whose walls are covered with work by local artists; the Harpenden Photographic Society, established more than 80 years ago, where generations have learned to capture light and moment through their lens; the Berkhamsted art society, where painters and craftspeople gather to nurture each other’s artistic journey; and the Berkhamsted Jazz group, who get us up and dancing. These creators are the threads that weave the rich tapestry of British culture, and the creative industries permeate our towns, including through the filming of box office hits such as “Guardians of the Galaxy” and “Robin Hood: Prince of Thieves” at Ashridge. Who will be the guardians of this creative galaxy? And why does this theft feel a little less heroic than Robin Hood’s?