(1 week ago)
Commons Chamber
Natasha Irons (Croydon East) (Lab)
I thank the Minister for the decisive action that he took over the recent Grok incident. Given the scope of the consultation and the fact that we are talking about online harms, I want to flag the issue we have around content on YouTube, which is a video-sharing platform, not necessarily a social media platform. The type of content that our children are consuming on there is a quick succession of images, which is not very good for a child’s development, rather than the slow-paced stuff we get when we watch a broadcaster. Will the consultation look at the quality of content on these platforms? Not all screentime is equal; some screentime can be quite dangerous for a child’s development in general.
Kanishka Narayan
Both of my hon. Friend’s points—on the scope of how we look at particular platforms and at their functionalities—are not just considered by the consultation, but deeply important. I engaged with the Australian Minister on this issue just last week, trying to understand their experiences of this and the uncertainty of getting those two things right. That is exactly why the consultation has been an appropriate approach in this context.
Where services fail to comply with their duties in the Act, Ofcom’s enforcement powers include fines of up to £18 million or 10% of qualifying worldwide revenue. Ofcom has indicated that it has issued financial penalties to six companies under the Online Safety Act amounting to more than £3 million. I can confirm to the House that just yesterday, Ofcom announced that it has fined a porn company £1.35 million for failing to introduce proper age verification on its websites—the largest fine levied so far under the Act. I welcome this strong action to protect children online.
We have always been clear that while the Online Safety Act provides the foundations, there is more to do to ensure that children live enriching online lives. Like all regulatory regimes, it must remain agile. That is all the more critical given that we are dealing with fast-moving technology. That is why this Government have already taken a number of decisive steps to build on these protections.
The first act of my right hon. Friend the Secretary of State was to make online content that promotes self-harm and suicide a priority offence under the Online Safety Act. That means that platforms must take proactive steps to stop users seeing this content in the first place. If it does appear, platforms must minimise the time that it is online. As well as that, both intimate image abuse and cyber-flashing are now priority offences under the Online Safety Act.
Last month, my right hon. Friend the Secretary of State stood in this Chamber and made it clear that the creation of non-consensual deepfakes on X is shocking, despicable and abhorrent. She confirmed that we would expedite legislation to criminalise the creation of non-consensual intimate images, and I am pleased to confirm to the House that that came into effect earlier this month. That will also be designated as a priority offence under the Online Safety Act, and it complements the existing criminal offence of sharing or threatening to share a deepfake intimate image without consent.
Alongside that, it was announced that we will legislate to criminalise nudification tools to make it illegal for companies to supply tools to be used as generators of non-consensual intimate images. Last week, we went further still and announced that we will introduce a legal duty requiring tech companies to remove non-consensual intimate images within 48 hours of them being reported. These measures will provide real protection for women and girls online.
However, we recognise the strength of feeling up and down the country and right across this House—not least in this debate. We share the concern of many parents about the wider impact of social media and technology on children’s wellbeing. The rapid growth of grassroots campaigns such as Smartphone Free Childhood highlights how concerned parents are about the pull of these technologies and what it means for their children. That includes the potential impacts on mental health, sleep and self-esteem.
We have set out our commitment to supporting parents and children with these issues. We want to find solutions that genuinely support the wellbeing of our children and to give parents the help that they need as they guide children through online spaces safely.
There were very real and important debates during the passage of that Bill about legal but harmful material and whether people should be able to speak freely online. Our approach was to seek to create a space where adults can speak freely while accepting that children should not be in some of these spaces. That was the point that the Leader of the Opposition was trying to make.
We were moving very dangerously into the realms of free speech, and it is not for an online regulator to start telling people what they can and cannot say online when it is not something that is illegal to speak of in the real world. That was the challenge that we got ourselves into as a Government, and that is why we changed parts of the approach that we were taking to the Online Safety Bill. I appreciate the concerns that are being raised, and I am trying to answer them as honestly and straightforwardly as I can.
When we consider the amendment from Lord Nash, this House will have its opportunity to make an unequivocal statement of principle: that when we believe that something is harming children at scale, we accept that it is insufficient to leave the status quo unchallenged or simply to commission a consultation. That applies especially when it is a consultation to which this Government have provided absolutely no political direction or view and that has been much trailed but still not actually launched. In truth, this consultation was not ready. It was a mechanism to get the Prime Minister out of another of his tight fixes.
The Tech Secretary might be very good at emoting and telling us all how impatient she is for change, how she cares, and indeed for how many years she has cared, but when she made her statement on social media for children in this Chamber a few weeks ago, she said nothing about what the Government would actually do, beyond seeking more time to take a position. I commend the hon. Member for Twickenham for pointing that out, and I have sympathy with why she is trying to use this mechanism today, because we are all trying to tease out what the Government are seeking to do.
It was extraordinary to listen to the Government Minister, who said with great sincerity, “We will act robustly in responding to a consultation.” What does he actually believe? What do the Government think we should do on this issue? Nobody has a clue. They are talking about a huge range of things that could be done, but it is for a Government to provide political direction; it is not for a Government to seek consensus. [Interruption.] It is for a Government to take a position and to take a view. It is for a Government to have opinions. It is for a Government to have policy positions. It is not for a Government to try to make sure that everybody in this House agrees. [Interruption.] It is pathetic to see those on the Labour Benches getting out of their tree about this.
Natasha Irons
I sincerely thank the hon. Lady for giving way. When we talk about the consultation, it is not necessarily about seeking consensus in this place; it is about seeking consensus with parents and children, and with people outside this place. Banning social media for children is a good approach, but this is not just about that, is it? It is also about the time that our kids are spending on screens. That is what this is about: it is about having a digital childhood that we can all get behind and support.
I can agree with that. My point is that this Government are trying to suggest that a consensus can be found in the absence of their having a policy position. They are talking about a consultation, but what on earth are they consulting on? Nobody has a clue. They have not been able to say anything about what they actually want to do, because the Prime Minister has no opinions, which is why he is in such deep trouble. Those on the Labour Benches can get out of their tree and get all uppity about it, but this—[Interruption.] No, the Prime Minister is being blown around like a paper bag on this issue, and everybody knows it. First of all, he said that his children did not want to ban social media; now he says that his children are the reason why he wishes to ban social media. He said there is going to be a consultation, but it has not materialised. What does this man actually think?
I agree with the hon. Member wholeheartedly.
Until now, we have implicitly decided that childhood must simply adapt to an environment that we as adults find totally overwhelming, undermining of our own sense of self and completely irresistible. We have been exposing our children to this place of no settled social rules where that exposure is constant, the boundaries are porous and responsibility is diffuse. Behaviour that would never be tolerated offline is normalised, monetised and then algorithmically amplified. The Online Safety Act, which we have discussed already, has been a step forward in trying to wrest back control, but it is, of course, an imperfect one. It focuses primarily on illegal content, seeks to keep the most extreme material offline and introduces age-gating for pornography and other over-18 content. That work does matter, but the problem before us today goes well beyond illegality and explicit material. There are also many concerns about the complexity of policing content, in terms of both the implementation and intent.
The central question is not just what children see but how social media works. Social media platforms are addictive by design. Their algorithms are engineered to maximise engagement and stickiness. They reward outrage, comparison, emotional intensity, competition and repetition. They draw children away from purposeful activity and into feedback loops that erode attention and resilience. Not all platforms operate like this globally, funnily enough. The Chinese version of TikTok is time-limited and feeds children content of scientific or patriotic value. In the west, it is emotional arousal that is fed to our kids.
Children are not simply consuming content; they are being shaped by the environment itself. It is happening when their brains are still developing. Their impulse control, emotional regulation and ability to assess risk are not the same as for adults. We recognise this everywhere else in law—in alcohol limits, in safeguarding rules and in age of consent protections—yet online we have decided to suspend that logic, and the consequences are increasingly visible.
Natasha Irons
I am new to this place and clearly still learning, but I am wondering why, in that case, measures on designing out at source the harms that the hon. Member is talking about were watered down in the Online Safety Bill. She is absolutely right: we are creating online worlds, and they should be designed to be safe. Just as we design clothes for children that do not have toxic materials in them, we would hope that the spaces they inhabit online also do not have toxic material in them, so why were those protections not strengthened in the Bill that the Conservative party passed when it was in power?
I have set out before what we were trying to achieve with the Online Safety Act and why certain things were in it and others were not. I do not want to go over that again.
The consequences of these design features are increasingly visible, including rising anxiety and low mood, poor sleep, shredded attention spans and cyber-bullying that follows children home.
I really thank the hon. Member for that intervention, because that is exactly one of the recommendations of the Committee’s inquiry. As he says, the advertisement-based business models of most social media companies mean that they promote addictive content regardless of authenticity. This spills out across the entire internet via the unclear, under-regulated digital advertising market, incentivising the creation of content that will perform well on social media, as we saw during the 2024 unrest following the horrendous Southport attacks.
This is not just a social media problem, though. It is a systemic issue that promotes harmful content and undermines public trust. The Committee identified five key principles that we believe are crucial for building public trust. The first is public safety. Public safety matters; I hope it is not necessary to debate that. The second is free and safe expression, which is also very important. The third is responsibility on the part of the platforms. Right now, they have no legal responsibility for the content they amplify; they just have to follow their own processes in certain specific cases. Our fourth principle involves control, and the fifth and final principle is transparency. We made detailed recommendations on regulating the advertising-based business model so that amplification would not be incentivised in the way that was outlined by the hon. Member for Carshalton and Wallington (Bobby Dean). We also recommended a right to reset—the right of a person to remove their data from any algorithm.
Our report came out not long before the Minister took up his position. The Government accepted all our conclusions but none of our recommendations. I urge them to look again at our recommendations and to consider implementing them, or at least to respond and tell me why they are still not to be implemented. I welcome the Government’s recent actions and interventions and their readiness to intervene. As I said, the consultation is critical. I welcome the desire to promote a consensus and to take measures to ensure swift delivery of the consultation conclusions through the Children’s Wellbeing and Schools Bill. The consideration of the inclusion of AI chatbots is important, as is addressing the risky features in certain models, as well as providing support for bereaved parents. The Committee looks forward to working with the Government to try to achieve their aims. We need evidence to drive policy and regulation based on principles that the public can have confidence in.
Natasha Irons
I wanted to intervene on the point about principles, content and responsibility. I worked for Channel 4 before I came to this place, and we were regulated by Ofcom. Channel 4 did not create its own content, but was responsible for the editorialisation of that content. It was beholden to certain standards. Does she agree that we should be holding these media companies—they are not now “new media” companies, but legacy media companies—just as responsible for the content they put out on their platforms as any broadcaster?
My hon. Friend makes an important point; the insight she brings from her career in the media is critical. For many years, while the platforms were just that—platforms on which other people placed content—there was an argument that they should not be regulated and that they did not have a responsibility for the content on them, but they are at the very least active curators of that content now. Algorithms effectively form digital twins of individuals and then drive individualised content at them. That requires a responsibility. The time is right, as our Committee recommended, to ensure that platforms have responsibility for their content.
The Science, Innovation and Technology Committee will be holding a one-off session on social media age restrictions on 11 March to feed into the Government’s consultation on measures to keep children safe online and to hear from social media companies on their progress in the last year. We will also gauge the strength of the evidence for and against an age-based ban on social media, as well as any evidence relating to proposed alternatives to a ban. In doing so, we will hear from experts and representatives of those with direct experience of harms. We want to hear from both sides of the debate in the UK and will be seeking evidence from Australia on the first few months of the ban that is already in force there. We will be hearing from major social media and technology companies in a follow-up to our algorithms and misinformation inquiry, and we will ask for their views on the proposed age limits.
Finally, the work on social media age restrictions will feed into a larger inquiry on the neuroscience of digital childhood, which we will launch in the coming weeks. We want to find out how young people spending their formative years online affects their brains and what the Government should do to protect them from any negative impact. That could cover the impact of social media and other screentime on brain development, behaviour, and physical and mental health, whether positive or negative. It could also cover the physiological impact on eye development, the impact on socialisation and what actions Governments should take. There is a consensus on the need to do something, but not on what needs to be done. That is why we are seeking to provide evidence.
I always say to the platform companies that the opposite of regulation is not no regulation, but bad regulation. More regulation is coming. Several US states, such as California, have brought in new regulation on big tech. The Spanish Prime Minister has called social media a
“failed state where laws are ignored and crimes are tolerated”.
There is also the increasingly significant issue of technology sovereignty and whether we are too dependent on foreign companies for our online environment. I call myself a tech evangelist, and I am, but I also know how much an engineer costs. The starting salary of an AI engineer—if companies can find one—is well over £100,000 a year. Tech companies are not going to put them to work on protecting and keeping our children safe unless the House puts the right incentives in place. With all due respect to the Minister and the Online Safety Act, which he inherited, they are not in place now.
(1 month, 1 week ago)
Commons Chamber
Natasha Irons (Croydon East) (Lab)
I welcome the Government’s launching of a national conversation about the impact of screens and tech on our children. As part of the Culture, Media and Sport Committee’s inquiry into children’s TV and video content, we have heard evidence that not all screen time is equal. Content that is meant for engagement can be beneficial for a child’s development, but content that is designed for attention, such as fast-paced images bombarding children, can be harmful. Will the Secretary of State ensure that the consultation looks at the quality and purpose of the content that our children are consuming on video-sharing platforms such as YouTube and at what more can be done to ensure that platforms support the right type of content for our kids?
Absolutely. I was at Braunstone community primary school in my constituency not long ago, where I was shown the incredible power of AI to help teachers with lesson planning. One teacher told me that using it meant that he could free up 30 days a year to be present for his kids in school and his family at home. Teachers were also using AI tutors to help narrow the disadvantage gap between rich and poor kids. We need to look at the quality of screen time, so my hon. Friend’s point is very well made.
(8 months ago)
Commons Chamber
Part of our reforms are to ensure that those who can never work are properly supported and not put through endless assessment, and I thank the hon. Member for raising this case. We are committed to renewing the nation’s contract with those who have served, and a range of support is in place for veterans, including dedicated medical and physical healthcare pathways in the NHS, employment, and housing. The new support system, VALOUR, backed by £50 million of funding, will provide a network of support centres to connect veterans with local and national services.
Natasha Irons (Croydon East) (Lab)
I completely agree with my hon. Friend that the Conservatives left a trail of devastation across education and youth services. [Interruption.] Given half the chance, judging by their moans, they would do it all over again. We are making different choices—working with young people to draw up a landmark new national youth strategy, investing £145 million this year to provide stability to the youth sector, rolling out youth future hubs to expand access opportunities and reduce crime, and extending access to mental health support to nearly 1 million more children this year.
(9 months, 2 weeks ago)
Commons Chamber
Notwithstanding the views of the Chinese Government, it is a delight to see you in your place, Madam Deputy Speaker. I am only saddened that I have not been sanctioned, which feels a shame—nor by Russia, for that matter. There is still time.
I am delighted to be here today to discuss the Bill, which we last discussed in depth a week ago today. First, I would like to express how pleased I am that the other place has agreed to the Government’s amendments relating to the national underground asset register and intimate image abuse. I pay tribute to all those Members of the House of Lords who took part in getting that part of the legislation to the place where it is now. I am glad we have been able to work with them. I will start by encouraging the House to agree to those amendments, before I move on to discuss the amendments relating to AI and intellectual property, scientific research, and sex and gender—in that order.
Lords amendments 55D, 55E and 56B, which were introduced to the Bill in the other place by the noble Baroness Owen of Alderley Edge, place a duty on the face of the Bill that requires the Government to: review the operation of the “reasonable excuse” defence in the offences of creating and requesting intimate image deepfakes without consent, or reasonable belief in consent; publish the outcome of the review in a report; and lay that report before Parliament. The Government were pleased to support the amendments in the other place, as we share the desire to ensure that the criminal law, and these offences in particular, work as the Government intend.
Natasha Irons (Croydon East) (Lab)
I think we all appreciate the amendment, because we want to protect vulnerable women, children and anybody who is at risk of this sort of harm. Could we not look at doing something similar to the amendment, and the carve-out we have created with it, for our creative industries? If we can protect our vulnerable people, can we not also protect our creative industries from copyright infringement by having territorial exemptions similar to what we have with deepfakes?
My hon. Friend is jumping the gun slightly—I will come on to those issues.
I want to praise Baroness Owen with regard to this part of the legislation. If it had not been for her, I do not think it would have ended up in the Bill. There was a bit of to-ing and fro-ing between her and the Ministry of Justice to ensure that we got the legislation in the right place. As I said in last week’s discussions, one of the issues was whether Baroness Owen’s original version of the second offence really worked in law; I think she agreed that our version, which we tabled in Committee, was better. We have been able to tidy up the question of the reasonable excuse. It is perfectly legitimate to ask how on earth there could be a legitimate or reasonable excuse for creating one of these images or asking for one to be created, and we went through those debates previously. I am glad that the Government have come to a settled position with Baroness Owen, and that is what I urge everybody to support here today.
The Government made a manifesto commitment to ban sexually explicit deepfakes, and the Bill delivers on that promise. For the first time, there will be punishment for perpetrators who create or ask others to create intimate deepfakes of adults without consent.
Secondly, I turn to the national underground asset register, which it does feel has been a long time coming. Of course, that is partly because the Bill is in its third iteration. Amendment 34 relates to the national underground asset register. An amendment was previously tabled in the House of Lords requiring the Secretary of State to provide guidance on cyber-security measures, which was rejected by this House. Last week, the Government tabled amendments 34B and 34C in lieu on this topic, which were drafted with the support of the security services. These amendments expand the scope from cyber-security only to general security measures, clarify the audience for the guidance and extend its reach to Northern Ireland, alongside England and Wales.
On all the amendments I have spoken to thus far, I thank our noble colleagues in the other place for their work and support to reach agreement in these areas. I urge colleagues here today to support these amendments, too; otherwise, we are never going to get the Bill through.
(10 months, 1 week ago)
Westminster Hall
Natasha Irons (Croydon East) (Lab)
It is an honour to serve under your chairmanship, Ms McVey. I thank my hon. Friend the Member for Bury North (Mr Frith) for securing this important debate. I should declare that my husband is a voiceover artist.
As with every technological leap forward, whether from theatre to cinema or television to streaming, protecting the rights and income of our creators does not create a barrier to innovation or growth. As we move into an AI-powered future, it is even more crucial to protect creators with transparency, consent and compensation for the content used to train AI models.
Our creative industries are a great British success story, worth more than £125 billion to the UK economy and supporting more than 2.4 million jobs. What underpins that success is the principle that those who create content are paid for it, and copyright protections have been the bedrock of that principle for decades. The case made for updating UK copyright law to accommodate AI training is that the current framework is unclear, but there is no such ambiguity. If someone plays music in a club without a licence or sells counterfeit DVDs, they are breaking the law. If AI companies wish to train their models on copyrighted content, they have to get consent to do so.
AI companies may be harder to hold to account because their models are opaque, but that makes this a transparency and enforcement issue, not a legal one. Our content, our books, our journalism and our music are the oil needed to fuel generative AI systems. I do not think anyone would argue that oil should be mined and used for free by any other industry, so why should it be any different for the precious resource that is creative content? Creating generative AI systems with no accountability and no remuneration is not innovation; it is simply exploitation.
I welcome this Government’s commitment to our creative industries and to finding a solution fit for the future, but the current proposal of an opt-out system is unworkable and unfair. The Government even acknowledge that the technology to implement an opt-out system does not exist. We must uphold the rights of our content creators by upholding copyright protections and giving creators the transparency, consent and compensation they deserve.
(1 year, 6 months ago)
Commons Chamber
Natasha Irons (Croydon East) (Lab)
I thank the hon. Member for Dorking and Horley (Chris Coghlan) for his moving and powerful words. It is an extraordinary thing to become a Member of Parliament, and becoming a Labour Member of Parliament who gets to sit on the Government Benches is even more extraordinary. I share the sense of pride and urgency that we have heard from my hon. Friends the Members for Glasgow West (Patricia Ferguson) and for Stoke-on-Trent South (Dr Gardner) and from the hon. Member for North East Hampshire (Alex Brewer) in their maiden speeches, and I congratulate them on their contributions to this debate.
It is the honour of my life to represent Croydon East in this place, so I thank the people of this new constituency for putting their faith in me. I will never forget what a privilege it is to be their voice in Parliament, and I will do all I can to fight for Croydon’s future. As south London’s most iconic borough, Croydon is a place so big that it needs four Members of Parliament to represent it. Like the reunion tour of a much-loved ’90s band, my constituency has been reformed under an old name with new boundaries, so I pay tribute to my predecessors, starting with the right hon. Member for Croydon South (Chris Philp) for his work for the people of Selsdon and his commitment to public service. While I cannot apologise for campaigning against him in the election, I can attest to the number of constituents who spoke highly of him and his contribution to their community when I met them on the doorstep.
I also pay tribute to my hon. Friend the Member for Croydon West (Sarah Jones), who is not only Croydon’s first female MP, the founder and chair of the all-party parliamentary group on knife crime and violence reduction and now a Minister in this Government, but the kind of MP where everybody knows someone who has been helped by her. I thank her for her kindness, advice and encouragement over the past few months, and hope to build on her legacy and continue to stand up for Croydon in the way that she has always done. I am thrilled to join those Members and my right hon. Friend the Member for Streatham and Croydon North (Steve Reed), someone who continues to work tirelessly for his constituents and who, as Secretary of State, is now taking on the urgent fight to clean up our waters. I look forward to us working together in the best interests of Croydon.
Croydon East represents so much of what is great about our city. Whether it is the bustling life of South Norwood, the close-knit communities of Addiscombe and New Addington, or the stunning views from Shirley Hills, Croydon East is a place filled with diversity, ambition and strong values. However, even in this vibrant corner of south London, the shadow of inequality persists. Whether it is the fact that a child in Croydon East is twice as likely to live in poverty as one just a few miles away in Croydon South, or that a man living in New Addington North has a life expectancy a decade shorter than a woman living in Shirley South, these stark disparities remind us that too many remain trapped in a cycle of inequality.
When we talk about technology in public services, we must not get lost in grand plans and digital transformations: we must remember those who use these services, and how they will be put to use. What do they mean for the community leaders at the coalface of the cost of living crisis—those who run services such as the Food Stop, Pathfinders and the Community Family Project in New Addington that often step up when there is no one else to step in? What do they mean to groups such as the Friends of Shirley Library, which is fighting to keep the library open—a place that not just tackles loneliness and isolation, but plays a critical role in closing the digital divide? What do they mean to my constituent Michael Lyons, a veteran who has campaigned to ensure that the memory of servicepeople from the first world war lives on? How will we ensure that digital public services remain as accessible to him as they are to the rest of us? With a considered application of technology, we have an opportunity to break down barriers instead of creating new ones, to bring people closer to democracy instead of driving them further away, and to rebuild our public services so they work better for the people who need them most.
As someone who grew up in south London in a family that was always political but did not think that politics was for people like us, I know that the decisions that we make in this place have the power to change lives for the better. Choices that have been made in this room took my family from sleeping on our living room floor to decent social housing. They enabled my mother to go back to university to retrain and to get a better job. And they gave me the opportunity to go to university, to get a career in public service broadcasting, to serve my community as a local councillor and to make my way to these Benches.
As an MP in London’s youngest borough, I want to spend my time here putting the wellbeing of our young people back on the political agenda. There is no longer-term plan than ensuring that our young people have the opportunities they deserve, so that they can go on to live the successful lives that we want them to. We need a national plan for what it is to be a young person in this country, and I look forward to showing how Croydon stands ready to lead that vital work.
Finally, every Member of the House will have those people in their lives who have supported them to get here, and I would like to take this opportunity to thank mine. I thank my hon. Friend the Member for Mitcham and Morden (Dame Siobhain McDonagh), who allowed me to see that there was room for people like me in places like this. Without her encouragement and advice, I simply would not be here. I can only hope to be half the MP that she is. I thank my mum and dad for their support and unwavering belief in me. It is their hard work and determination that changed our lives, and I am so proud to continue our family’s track record of serving our country. To my spectacular husband, who encouraged me to get involved in politics because he was sick of me shouting at the TV: thank you for your patience and your endless support, and for being the greatest dad imaginable to our lads.
It is really an extraordinary thing to be a Member of Parliament. I hope that in our time here, we make choices in this room that change the lives of the people outside of it for the better.
I call Caroline Voaden to make her maiden speech.