Online Safety Act: Implementation

Wednesday 26th February 2025

Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.

Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.

This information is provided by Parallel Parliament and does not comprise part of the official record

Ben Obese-Jecty (Huntingdon) (Con)

It is a pleasure to serve under your chairmanship, Mr Stringer. I thank my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) for securing this timely debate. His wealth of knowledge on this topic is clear, and his voice in pursuing the most effective iteration of the legislation has been constant.

The previous Government passed the world-leading Online Safety Act, which places significant new responsibilities and duties on social media platforms and search services to increase child safety online—aims that all Members can agree upon. Platforms will be required to prevent children from accessing harmful and age-inappropriate content, and to provide parents and children with clear and accessible ways to report problems online when they arise.

The evidence base showing that social media is adversely impacting our children’s mental health is growing stronger. The Royal Society for Public Health says that about 70% of young people now report that social media increases their feelings of anxiety and depression. It is for those reasons that Conservative Ministers ensured the strongest measures in the Act to protect children.

The Act places duties on online platforms to protect children’s safety and put in place measures to mitigate risks. They will also need to proactively tackle the most harmful illegal content and activity. Once in force, the Act will create a new regulatory regime to significantly improve internet safety, particularly for young people. It will address the rise in harmful content online and will give Ofcom new powers to fulfil the role of the independent regulator. Fundamentally, it will ensure services take responsibility for making their products safe for their users.

I note that the Government have said that they are prioritising work with Ofcom to get the Act implemented swiftly and effectively to deliver a safer online world, but I recognise the concerns of parents and campaigners who worry that children will continue to be exposed to harmful and age-inappropriate content every day until these regulations come into force. Will the Minister acknowledge those concerns in her remarks?

The Act places new duties on certain internet services to protect users from illegal content on their platforms. The purpose of those illegal content duties is to require providers of user-to-user and search services to take more responsibility for protecting UK-based users from illegal content and activity that is facilitated or encountered via their services.

In December, Ofcom published its finalised illegal harms codes of practice and risk assessment guidance. The codes of practice describe the measures that services can take to fulfil their illegal content duties, and they recommend that providers of different kinds and with different capacities take different steps proportionate to their size, capacity and level of risk.

The codes recommend measures in areas including user support, safety by design, additional protections for children and content moderation or de-indexing. Many of the measures in the draft codes are cross-cutting and will help to address all illegal harms. Certain measures are targeted at specific high-priority harms, including child sexual abuse material, terrorism and fraud. Those include measures on automated tools to detect child sexual abuse material and for establishing routes so that the police and the Financial Conduct Authority can report fraud and scams to online service providers. The included measures will also make it easier for users to report potentially illegal content.

Ofcom has also published guidance on how providers should carry out risk assessments for illegal content and activity. Providers now have three months to complete their illegal content risk assessment. Can the Minister update the House on whether the completion of the risk assessments will coincide with the codes of practice coming into force?

Another important milestone was the publication of Ofcom’s children’s access assessment guidance last month. Services will have to assess whether their service is likely to be accessed by children and, once the protection of children codes have been finalised by the summer, must put in place the appropriate protections, known as age assurance duties.

All services that allow pornography must, by July at the latest, implement highly effective age assurance to ensure that children are not normally able to access pornographic content. Together, the illegal harms and child safety codes should put in place an important foundation for the protection of users. For example, children will be better protected online, with services having to introduce robust age checks to prevent them from seeing content relating to suicide and self-harm, as well as pornography, and having to tackle harmful algorithms. Illegal content, including hate speech, terrorist content and content that encourages or facilitates suicide, should be taken down as soon as services become aware of it. Women and girls will be better protected from misogyny, harassment and abuse online.

The Government have said they are keen for Ofcom to use its enforcement powers as the requirements on services come into effect to make sure that the protections promised by the Act are delivered for users. Samaritans has called on the Government and Ofcom to

“fully harness the power of the Online Safety Act to ensure people are protected from dangerous content”.

Will the Minister confirm that the Government will fully back Ofcom in its enforcement of the illegal harms and child safety codes?

There are concerns that Ofcom appears to be relying on future iterations of the codes to bring in the more robust requirements that would improve safety. Relying on revision of the codes to bring them up to the required standard is likely to be a slow process. The requirement to implement the initial codes and guidance is significant and is unlikely to leave capacity for revision; furthermore, the Secretary of State's power to stipulate such revisions could slow the process further. To that end, it is essential that the first iteration of the codes of practice is robust enough to endure without the need for revision in the short term. Although that might be difficult to achieve in an environment that moves as quickly as the digital space, it must be strived for, lest we end up with legislation that does not hold online platforms to account and does not protect victims of online harms as it should.

As legislators, we have a responsibility to ensure that the online world is a safe place for our children. We also have a responsibility to ensure that online platforms take their obligations seriously. I am pleased that the previous Government’s Online Safety Act delivers on both those points. I urge the Minister to ensure that it is fully implemented as soon as possible.

Graham Stringer (in the Chair)

We have gained a considerable amount of time because of disciplined interventions and short speeches. I ask the Minister to ensure that there is a small amount of time at the end for the Member in charge to wind up.

Social Media Use: Minimum Age

Monday 24th February 2025

Westminster Hall


Ben Obese-Jecty (Huntingdon) (Con)

It is a pleasure to serve under your chairmanship, Mr Vickers. I thank the Petitions Committee for enabling this debate; Kim Campbell for launching the petition; the hon. and learned Member for Folkestone and Hythe (Tony Vaughan) for opening the debate; and the 128,000 signatories of the petition, including 225 people from my constituency of Huntingdon.

In a recent survey by More in Common of more than 2,000 parents, social media and excessive screen time was ranked as the top issue affecting children’s mental wellbeing: parents ranked it higher on the list of threats than alcohol, bullying and financial problems. Exposure to harmful content online was deemed the second biggest risk to mental health. The challenges facing children have changed astronomically in recent years. Children now face a boiling point of addiction, constant connectivity, online crime and harmful content. Many feel that it has become too much for children to handle.

The evidence base is growing stronger. Smartphones and social media are adversely impacting our children’s mental health. The Royal Society for Public Health says that about 70% of young people now report that social media increases their feelings of anxiety and depression. In increasing numbers, children are coming into schools up and down the UK having stayed up all night on their phone. A child who has not had a healthy night’s sleep is not equipped to contribute to the classroom, except perhaps to disrupt it.

Evidence from Health Professionals for Safer Screens shows that children who routinely spend extended periods on their smartphones have poorer eyesight, inhibited speech and language development, interrupted sleep and rising rates of anxiety. Smartphones are designed to be addictive. Platforms are constantly seeking to develop new design strategies that encourage children to stay online longer. Notifications, comments and likes are designed to drive feelings of happiness. It is easy for children to feel obliged to engage and even compete with their peers online.

Of course, children access social media on mobile phones via the internet. Individually, each tool brings its own benefits. Mobile phones allow children to let parents know that they have reached school safely, providing an extra safeguard that allows them greater and earlier independence. The internet itself allows children to further their education, whether through research tasks, homework or practising coding; it also provides better connectivity, information and entertainment. The internet is so integral to society that we must ensure that children have the skillset and the know-how to navigate it.

I expect that many hon. Members use social media every day, scrolling through their feeds, checking the news or drafting updates to their constituents. Social media has its benefits, not least because it allows us to communicate with people instantly and en masse, wherever they may be in the world. Used responsibly, social media can provide some benefits for children. Children may use it to stay connected with friends and family around the world. They may use it for civic engagement or to fundraise; they may use YouTube or short reels for online learning or content discovery.

The drawbacks, however, are considerable: addiction to their screen, online bullying and exposure to harmful content such as eating disorders, self-harm and body shaming. There is some bad content on the internet. It is deeply concerning that half of 13-year-olds reported seeing hardcore, misogynistic pornographic material on social media sites. There are widespread concerns that this is impacting the way young people understand healthy relationships, sex and consent. Half of parents worry that online pornography is giving their children an unrealistic view of sex. We see the same with knife crime: there is constant exposure to content that glamorises violence, exposes children to a world of criminality, gangs and scoreboard videos, and contributes to the perception that every teenager carries a knife and thus drives the urge for them to carry one themselves, too often with deadly consequences.

What can be done to tackle these issues? The previous Government passed the world-leading Online Safety Act, which places significant new responsibilities and duties on social media platforms and search services to increase child safety online. Platforms will be required to prevent children from accessing harmful and age-inappropriate content and to provide parents and children with clear and accessible ways to report problems online when they arise. As well as content, the Act applies to service functionality, including the way in which platforms are operated and used by children. Will the Minister confirm whether platforms will be obliged to manage and mitigate addictive functions if a provider’s risk assessment identifies habit forming that could cause sufficient harm?

We are cleaning up the online space with world-leading legislation and an enforced regulator, but I worry that that is not enough. We should be having a conversation about the use of mobile phones in schools. The previous Government took action and issued guidance backing headteachers in restricting access to phones in schools. However, new research has shown that only 11% of schools are genuinely smartphone free, while children at smartphone-free schools get one to two grades higher at GCSE. That is why the Opposition tabled an amendment to the Children’s Wellbeing and Schools Bill to ban mobile phone use in schools. It was disappointing that the Government rejected that amendment and that argument. Will the Minister update us on what conversations he has had with colleagues in the Department for Education about that policy?

Conservatives want to put the safety of children first. I hope that the Minister agrees with that aim. The More in Common poll showed that nearly nine in 10 parents—86%—backed raising from 13 to 16 the so-called digital age of consent, the point at which children should be allowed on social media. Some Members have also proposed banning social media for children under 16. I note that the Secretary of State has not ruled that out, saying that it is “on the table” and that he “is not currently minded” to enact such a policy.

Instead, the Government have announced the launch of a study to explore the effects of smartphone and social media use on children. It seeks to build the evidence base for future decisions designed to keep children safe online. The work is being led by a team at the University of Cambridge, with contributions from researchers at other leading universities. The project lead, Dr Amy Orben, says:

“There is huge concern about the impact of smartphone use on children’s health, but the evidence base remains fairly limited. While the government is under substantial time pressure to make decisions, these will undoubtedly be better if based on improved evidence.”

The Opposition agree that the evidence base needs to be improved, and we welcome the study.

The last piece of substantial Government-backed research into children and mobile phone use was completed in 2019, before covid. We know the devastating impact of lockdown on children and how pandemic restrictions forced children to connect with their friends and schoolteachers online. That pushed children towards technology and social media, potentially leading to irreversible changes in behaviour.

However, the timeline for the work is unclear. Although the research should be detailed and thorough, its publication should be timely. Will the Minister please outline when the study will report back to the Department and, given the dangers of delay, whether he has considered speeding it up? I am aware that the Children’s Commissioner has recently done some work to better understand the impact of mobile phones on children. Her insight could prove very valuable while the academics are researching in depth. I presume that the Minister has spoken to the commissioner, but can he update the House on what he has learned from those discussions? I would be grateful for the Minister’s comments on those points.

The poll is a clear illustration of the strength of feeling among parents, but we all know—from our own families and our conversations with parents, teachers and children in our constituencies—the impact on children of mobile phones and social media. As legislators, we have a responsibility to ensure that the online world is a safe place for our children. We also have a responsibility to ensure that online platforms take their obligations seriously. I am pleased that the previous Government’s Online Safety Act delivers on both those points, and I urge the Minister to ensure that it is fully implemented as soon as possible.

Data (Use and Access) Bill [Lords]

Ben Obese-Jecty (Huntingdon) (Con)

Opposition Members are broadly supportive of this Bill and its aims, which build on the work done by the previous Conservative Government and refined by Members of the other place. I will focus primarily on deepfakes and AI-generated images, specifically in relation to clauses 91 and 141.

I commend Baroness Owen for her work on non-consensual images and deepfakes, and for pressing the Government to address this issue with more urgency. Her work brought about a U-turn on the intent-based amendment, pushed the Government to agree on solicitation, and pushed them to act on the deletion of content.

Removing the intent-based amendment removes a huge hurdle for victims—a hurdle that would have required victims to prove the intentions of their tormentor. This would simply have placed more pressure on victims, while abusers would have been more likely to excuse their crimes.

It is right that solicitation is now included in the Bill, as many cases have shown that it is not enough just to criminalise creation; we must also criminalise the people who ask others to create these images and videos for them. Otherwise, we run the risk of a loophole where images are posted on to sites and people from other jurisdictions create the content and send it back to the person requesting it.

The Government tried to prevent custodial sentences from being an option for these abusers, as they argued that omitting “reasonable excuse” may breach the European convention on human rights. Given the content, I cannot see how perpetrators’ rights trump the rights of victims, but the change was made thanks to Baroness Owen and Members from across the parties in the other place, who persisted in making sure that the Government take all action needed to stop this growing criminal scourge.

We have seen a surge in deepfakes, revenge porn and nudifying apps. This technology is the wild west and, unchecked, poses a danger to many in society. We must act to protect the most vulnerable from harm. I welcome that clause 91 establishes requirements for the commissioner to report on what actions have been taken. It is right that we see what is being done to combat these crimes. We have heard many reports of sexual images of children being generated and spread. This causes so much damage, and once such exploitative images are created and disseminated, they are near impossible to eradicate.

As technology advances, we need to keep pace with new threats, lest technological change outstrip the pace of legislation. For too long, the law has been out of touch with fast-changing realities. There are apps just a few clicks away that allow users to generate their own AI boyfriend or girlfriend, and some of these apps can take real images and change them into sexually explicit figures that are already terrifyingly real. This is just one example of why we need further restrictions, with clear penalties for both platform providers and users.

One app, undress-ai, processed over 600,000 images of women within 21 days of launching. These were ordinary women, with no knowledge that their image was being doctored in such a way and used, even traded, for the gratification of others. This is simply not right.

Although consent is at the heart of aspects of the Bill, we need to look closely at provisions for withdrawing consent. This must be seriously considered, particularly where an image that is consensually exchanged is doctored into something that was never consented to.

Though I welcome aspects of the Bill, we must ensure that we keep up with the rapid pace of change. Apps that cause great harm are readily accessible. I hope to hear more about what can be done to assist people to withdraw consent, so that we can end this vile abuse.

Draft Online Safety Act 2023 (Category 1, Category 2A and Category 2B Threshold Conditions) Regulations 2025

Tuesday 4th February 2025

General Committees

Ben Obese-Jecty (Huntingdon) (Con)

It is a pleasure to serve under your chairmanship, Sir Christopher. The Online Safety Act will be one of the lasting accomplishments of the last Government. It is world-leading legislation that places significant new responsibilities and duties on social media platforms and search services to increase safety online. Most importantly, this vital legislation ensures that children are better protected online.

If it is worrying that children aged eight to 17 spend between two and five hours online per day, then it is deeply concerning that half of 13-year-olds reported seeing hardcore, misogynistic pornographic material on social media sites. It is for those reasons that Conservative Ministers ensured that there were the strongest measures in the Online Safety Act to protect children. For example, platforms will be required to prevent children from accessing harmful and age-inappropriate content and will provide parents and children with clear and accessible ways to report problems online when they arise.

Furthermore, the Act requires all in-scope services that allow pornography to use highly effective age assurance to prevent children from accessing it, including services that host user-generated content and services that publish pornography. Ofcom has robust enforcement powers available to use against companies who fail to fulfil their duties. The Act also includes provisions to protect adult users, as it ensures that major platforms are more transparent about what kinds of potentially harmful content they allow. It gives users more control over the types of content they want to see.

The Act allocates regulated services into different categories to ensure that regulatory requirements are applied proportionately. The thresholds that we are debating follow Ofcom’s work and consultation on what platforms should be set as category 1, category 2A and category 2B. The highest-risk platforms—the largest social media and pornography sites—will be designated as category 1 and will bear the highest duty of care. Category 2A will contain the highest-risk search engines, such as Google and Bing, and category 2B will contain the remaining high-risk and high-reach sites.

The regulations enable Ofcom to designate services subject to additional duties. That will address content that promotes, encourages or provides instructions for suicide, self-harm or eating disorders, as well as content that is abusive or incites hate. Where users are likely to access this content, category 1 providers will be required to proactively offer adults optional features to reduce the likelihood of their encountering such content, or to alert them to its nature. There are concerns that the category 1 thresholds may omit smaller platforms that host harmful content, and it may be prudent for the Government to look at redefining those thresholds at a later date.

The Online Safety Act's impact assessment concludes that more than 25,000 companies may be within scope of the new regulatory framework. Companies designated into higher categories will face additional burdens as they take on more duties. Can the Minister reassure tech companies, especially small and medium-sized businesses, that her Department will continue to work with them to ensure that the cost is affordable and proportionate?

I note that Ofcom expects the illegal harms safety duties to become enforceable around March 2025, once technology companies have assessed the risk of online harms on their platforms. Does the Minister agree that platforms do not need to wait, and should already be taking action to improve safety on their sites? Can the Minister confirm that she is encouraging platforms to take this proactive action?

Separately from the Online Safety Act, the last Government launched the pornography review to explore the effectiveness of regulation, legislation and the law enforcement response to pornography. I understand that that review has now concluded. Can the Minister provide her reassurance that the review’s final report will be published imminently?

I would be grateful for the Minister’s comments on these points. The Online Safety Act is a pivotal piece of legislation and makes the UK the safest place in the world to be a child online. I am proud of the previous Government’s role in passing it, and I urge the Minister to ensure that it is fully implemented as soon as possible.

Listed Places of Worship Scheme

Wednesday 22nd January 2025

Westminster Hall


Ben Obese-Jecty (Huntingdon) (Con)

It is a pleasure to serve under your chairmanship, Mr Western. I congratulate my hon. Friend the Member for Bromsgrove (Bradley Thomas) on securing this important debate. I have been contacted by so many pillars of the community in my constituency who are deeply concerned about what this might mean for their places of worship. This is about the preservation of heritage, ensuring that our future generations can enjoy places of beauty and history central to our heritage and culture.

In Great Staughton, St Andrew’s church has stood for 800 years. The chair of their renovation project, Anthony Withers, wrote to me at the end of last year, deeply concerned that their aim of building a community space might be affected or even rendered unfeasible if they are unable to claim back the VAT. Anthony expressed how this project was not aimed at conventional churchgoers, but rather a space for musical, theatrical and other community events. He would dearly like to hear assurances that 800 years of history will be able to carry on, with St Andrew’s remaining at the centre of their community.

I also received a moving email from a constituent who was deeply concerned about the future of the medieval All Saints church in Hamerton. The church was where she was married, where her children were christened and, she hopes, where future family marriages and christenings will happen too. However, with work needed to keep the church building safe, she is worried that the future of All Saints may now be at risk.

From Hamerton to Hartford, where the treasurer of All Saints parish church explained to me that in the last six years alone they have been able to claim back £50,000 for various projects, including repairs to the church tower, refurbishment of the bells, a new gas boiler, restoration and rebuilding of the church organ, installation of a new lighting system and the limewashing of the internal walls—all work that must be done to keep that church going. The site has had a church standing on it for nearly 1,000 years.

Our churches and places of worship are resorting to ever more inventive and ingenious ways to raise funds for the upkeep of their ageing buildings. All Saints Parish church in St Ives runs a popular event twice a year called “Booze in the Pews”. I attended the last two events and spoke with the vicar, Mark Amey. The funds raised go towards the upkeep of the church and, for anybody passing through my constituency in a fortnight’s time, the next event will be from 6 to 8 February—but I digress.

I know that all Members see on a daily basis the importance of these places and the people who selflessly devote their lives to serving those whom we represent. In summing up, on behalf of all the constituents represented by the Members present, I ask the Minister to outline what steps the Government will be taking in order to extend the listed places of worship grant scheme.

Children’s Social Media Accounts

Monday 13th January 2025

Westminster Hall


Ben Obese-Jecty (Huntingdon) (Con)

It is a pleasure to serve under your chairmanship, Sir Desmond, and I thank the hon. Member for Sunderland Central (Lewis Atkinson) for introducing this debate. I would like to start by thanking Ellen Roome for her determined work in fighting to highlight this issue. Her courage and her stoicism in pursuing this cause have been hugely impressive, and Parliament would not be debating this today were it not for her impassioned commitment.

This e-petition has garnered some 126,000 signatures in support of calls to give parents and guardians the right to access the social media accounts of their children. We have heard many important contributions from Members this afternoon, and I am sure that parents across their constituencies will be grateful to them for doing so. The hon. Members for Cheltenham (Max Wilkinson) and for Darlington (Lola McEvoy) paid tribute to Ellen Roome and shared her own words. The hon. Members for Sunderland Central and for South Devon (Caroline Voaden) spoke about the refusal of social media companies to release data, citing legal restrictions. The hon. Members for Worcester (Tom Collins) and for Lowestoft (Jess Asato) spoke of the impact of harmful content on children's development, and my right hon. Friend the Member for East Hampshire (Damian Hinds) spoke about how current legislation gives control to children as young as 13.

With the vast majority of children now having access to a phone or tablet by the age of 12, children are exposed to an enormous range of content online. Many children are being exposed to social media content that is inappropriate and dangerous and poses substantial risks to safety and development. There has been a growing crisis in children’s mental health, with recent research highlighting that 32% of eight to 17-year-olds state that they have viewed worrying or upsetting online content in the last 12 months, yet only 20% of parents with children and teenagers in that age group report their child telling them they had seen something online that scared or upset them during the same timeframe. Evidence has shown that the widening of access to the internet has seen more children moving away from social interactions, with the subsequent detrimental impacts on mental health and social development.

We welcome much of the work that this Government are doing on protections for children by building on the foundations laid by the previous Government, but could I ask the Minister what is being done to increase mental health support for children? In January last year the Labour party pledged to introduce specialist mental health support for children and young people in every school, as well as open-access children and young people’s mental health hubs in every community, as part of the child health action plan. Although I appreciate that it is not part of her brief, could the Minister outline what progress the Government are making towards the delivery of those pledges, as they relate to this topic more broadly?

Keeping children safe online in the current media landscape is a challenge that will require agile and adroit legislation that simultaneously keeps pace with technological developments and reflects cultural usage of media platforms. We also need to recognise the power that social media giants now hold, and ensuring accountability will be a key aspect of any legislation. We must ensure that parents have the right to keep their children safe from harm on these platforms, especially in circumstances where children may be being mistreated.

I have previously heard Ellen describe how social media companies have abdicated responsibility in assisting with the disclosure of messages that could help to identify how a tragedy occurred. In Jools’ case, TikTok has not released any of the messages on his account, and Meta, which owns Instagram, has released some but not all. Any parent should be concerned that they will not have the right to access details of their child’s online life, even if it is suspected to have contributed to their death. Parents like Ellen are currently required to take legal action to pursue the release of such information and, even if they have the financial resources to do so, why should any parent be forced to go to such lengths just to find out what may be, at best, critical information and, at worst, closure? The majority of parents do not even have access to such resources.

As a newly elected Member, I will not stand here and pretend that the previous Government got everything right, but the Online Safety Act was a crucial and positive step forward in keeping more children and young people safe online, so that fewer families have to face situations like those we have heard and spoken about in this debate. Under section 101 of the Act, Ofcom has the power to support the investigation of a coroner or procurator fiscal into the death of a child via the data preservation measure. The measure came into effect under the previous Government in April last year, and it is under this section that the amendment that would be Jools’ law would sit.

Although the current iteration of section 101 is a step in the right direction, it is not an easily accessible outcome and it can only be put into effect following a tragedy. In many instances, parental access to social media accounts could prevent tragic outcomes. Do the Government plan to introduce legislation to give parents and guardians the right to access their child’s social media accounts and the messages contained within them? If they do, would that build on the Online Safety Act?

There are further considerations that must be taken into account, such as safeguarding. Though parental access to children’s social media accounts may sound like a simple and prudent solution, not every child has parental figures who have their best interests at heart, and that includes vulnerable children in a family with an abusive parent. A child who is seeking help in communicating domestic abuse to friends or organisations may find their only avenue of escape is compromised. There may also be instances in which a parent could use their child’s social media account to gain access to information about other children and teenagers. There are therefore wider implications to granting parents unrestricted access to the information of children other than their own, as that could unintentionally make unsolicited and inappropriate contact easier. Would the Minister consider how parental access rights could be designed to give parents the ability to monitor their children’s safety and to ensure children have the privacy they may need to facilitate their own safety, and how such measures could be designed so as not to be exploited by any of the parties that are subject to them?

I was reassured to see the Secretary of State for Science, Innovation and Technology meeting bereaved parents who have lost children after being influenced by harmful content online. I also welcome the publishing of the Secretary of State’s “Draft Statement of Strategic Priorities for online safety” in November last year, which provided clarity on the framework that the Government will expect the independent regulator to work within. The Secretary of State has stated that the Government will be

“implementing safety by design to stop…harm occurring in the first place”,

and we should consider whether the expectation should fall on users themselves to take precautionary steps to avoid severely harmful content. Given how instrumental algorithms are in pushing themed content to users’ feeds, what plans do the Government have to give users the ability to opt out or reset these algorithms?

We support parents in raising concerns about content they do not want their children to see by requiring sites to take measures to remove content as soon as it is flagged. Since the introduction of the 2023 Act, we have seen many cases in which the response from platforms has been far quicker than before, and we would welcome a detailed plan that lays out how the Government will ensure that all companies act quickly and the cost of their not doing so.

It is right that services must assess any risk to children from using their platforms and set appropriate age restrictions to ensure that child users have age-appropriate experiences and are shielded from harmful content, such as pornography or content relating to violence, self-harm, eating disorders or even suicide. That is why the last Government tightened up age restrictions by requiring social media companies to enforce their age limits consistently and protect their child users, but many parents still believe that these age limits are too easily circumvented by children lying about their age. The Government talk of ensuring that age-assurance technology to protect children is being effectively deployed, but how do the Government intend to ensure this? How do they intend to ensure that companies are investing in the most up-to-date technology to facilitate that? Will the Government proactively stress-test that capability and, if so, how?

For all of this, Ofcom plays a vital role. As an evidence-based regulator, its task is to regulate the trust and safety systems and processes. Its role is not necessarily to police individual pieces of content; it is to ensure that companies have the correct measures in place to minimise harms to users. At the end of last year, we heard that the Government had informed Ofcom that it would need to build more safety measures into these systems. I would welcome the Minister’s outlining how the Government will aid Ofcom in its aims and ensure that any Government support needed will be supplied. These regulations would count for little without empowering Ofcom to take action, which is why we gave it powers to issue fines of up to £18 million or 10% of global revenue, whichever is higher, or to pursue criminal investigations into senior managers who fail to comply with enforcement notices. Will the Minister outline what steps the Government are taking to make sure that Ofcom brings forward its children’s safety codes and guidance in April?

As we have all seen, technology keeps moving and advancements are constantly made, so the risk of digital progress outstripping the pace of legislation is an all too real prospect. We must embrace technology and understand that the internet and social media, embedded in our daily lives, can be a force for good, but we must also understand that checks and balances are essential if we are to ensure a safe online environment not only for today’s users but for those newly entering the online world. It is for the Government not only to guarantee an environment conducive to users of all ages, but to ensure that parents have the confidence that the online environment can be made as safe as they strive to make the home environment.