Children’s Social Media Accounts Debate
Ben Obese-Jecty (Conservative - Huntingdon)
Westminster Hall
It is a pleasure to serve under your chairmanship, Sir Desmond, and I thank the hon. Member for Sunderland Central (Lewis Atkinson) for introducing this debate. I would like to start by thanking Ellen Roome for her determined work in fighting to highlight this issue. Her courage and her stoicism in pursuing this cause have been hugely impressive, and Parliament would not be debating this today were it not for her impassioned commitment.
This e-petition has garnered some 126,000 signatures in support of calls to give parents and guardians the right to access the social media accounts of their children. We have heard many important contributions from Members this afternoon, and I am sure that parents across their constituencies will be grateful to them for doing so. The hon. Members for Cheltenham (Max Wilkinson) and for Darlington (Lola McEvoy) paid tribute to Ellen Roome and shared her own words. The hon. Members for Sunderland Central and for South Devon (Caroline Voaden) spoke about the refusal of social media companies to release data, citing legal restrictions. The hon. Members for Worcester (Tom Collins) and for Lowestoft (Jess Asato) spoke of the impact of harmful content on children’s development, and my right hon. Friend the Member for East Hampshire (Damian Hinds) spoke about how current legislation gives control to children as young as 13.
With the vast majority of children now having access to a phone or tablet by the age of 12, children are exposed to an enormous range of content online. Many children are being exposed to social media content that is inappropriate and dangerous and poses substantial risks to their safety and development. There has been a growing crisis in children’s mental health, with recent research highlighting that 32% of eight to 17-year-olds state that they have viewed worrying or upsetting online content in the last 12 months, yet only 20% of parents with children and teenagers in that age group report their child telling them they had seen something online that scared or upset them during the same timeframe. Evidence has shown that the widening of access to the internet has seen more children moving away from social interactions, with consequent detrimental impacts on mental health and social development.
We welcome much of the work that this Government are doing on protections for children by building on the foundations laid by the previous Government, but could I ask the Minister what is being done to increase mental health support for children? In January last year the Labour party pledged to introduce specialist mental health support for children and young people in every school, as well as open-access children and young people’s mental health hubs in every community, as part of the child health action plan. Although I appreciate that it is not part of her brief, could the Minister outline what progress the Government are making towards the delivery of those pledges, as they relate to this topic more broadly?
Keeping children safe online in the current media landscape is a challenge that will require agile and adroit legislation that simultaneously keeps pace with technological developments and reflects cultural usage of media platforms. We also need to recognise the power that social media giants now hold, and ensuring accountability will be a key aspect of any legislation. We must ensure that parents have the right to be able to ensure that their children are safe from harm on platforms, especially in circumstances where children may be being mistreated.
I have previously heard Ellen describe how social media companies have abdicated responsibility in assisting in the disclosure of messages that could help to identify how a tragedy has occurred. In Jools’ case, TikTok has not released any of the messages on his account, and Meta has released some, but not all, of those on his Instagram account. Any parent should be concerned that they will not have the right to access details of their child’s online life, even if it is suspected to have contributed to their death. Parents like Ellen are currently required to take legal action to pursue the release of such information and, even if they have the financial resources to do so, why should any parent be forced to go to such lengths just to find out what may be, at best, critical information and, at worst, closure? The majority of parents do not even have access to such resources.
As a newly elected Member, I will not stand here and pretend that the previous Government got everything right, but the Online Safety Act was a crucial and positive step forward in keeping more children and young people safe online, so that fewer families have to face situations like those we have heard and spoken about in this debate. Under section 101 of the Act, Ofcom has the power to support the investigation of a coroner or procurator fiscal into the death of a child via the data preservation measure. The measure came into effect under the previous Government in April last year, and it is under this section that the amendment that would be Jools’ law would sit.
Although the current iteration of section 101 is a step in the right direction, it is not an easily accessible outcome and it can only be put into effect following a tragedy. In many instances, parental access to social media accounts could prevent tragic outcomes. Do the Government plan to introduce legislation to give parents and guardians the right to access their child’s social media accounts and the messages contained within them? If they do, would that build on the Online Safety Act?
There are further considerations that must be taken into account, such as safeguarding. Though parental access to children’s social media accounts may sound like a simple and prudent solution, not every child has parental figures who have their best interests at heart, and that includes vulnerable children in a family with an abusive parent. A child who is seeking help in communicating domestic abuse to friends or organisations may find their only avenue of escape is compromised. There may also be instances in which a parent could use their child’s social media account to gain access to information about other children and teenagers. There are therefore wider implications to granting parents unrestricted access to the information of children other than their own, as that could unintentionally make unsolicited and inappropriate contact easier. Would the Minister consider how parental access rights could be designed to give parents the ability to monitor their children’s safety and to ensure children have the privacy they may need to facilitate their own safety, and how such measures could be designed so as not to be exploited by any of the parties that are subject to them?
I was reassured to see the Secretary of State for Science, Innovation and Technology meeting bereaved parents who have lost children after being influenced by harmful content online. I also welcome the publishing of the Secretary of State’s “Draft Statement of Strategic Priorities for online safety” in November last year, which provided clarity on the framework that the Government will expect the independent regulator to work within. The Secretary of State has stated that the Government will be
“implementing safety by design to stop…harm occurring in the first place”,
and we should consider whether the expectation should fall on users themselves to take precautionary steps to avoid severely harmful content. Given how instrumental algorithms are in pushing themed content to users’ feeds, what plans do the Government have to give users the ability to opt out or reset these algorithms?
We support parents in raising concerns about content they do not want their children to see by requiring sites to take measures to remove content as soon as it is flagged. Since the introduction of the 2023 Act, we have seen many cases in which the response from platforms has been far quicker than before, and we would welcome a detailed plan that lays out how the Government will ensure that all companies act quickly, and the consequences for those that do not.
It is right that services must assess any risk to children from using their platforms and set appropriate age restrictions to ensure that child users have age-appropriate experiences and are shielded from harmful content, such as pornography or content relating to violence, self-harm, eating disorders or even suicide. That is why the last Government tightened up age restrictions by requiring social media companies to enforce their age limits consistently and protect their child users, but many parents still believe that these age limits are too easily circumvented by children lying about their age. The Government talk of ensuring that age-assurance technology to protect children is being effectively deployed, but how do the Government intend to ensure this? How do they intend to ensure that companies are investing in the most up-to-date technology to facilitate that? Will the Government proactively stress-test that capability and, if so, how?
For all of this, Ofcom plays a vital role. As an evidence-based regulator, its task is to regulate the trust and safety systems and processes. Its role is not necessarily to police individual pieces of content; it is to ensure companies have the correct measures in place to minimise harms to users. At the end of last year, we heard about how the Government had informed Ofcom that it would need to build more safety measures into these systems. I would welcome the Minister’s outlining how the Government will aid Ofcom in its aims and ensure that any Government support needed will be supplied. These regulations would mean nothing without empowering Ofcom to take action, which is why we gave it powers to issue fines of up to £18 million or 10% of global revenue, whichever is higher, or to pursue criminal investigations into senior managers if they fail to comply with enforcement notices. Will the Minister outline what steps the Government are taking to make sure that Ofcom brings forward its children’s safety codes and guidance in April?
As we have all seen, technology keeps moving and advancements are constantly made, so the risk of digital progress outstripping the pace of legislation is an all too real prospect. We must embrace technology and understand that the internet and social media, embedded in our daily lives, can be a force for good, but we must also understand that checks and balances are essential if we are to ensure a safe online environment not only for today’s users but for those newly entering the online world. It is for the Government not only to guarantee an environment conducive to users of all ages, but to ensure that parents have the confidence that the online environment can be made as safe as they strive to make the home environment.