Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.
Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.
This information is provided by Parallel Parliament and does not comprise part of the official record
I beg to move,
That this House has considered e-petition 661407 relating to children’s social media accounts.
It is a pleasure to serve with you in the Chair, Mr Twigg.
It is a privilege to have the opportunity to open this important debate as a member of the Petitions Committee. I start by paying tribute to the petitioners and in particular the petition creator, Ellen Roome, who I had the honour of meeting as part of my preparations for this debate and who is in the Gallery today.
Ellen’s son, Jools, died in April 2022. Since then, she has been a determined campaigner not just to get access herself to Jools’ social media data to help understand the circumstances of his death, but also to secure a future in which other bereaved parents do not face the situation that she has. I welcome that Ellen’s MP, the hon. Member for Cheltenham (Max Wilkinson), is here, as is my hon. Friend the Member for Darlington (Lola McEvoy), who has been working with Ellen on this issue. I look forward in particular to their contributions, which I know will provide further insight and will rightly ensure that Ellen’s own words are on the parliamentary record.
Since Jools’ death in 2022, the law and practice related to social media data has changed in several ways, most notably through the Online Safety Act 2023. The changes were secured in large part thanks to the efforts of the Bereaved Families for Online Safety group, other members of which are also here today; I pay tribute to their work. What I hope will become clear in this debate is that recent changes to the law, the implementation of those changes and the response of social media companies are not yet sufficient, and further change is needed to help bereaved parents such as Ellen. The petition attracted 126,000 signatures. That is a testament to Ellen’s campaigning efforts and the public’s concern about these issues.
Online services such as social media, streaming and messaging are now features of everyday life, including children’s lives. There are undoubtedly positive aspects of age-appropriate online services—giving children opportunities to explore the world, connecting with others who share interests and extending peer support options—but as the level of public concern recognises, it is beyond doubt that there has been significant exposure of many children to online harms, and that the action in response to that has not yet been sufficient or fast enough to meet the challenges.
Ellen Roome has gone through the unimaginable; I am absolutely amazed at her ability to continue fighting for answers for her son Jools. Does the hon. Member agree that social media companies are not in the wild west of the internet age any more and have become an integrated part of our society with certain responsibilities? Does he also agree that one of those responsibilities is ensuring that children in Wokingham and across the UK are protected from harmful content?
I agree that social media companies rightly face regulation; I will talk a little more about that later.
There is, rightly, debate and campaigning in the media and elsewhere—we saw some of that over the weekend—about the level of regulation of online content that children may access online, whether that is illegal content or legal but harmful content. Although the regulation of content itself is not the focus of today’s debate, as a concerned parent myself, I am impatient for online services to take proper responsibility for what our children see. Social media companies must not shy away from their responsibilities to protect children, either because of misguided free speech concerns or out of concern about their levels of profit.
It is important to note, as context for today’s debate, that the law does not allow online services to collect or store the personal information of children under the age of 13. As a result, most popular services require users to be at least 13 years of age, but enforcement of that requirement has historically been lax. The age-assurance requirements in the Online Safety Act must be implemented without delay.
I know that hon. Members will want to touch on various elements of the approaches to social media regulation, but in parallel to the wider debate about content regulation, the petitioners are clear that there are specific issues about parental access that warrant a response in their own right, from both social media companies and the Government, so I want to focus my remarks, and I hope this debate, on those specific questions.
First, the petitioners call for parental access to social media when children are alive. I understand the instinct behind this call—the instinct to directly monitor what a child is doing online in order to protect them from the harms to which I have referred. However, in the course of preparing for the debate, I heard clearly from multiple perspectives, including children’s charities such as the NSPCC, that broad, overarching parental rights to children’s social media would not be appropriate. Of course, given that the minimum age of use should be 13, we are talking about teenagers. Children—teenagers—do have long-established rights to privacy, as set out by the UN convention on the rights of the child, and established UK law and practice in a range of areas reflects and recognises that.
Such rights are important not just as abstract concepts but because, as I have heard, older children sometimes need private online spaces to help them to explore the world—for example, a teenager understanding their own sexuality before they wish to share it with their parents. We also know that, sadly, in some horrible cases, parents are themselves perpetrators of abuse against their children. Establishing a blanket right to access children’s online activity would remove an important safe route for children to seek support and alert others in such cases.
The petitioners themselves have reflected on such matters, and although they remain determined to support parents to keep their kids safe online, there is a recognition that establishing an overarching parental right to access the data of living teenagers is unlikely to be the appropriate way forward. Instead, it appears to me that parental oversight of a child’s social media use should rather be achieved through strengthening and significantly increasing the uptake of parental controls and other specifically designed arrangements that children and their parents agree to together, as part of the sign-up processes for online services. It is clearly the responsibility of online services to implement and expand such measures, and I hope for rapid progress in that area as a key part of online services’ work to improve children’s safety online in partnership with parents.
I now turn to the issue of bereaved parents’ access to social media data, which is the specific issue for which Ellen has been campaigning, because she is still not able to access data about Jools’ use of online services before his death. I cannot begin to imagine the anguish of losing a child and, even more so, not being given information that might help a parent to understand the circumstances of their child’s death. We rightly talk about laws, protections and rights, but we do not talk enough as a society about the right—indeed, the need—to grieve a loss, especially one as painful as the loss of a child.
In preparing for this debate, I spoke with SLOW, or Surviving the Loss of Your World—a charity that offers bereavement support for parents following the loss of a child. The charity emphasised to me the absolute necessity, as part of the grieving process, of establishing an understanding of what led to death. By being denied data about Jools’ online activity before his death, Ellen has been denied the ability to grieve as she wishes. The petition recognises that it cannot be right that a grieving mother is forced to go through years of campaigning and investigation to try to get answers about her son’s death. I urge everyone—especially those working in social media companies—to reflect on the evident injustice of the situation, and to commit to finding a way to do the right thing: to give Ellen the information and answers she needs.
The Online Safety Act 2023 made important provisions for Ofcom and coroners to access social media in relevant cases following a child’s death, in turn helping bereaved families. It is welcome that the current Government’s Data (Use and Access) Bill, which is making its way through Parliament, strengthens those powers and the requirements for data retention so that the risks of data loss in such cases are reduced. However, the provisions do not have retrospective mechanisms, and are therefore not sufficient for historical cases, such as Jools’, where the coronial process has already concluded. As a result, Ellen is in the situation where she has to try and crowdfund a significant sum of money for legal action to get Jools’ inquest re-run, so that the coroner can use the powers now available in law to access Jools’ social media data. It cannot be right that this is necessary.
Some online services say that without a change in the law, they cannot legally release data to bereaved parents like Ellen, but what has struck me in preparation for the debate is that there is not a consensus on the current legal situation. The online safety and data protection expert, John Carr, told me that he did not believe that the general data protection regulation necessarily limited the release of children’s usage data to bereaved parents. Snap, the provider of Snapchat, told me that it already, on a case-by-case basis, discloses usage data to a parent who is the successor to a deceased child.
Other online services—including some of the most prominent social media services used by young people, such as TikTok—seem to take a different interpretation of the law. They state that data protection legislation prohibits them from releasing any data they hold that would give parents like Ellen the answers they deserve. I find this inconsistency of interpretation from online services at best troubling and at worst suspicious, given the historical behaviours of some social media companies that were involved in minimising—and indeed covering up—evidence of the impact of online harm. I believe it is incumbent on all online services to use their considerable resources to push the existing law as far as they can, and to find a way to release data to bereaved families.
We owe it to our constituents that we work together, and leave no stone unturned to understand the trends and drivers that lead our children to take their own lives. Social media companies headquartered overseas have repeatedly demonstrated that they cannot be relied on to take reasonable action out of good will, so I invite the hon. Member to agree that it is up to Parliament to legislate accordingly.
I agree that legislative action has been necessary, as the Online Safety Act shows, and indeed, there are provisions on this in the current data Bill. The issue is that there is a lack of clarity; under the existing law, some social media companies seem to be finding a way of doing the right thing while others are not. I will come to the hon. Member’s point when I ask a couple of questions of the Government.
I can assure the social media companies—in the event that they carefully do the right thing, on a case-by-case basis, and then face data protection questions from regulators in response—that they will find allies across Parliament in defending their actions.
I ask the Minister and the Government: what scope is there for stating clearly in law that, so long as due care is taken on a case-by-case basis, the release of data to bereaved parents is permitted? Could the data Bill be amended to include a clarification to remove, once and for all, the claim of some companies that they are prohibited from giving parents like Ellen the data and answers they deserve? Are there any other steps the Minister believes could be taken to right this injustice? I look forward to hearing the perspectives of colleagues and the Government’s response to this important debate.
It is a pleasure to serve under your chairship, Mr Twigg. I thank the Petitions Committee for enabling this debate and the hon. Member for Sunderland Central (Lewis Atkinson) for opening it.
There is nothing any parent fears more than the loss of a child. Tragically, in 2022, Ellen Roome suffered this loss. Her world was shattered when she came home to find her son Jools not breathing. He had taken his own life aged just 14. While Ellen was dealing with the enormous pain of her loss, she also had questions about what had happened in the days, weeks and months leading up to Jools’ death. Jools was a happy boy. A video filmed just before his death shows him playing happily with friends. The absence of any hints that he might have been inclined to harm himself led Ellen’s search to his social media accounts.
In her search for answers, Ellen found herself blocked by a legal system unable to tackle the complexities of social media and obstructive social media giants that placed process ahead of compassion. The police had no reason to suspect a crime, so did not see a reason to undertake a full investigation into Jools’ social media. The inquest did not require a thorough analysis of Jools’ online accounts. None of the social media companies would grant Ellen access to Jools’ browsing data, citing regulations. A court order was needed to access his digital data, which required eye-watering legal fees.
Ellen sought nothing more than what amounts to access to her deceased child’s personal effects. In years gone by, that would have required searching through a child’s bedroom, perhaps looking at diaries, notes, letters, toy boxes, stickers or any other clues. The modern-day equivalent of such a search necessitates access to social media accounts, but because the law has not kept pace with the realities of modern life, that search has not been and cannot be completed. This is a cruel and inhumane process to impose on a grieving parent seeking nothing more than answers about what happened before their child took their own life. That is all Ellen wanted.
I ask all of us present, and anyone watching at home, to consider what we would want to happen if we found ourselves in Ellen’s shoes, and go further to think what rights a parent would assume in those circumstances, as a matter of natural justice. There is, of course, a much wider debate about online harms, but Ellen is using her experiences and her campaign to bring about positive change in this debate. She is seeking answers in order that others do not have to in future.
The case of Jools and Ellen is not the first time that social media companies have come up short. The dynamic and fast-moving nature of the internet means that social media companies are able to act before legislators have a chance to catch up. This is a problem that has persisted for many years, but it is notable that they act only when pushed by brave campaigners like Ellen shining a light on what is happening.
As we have heard, the Online Safety Act takes us a step forward, and it does improve rights of access. The current legislation, however, means that bereaved parents like Ellen are still left to fight bureaucracy. In Ellen’s case, she is seeking retrospective action too. The Government should look urgently at how exactly they can rectify that, including in retrospective cases.
There is now an acknowledgment that giving parents the right to automatic access to living children’s social media accounts may have unintended and undesirable consequences relating to child protection, but if the law and parents are to acknowledge that balance, social media companies must do their bit to keep children safe online from predators, inappropriate content and content that may cause children to harm themselves.
Sadly, in recent weeks and months we have seen social media companies make increasingly vociferous claims that the protection of free speech and freedom of expression online must come above all else. The examples of Elon Musk’s bizarre approach to X and Meta’s decision to ditch moderation in favour of community notes are instructive of what is happening and what could happen next, and there has also been much discussion of the impact of the TikTok algorithm on children’s mental health. Other platforms and examples will come up in the future. We have also seen democracies start to act to curtail the power of social media companies—the example of the Australian Government’s approach is instructive, whether or not Members of this House agree with the detail.
I thank the hon. Member for Darlington (Lola McEvoy) for her support for my constituent Ellen; I know it has been valuable to her over the past few weeks and months. I thank Ellen herself and pay tribute to her: she is the person whose petition brings us here today. The heartache and devastation she has endured is unimaginable for the rest of us, but Ellen has turned her grief into something that is positive and could be even more positive for this country and other parents. Having watched her campaign so tirelessly, and provided support where I could for the past few months, I am immensely proud of what she has achieved. We should all be thankful for what Ellen and other members of Bereaved Families for Online Safety are doing. They know what we in this Chamber, the Government, the legal system, police forces and social media companies know: the system is badly failing children and families.
Social media companies must now be placed on notice. They must protect children and respect families or face the consequences. They must protect children so that the Joolses of the present and the future do not meet a tragic and early end. They must respect the Ellens of the past, present and future so they can be confident that their children can be safe too.
It is every parent’s worst nightmare to lose a child—imagine losing them and not knowing why they are gone. Ellen Roome is Jools’ mum. She deserves answers but, unbelievably, she is not allowed access to the data that might provide them, which is so wrong.
This petition is for Jools’ law, which would allow parents to have access to their child’s online data in specific circumstances. Jools Sweeney was hugely loved and is greatly missed by his family and community. In practice, Jools’ law would be a small amendment of no more than 100 words to the Online Safety Act 2023, inserted into section 101 of chapter 4, which is titled “Information powers and information notices”.
The Act currently outlines the powers that a senior coroner has to instruct Ofcom to issue a notice to online platforms to provide data in relation to the death of a child. Section 101 of the Act will be amended by clause 122 of the Data (Use and Access) Bill to strengthen Ofcom’s powers to prevent the deletion of a child’s data once the regulated platforms have been notified that the child has died.
The progress made in this policy area is testament to the parents of children who are no longer with us and to their incredible strength and work. I thank those present and those watching for everything they have done to protect our children. We need to build on this work to allow parents access to data without the need for a second inquest.
A further amendment would allow for Ofcom to be notified as a routine course of action in the event of future tragedies of child suicide or unexplained deaths. That would alleviate the risk of vital answers to parents’ inevitable questions being deleted, and mitigate the reality of it being solely the responsibility of the parents to request the data in those painful early days of grief. The authorities should initiate a data notice in the event of a child’s death to protect those answers from being lost. The amendment, while small in word count, would be transformational to the rights and experiences of bereaved families. We in this place would be hard pressed to find a parent, or indeed anyone touched by the darkness of a child’s death, who would not support the measures.
Ellen Roome is Jools’ mum. She and I have bonded over our shared belief that there can be a future where our children are safe online, and that there must be a future where every child, in every corner of our great country, is protected from online harm. We are bonded by the fact that we will continue to shout loud until that becomes a reality. Ellen has asked me to read her statement about her work to get to this point—her story. That is a great privilege, and I will read her statement in its entirety without taking interventions, as a mark of respect for her incredible work in this area and for all those she speaks for who have experienced such intolerable pain. The work of the Bereaved Families for Online Safety group has already changed the law, and for that they should be immensely proud.
Before I read Ellen’s statement, I must pay tribute to her. We all hope that if we were put in Ellen’s position or faced with her reality, we would stand up and fight for change. The hard truth is that most people cannot, but Ellen Roome is not most people: she is exceptional. Her grit, tenacity and determination to turn her pain into purpose and progress, and to fight for answers for her family, for all those who knew and loved Jools and for those who have found themselves in the same terrible situation, is truly remarkable. More than that, it is Ellen’s warmth, openness and grace that I have been moved by.
These are her own words:
“It only takes one person to make a stand for morality and justice; in this case, that’s me. However, I'm supported by thousands and thousands of people across the world who think it is morally wrong that I am not entitled to see my child’s social media data, which might provide answers as to why my 14-year-old son chose to end his own life.
When I launched the petition, I asked that ‘Parents should have the right to full access to their child’s social media accounts either whilst they are still alive (to protect them) or if they die, as in my case’. It hadn’t crossed my mind that the parents might be the perpetrators. I now understand this could be the case and hope the Online Safety Bill and Ofcom can protect live children online. However, in my case and that of other parents, when the child has died, who are we protecting? The predators on these platforms? Social Media companies? Surely, I should have the right to look for answers to his cause of death. Jools’ young friends struggle to understand why he is no longer here. The ripple effect of his death is felt not only by us as his parents, but also Jools’ friends, teachers, and everyone in his life was so shocked as to why he ended his life – we deserve possible answers or at least to try for answers.
I am his parent, and he is a minor. As a child, he consented to terms and conditions that permitted social media companies to control his online data. I’m unaware of any other legal context in which a 13-year-old can authorise a legal document, such as terms and conditions.
I have always said that I do not know if it was social media that caused my son to end his own life; however, as a parent, I feel I should morally and humanely have the right to that data to give me possible answers as there was nothing offline which seemed to be an issue to Jools. He was not bullied; he was doing well at school and had many friends. There didn’t appear to be body issues, and whilst he didn’t like his floppy hair or chin, we are unaware of anything else that could be of concern. Yes, he had a cheeky side to him, as do a lot of teenagers, but he was a great kid who loved his parents, and his parents loved him VERY MUCH. I fight now for the right to possible answers as to why my son is no longer alive. I have always thought this to be an online challenge gone wrong.
Many MPs feel that the data bill will solve this issue. It won’t help me or other parents who are in the same awful boat as me. The data bill will allow a coroner the right to access this data in future deaths of children, BUT only if the coroner or the police request it. How do we stop future cases where neither the police nor the coroner asked to see this information? This is what happened in Jools’ case.
As a bereaved parent, I was barely breathing myself after the death of Jools, and I was in no fit state to ask or even think of asking the police and or coroner for this information. This could easily happen again with the new data bill. Also, retrospectively, we cannot obtain this information without applying to the High Court for a second inquest. My lawyer has quoted that it will cost me up to £86,000 to hopefully succeed in the high court, but that seems so wrong to have to find this level of legal fees, which is beyond the reach of almost all bereaved parents, to start looking into missing online activity and what was going on. Also, what a waste of legal professionals and staff involved with a new inquest. I’m just asking for data which I feel should be available to me as his parent. However, I’m not allowed to see it, which is wrong.
I hope this will be a good debate. But please remember that as a member of the Bereaved Families for Online Safety group, I represent many other families in the same awful situation and want to try for answers as to why their precious children are no longer alive.
If this had been your child, you would want answers too. I don’t want any other family to be in this hideous position, which will forever affect us all: our family, Jools’ friends, his teachers, everyone in Jools’ life, and their families, forever.”
Those are Ellen’s words. Ellen’s campaign for justice is rare. As a new MP, I may be forgiven for my perceived naivety, but to me Ellen’s campaign poses a binary choice for us—there is no grey area—so I ask that the Minister does everything in her power to help those seeking answers now, whose cases may not be supported through new measures. It is simply wrong that information that may offer clarity and peace to parents who face a new reality without their child is denied them. It is simply wrong that parents who are living in that unenviable reality now face the colossal emotional and financial burden of a second inquest to discover whether the information exists at all. Ellen Roome is Jools’ mum, a campaigner, a leader and a mother, and Ellen Roome is right.
It is a pleasure to see you in the Chair, Mr Twigg.
I join colleagues in thanking the petitioners, and Ellen Roome in particular, for initiating the petition and enabling this Westminster Hall debate. We were all deeply affected by hearing the statement that was just read out. Ellen, you have the sympathies of everybody here on the loss of Jools aged just 14. We think also of other bereaved families and other campaigners—in the last few days we have been reminded of Ian Russell and the work he has done since the tragic death of Molly—and all those who take the most unimaginably awful situation for a parent and a family and use it to try to make something better for others for the future.
The Government’s response to the petition notes not only that, under the Online Safety Act, platforms have to set out their policy for dealing with such tragic situations, but that the Act
“introduces measures to strengthen coroners’ ability to obtain information”
from platforms via Ofcom, thereby providing a route for parents. We will have to see how that works in practice and how timely it is. What we must not do is put a new, onerous layer on top of parents at the most difficult time imaginable, as they are grieving.
As has been mentioned, there is also the question of historic cases. There will be future historic cases, because not in every case will the inquest have covered this question. I hope the Minister will be able to say a word about whether the data Bill is the opportunity to put it beyond doubt that, ultimately, the parent has an absolute right, with the right safeguards and verifications, to see the information related to their child.
Let me turn from the most tragic of cases to all families and all children. I start with the most important point, which is that trust, support and love within families are the most effective things. Most of the time it is irrelevant what the law is because, within families, we set our own rules. Generally, it is clear that even if our rules are, at times, a pain for our children, they are well-intentioned. We must also note that not quite all families are loving families. Some parents are abusive, and children must always have ways confidentially to seek help from child protection services, the police, the health service and bona fide charities. That applies at any age.
It is also true that everyone needs a degree of privacy, but there have always been different degrees of privacy, and how private something is should be proportionate to the level of risk involved. In discussing accessing online services, we are talking about things that can have very serious consequences. We want and need to be able to protect our children from harm—from bullying, from unwanted contact, including from adults, and from being drawn to dangerous interests, which can become dangerous obsessions. We also have a responsibility, and we should be held responsible, for them not perpetrating harms on others. Although we trust our children, we know that children do sometimes get into trouble and can come under pressure, and in some cases severe coercion, from others. Of course, they potentially have ready access to material of all sorts that is much more harmful than we had as children. They can go deeper and deeper down rabbit holes.
Parents are not the only ones who can help children, but they have a unique position in children’s lives and are uniquely placed to help and support them. That is why I agree in principle with the petitioner that parents should have a right to see what their child is subjected to or is doing for as long as they are a child and we, as the parents, are responsible for them—and that means at least until age 16. There is a separate debate to be had about the extent of that, and what the threshold and process should be. I understand entirely what the hon. Member for Sunderland Central (Lewis Atkinson) was saying. I do not think anybody is proposing constant, ongoing monitoring, but there are situations that a child could find themselves in that I believe warrant the availability of that access.
There is also a problem, or a hurdle, with the principle: we can only request access to something that we know exists. It is common for children to have multiple social media accounts on a single platform. They probably have different names these days, but people used to call their fake and real accounts finsta and rinsta. The account their mum sees is not necessarily the real one—ironically, the one that was called “fake” was the one where their real lives were actually happening. Of course, they could also be on lots of other platforms that parents and others do not necessarily know about.
I agree with the hon. Member for Sunderland Central, who opened the debate on behalf of the Petitions Committee, that it is of paramount importance that we are able to put some guardrails around what children can access. That is one of the reasons we have parental controls. How those controls work, and the limits of them, are what I want to talk about this afternoon.
I will read out a short note from Microsoft, which is not a company that people normally worry about—it is a very responsible operator—to a constituent ahead of their child’s 13th birthday. It says:
“Congratulations on Fred’s birthday. At this age, certain laws allow them to have more control and choices over their own account settings. This means that they’ll be able to change a number of family safety settings, even if you already have them set up. Fred will also need to allow you to continue receiving data about their activities to guide their digital journey. They can turn off your ability to see their activity on Windows, Xbox, and Android devices. They can turn off your ability to see their devices and check on updates…safety settings like firewall and antivirus…They can stop sharing their location through their mobile phone.”
That was for a child approaching their 13th birthday, which leads me to question what “certain laws” are being cited. I can only assume it is the Data Protection Act 2018, which sets out that
“a child aged 13 years or older”
can
“consent to his or her personal data being processed by providers of information society services.”
The genesis of that was European law, and Parliament was debating and voting on it in parallel with, but before actually completing, exit from the European Union. The age 13 is not universal. EU law specified a range between 13 and 16, and multiple countries did select 13, but not all. France set the age at 15, with some limited non-contractual consents for data processing allowed between 13 and 15. Germany and the Netherlands set the age at 16. There is that question of what is the appropriate age, but the other big question is what that age actually means.
The 2018 Act was passed before we considered the Online Safety Bill, which became the Online Safety Act 2023, but we were already concerned in this House about online safety, and I am fairly sure that it was not Parliament’s intent to reduce parental oversight. In particular, I do not think saying that a service can have a child sign up to it at 13 is the same as saying that the parent cannot stop them. Still less is it the same as saying that the parent should not be able to know what their child is signed up to.
In setting out why the age was set at 13, the explanatory notes to the 2018 Act say, quite rightly, that that is in line with the minimum age that popular services such as Facebook, WhatsApp and Instagram set, but they go on to say, slightly unrelatedly:
“This means children aged 13 and above would not need to seek consent from a guardian when accessing, for example…services which provide educational websites and research resources to complete their homework.”
I think that sentence might have a lot to answer for. It sounds very sensible—we would not want children having to get over hurdles to finish their homework—but if we think about it, it is not necessary to sign up to research something on the internet for homework anyway, and educational websites are generally exempt from consent requirements. But the big question is, what else might it allow—or, crucially, what else might it be interpreted to allow?
I repeat that I do not believe that it was Parliament’s intent in effect to disable parental safety controls for 13, 14 and 15-year-olds. There is a whole other question about those safety controls themselves and how they work, and how difficult it can be for parents—and even all of us, who tend to think we are quite good at this sort of thing—to keep on top of them, particularly if they have multiple children, different operating systems and multiple platforms. There really should be a single industry standard entry system that can cover all of screen time and basic, entry-level approvals with a default “safety on” version of the different platforms.
We talk about age thresholds and age limits; there is a whole other set of questions about how those apply and how we make age assurance or age verification work properly. Those are both debates for another day. Today, I simply ask the Minister: is it the Government’s understanding of the existing legislation that children under 16 should be able to switch off parental controls? If not, what could be done to clarify the situation? Is a change needed in primary legislation?
It is a pleasure to serve under your chairship, Mr Twigg. I thank all those who signed the petition for raising this issue. My heart goes out to Jools’ family, and I thank them for their work to bring about change so that this does not happen to other people in the future.
I was recently joined by local leaders at an event in Worcester’s fantastic local library to hear the views of young people in the Worcestershire Youth Cabinet. They presented their manifesto for young people, and I was struck by their insights and passion, and in particular by their deep concerns about the impact of the online world on their mental health. I share their concern that online harms have run away from us. We live in a world where people young and old are exposed to harmful content and interfaces. I would like to see us move at pace to regulate not only extreme harms online, but persistent low-level harms that are eroding young people’s mental health.
Having recently met Ofcom, I am concerned that there is much more to do to regulate harmful online media. Although I welcome us taking some first steps in this area, we are far behind where we need to be. If I were to liken our regulation of online harms to the regulation of drugs, we would be in a situation where a local newsagent would be required to assess the risk of supplying class A drugs, while alcohol, cigarettes and over-the-counter medicines remained an unregulated free-for-all. These are historical shortcomings due to previous Governments, but none the less we have much work to do to address the risks of online bullying, harassment and addiction.
In my constituency, I have heard at first hand the stories of men struggling with addiction to pornography and the damaging effects that has had on their relationships and personal wellbeing. We need to be open-eyed about the impact of the new online world on adults and young people alike, and it is the duty of Government to empower people to stay safe.
Young people in Worcester told me that although they want online sources to be regulated, and to be equipped themselves for that world, what they want most is for their parents to be empowered to advise, guide and journey with them through the digital world. I agree that most parents are currently very poorly supported, and I welcome the enthusiasm I sense from Labour leaders for family hubs, which offer a community-led and empowering vehicle for that work. Does the Minister agree that we should equip parents and carers to navigate these hazards with their children, so that instead of feeling isolated, anxious and alone, young people feel supported, understood and empowered?
I am very grateful to be able to speak in this debate, which was prompted by Ellen Roome’s petition, although I am extremely sorry that any of us needs to be here at all. I pay tribute to Ellen and all the other families in the Bereaved Families for Online Safety network for their tireless campaigning.
A week before Christmas, I sat in a Committee Room with Ellen and senior representatives from all the major tech firms, including Meta, TikTok, YouTube and Snap. One conversation that morning will stay with me for a very long time—a conversation that I can describe only as harrowing, shocking and deeply depressing. Sitting alongside two other heartbroken parents who have also lost their children because of online harms, Ellen confronted the representatives of TikTok and Instagram, pleading with them to release information that could give her some peace of mind following the death of her beloved son, Jools. There can be nothing worse for a parent than losing a child, but to lose a child and not understand how or why must compound that agony.
Ellen does not know why Jools died. Unlike many other children and young people, he was not being bullied online and did not seem to have any mental health issues. All Ellen wants is to find out what her son was looking at online before he died; it might shed some light on this tragedy that has clearly caused immeasurable grief. It was infuriating to listen to the tech firms’ pathetic excuses that morning about why they could not or would not release the data that Ellen is asking for.
There was—there is—no good reason not to release that data. Jools is no longer with us, so claiming data protection seems frankly pointless. TikTok said that it would be fined for releasing the data, but my question is: by whom? Who is going to press charges against a global tech company for supporting the request of a bereaved mother? Who in their right mind would think that a court case on that point would help anyone?
As we have heard from the hon. Member for Sunderland Central (Lewis Atkinson), some social media companies have behaved differently in such cases. It is quite clear, however, why some will not agree to release that data: it is a pathetic attempt to avoid the potential bad publicity that will follow if it becomes clear that Jools’ short life ended after taking part in a social media challenge, which is one possibility. It is about protecting the reputation of those social media companies. It is about the accountants who fear the lawsuits. In short, I suggest it is about money. The absence of humanity, care and compassion in that room before Christmas was palpable, and I applaud Ellen for having the courage to come back here today.
I can see no reason why tech companies cannot immediately release the data that these devastated parents are asking for. I fully support Ellen and all the other parents in their attempts to get Jools’ law on the statute books. In the meantime, I plead with Instagram and TikTok to not wait for a legal challenge, but just release the data: find your inner human and do something decent; imagine if it were your child.
Under UK law we have clear, legal processes for handling physical estates after death. It is high time that we establish clear protocols for the digital estates that are left behind, particularly the digital estates left by young people. The law must catch up with the world we are living in. Current provisions, such as Facebook’s legacy contact feature, are not sufficient, because they rely on a child making a decision while they are still alive, often without fully understanding the implications, as has been mentioned. It is also quite possible that, if children were asked whether their parents could have full access to all of their digital online life in the event of their death, they would say no. Without formal, legal access arrangements, parents are left with no way of viewing their children’s accounts.
I was reading up on that issue in preparation for this debate and I came across some amendments to the Data (Use and Access) Bill that would require those huge providers and tech companies to have a complaints procedure, where parents could appeal to their better nature for the release of the data, but if they were refused it, there would be a proper complaints procedure. Does the hon. Lady think that goes far enough?
No, I do not believe that goes far enough. There should be a legal right to access that data without having to go through any complaints process, particularly at a time when one is struggling with the worst bereavement imaginable.
The petition seeks to address that gap in law and ensure that, in the tragic event of a child’s death, parents have the right to access their child’s account to gain closure, to preserve memories and to ensure that harmful content is removed. I support the addition of Jools’ law into the Online Safety Act, and I urge the Government to do whatever they can to apply it retrospectively for those who have campaigned on this issue.
What Ellen’s family have been through is the absolute worst imaginable, but tens of thousands of families up and down the country are struggling with the impact of social media on their children and teenagers. Those children are addicted to their screens because of the wicked algorithms that lure them in; cowed by bullies who can intimidate them in their own bedrooms late at night; struggling with their body image because they do not look like the influencers they watch; depressed because their lives do not resemble the doctored, airbrushed Instagram image of perfection; and brainwashed by influencers who spew toxic messages through their pages.
The damaging impact of social media on our children is vast. Medical professionals from all disciplines tell us regularly of the harms children are experiencing from hour after hour spent glued to a screen. Their physical health is damaged, their mental health even more so, and even their ability to communicate and socialise with other humans is changing.
Does my hon. Friend agree that it is about not only mental health harm, but inattention? I speak to many headteachers in my constituency who tell me that children are unable to concentrate any more because of hours spent on their screen. Would she agree that the Government study announced in November that seeks to explore that issue further should be published soon, because every day and every year we leave it, more children are at risk of harm?
I could not agree more. What is becoming obvious is the impact of children being on their phones late at night, which affects their sleep—even that has a knock-on effect on how well they can operate.
Parents across my South Devon constituency are desperate to protect their children, but they are overwhelmed by the digital world and the power it has over young people. They need legislation to empower and support them. The NSPCC reports that over 60% of young people have encountered online bullying. That is a staggering number, highlighting the need for us to provide more robust protections for children in the digital space.
It is clear that we need more robust protection, and it is incumbent on us as lawmakers to protect children as we do from other harms such as tobacco and alcohol. It may be right that parents should not have access to their teenager’s social media because of privacy reasons and to protect children’s ability to seek support online, but that makes it even more important and urgent that social media companies should be required and obliged to take responsibility for watertight age verification, and that we should look seriously at raising the age of access to some social media platforms to 16.
I urge the Government to work with social media companies and other stakeholders to create a clear and respectful framework that allows parents access to their child’s social media accounts after a death with no questions asked. This is not about data protection; this is about ensuring that families can concentrate on grieving and healing rather than going into battle against the world’s tech giants.
It is abominable that any bereaved parent should have to do what Ellen and other campaigners are doing. I urge the Minister to legislate so that that does not happen again. I commend the Petitions Committee for bringing this debate to the House and the hon. Member for Sunderland Central for introducing it.
It is a pleasure to serve under your chairmanship, Mr Twigg. I pay tribute to Ellen Roome’s steadfast campaign in the most awful, unimaginable circumstances, and to the campaigns of all the other bereaved parents who seek change so that no other parent has to suffer like they are.
As citizens, parents and legislators, we are rightfully worried about what our children consume online. The recent Channel 4 programme “Swiped” demonstrated the addiction our children have, the concerns parents have about the time they spend online, and the harms that children continue to face.
Before they are able to properly comprehend it, our children are sucked into the online world by algorithms that are designed to get them hooked and, as if it were a drug, they keep coming back for more. In this world, they are taught to look up to influencers with unhealthy opinions, unrealistic beauty standards and conspicuous wealth beyond their dreams. They are told that they are not good enough, they may be cyber-bullied by their peers for not being good enough, they have trouble sleeping and their attention span withers. We also know that short-sightedness is becoming more prevalent. Our children’s work suffers and they find it increasingly difficult to read and learn.
Our children see pornography online before they receive high-quality sex and relationships education in school. They are shown adverts for apps that can use AI to nudify their peers and spread such images to their friends and around school. They are criminalised for doing that, but the tools they use remain legal and readily accessible. They get trapped in the whirlpool of online pornography and dragged into increasingly extreme and violent content. They become desensitised and their perceptions and expectations of sex and healthy relationships are warped. Online behaviours quickly become offline behaviours, such as self-harm, dangerous viral challenges and peer-on-peer sexual abuse, which do huge harm to mental health, so that one in five children now has a diagnosable mental health disorder.
A generation of children chronically online and harmed by it bear the brunt of a technology that was never designed with children’s development in focus and that acts with no regard for the consequences of the harm it causes. When I questioned tech companies recently, none of them could confirm that they develop products widely consumed by children with input from child development experts. I do not understand why we expect stringent standards in all other aspects of our children’s lives—their toys, cots and bikes, and our cars—and yet not on the impact of social media products.
We cannot stand idly by in the name of freedom, because there is no freedom in addiction or in being harmed. We cannot let our children’s lives be dominated by the dangerous online world. Whether it is depression or misogyny, eating disorders or myopia, we are failing children by continuing to subject them, and those they interact with, to the impacts of a childhood spent online. We need to reclaim childhood for the real world.
I recognise the important role of internet access in providing spaces for children to access support, but I wonder how we weigh up the harms caused through access to social media, which support services, mostly in the voluntary and community sector and our public services, need to mop up afterwards. We must look more at whether we could provide that access more safely in school settings or through youth services. I am very aware of the huge impact of abusive parents and carers, but it might be time for us to start asking whether we are using that as an excuse, rather than thinking about how we ensure our children can get the access they may need to get safe without also succumbing to the dangers of the online world.
We fundamentally need to change the role the internet plays in growing up, and that must be a societal shift, given the pressure children and young people feel to be online. That is why I back Ellen Roome’s call for parental oversight. Parents deserve to have all the tools available to them to help them to protect their children, and that is why I am proud to be one of the co-sponsors of the safer phones Bill introduced by my hon. Friend the Member for Whitehaven and Workington (Josh MacAlister).
Much of the focus is on parental control, but as the right hon. Member for East Hampshire (Damian Hinds) has eloquently outlined, there is potentially no control from the age of 13. Even with controls, who sets what is the right developmental level for access to some apps and social media when there is no child development expert involved? App stores, for example, determine age restrictions themselves. In a number of instances, developers have set an age restriction of 18 for an app, but app stores have lowered that to 17 or 16. There is access but no scrutiny. Unlike for films or other things that our children consume, we have no way of understanding whether there has been independent, child-led expert oversight.
We need to raise the age of internet adulthood and ensure that, this summer, Ofcom properly implements age verification for pornographic content as part of the Online Safety Act 2023. We need to remain open to the need for a new online safety Bill to fill the gaps left in the legislation, as has been argued for recently by Ian Russell, Molly Russell’s father. I also support the calls in this debate for bereaved parents to be given retrospective access to their children’s social media accounts. With children’s safety and the future of our society on the line, the time for action is now.
It is a pleasure to serve under your chairmanship, Mr Twigg. I thank my hon. Friend the Member for Sunderland Central (Lewis Atkinson) for introducing this important debate on behalf of the Petitions Committee. He made some powerful points in his speech, and I look forward to hearing the Minister’s response to them. I also associate myself with the remarks made by my hon. Friend the Member for Darlington (Lola McEvoy), who spoke so eloquently, and my constituency neighbour, the hon. Member for Cheltenham (Max Wilkinson).
It is only right to start by paying tribute to Ellen and Bereaved Families for Online Safety, who are in the Public Gallery. They have raised this petition following the heartbreaking loss of Ellen’s son, Jools. It is unbelievably courageous to turn away from the abyss of pain and grief, and to turn that into a campaign for good. It is, frankly, inspirational to many Members in this House and I thank the 1,711 Gloucester constituents who signed the petition.
As a relatively new parent, I now know what it is like to really worry about a child. People say that no one really knows what it is like to be a parent before their child is born. I laughed that off and thought, “Of course you can know,” but, truly, no one does. The moment a parent meets their child for the first time, their world changes. I realised that I would literally walk through walls for my child. Thankfully, my child is only one year old. He does not yet have any social media channels that I am aware of. If that could continue for as long as possible, I would be grateful.
I know that parents are increasingly worried about their children’s use of social media across the country and in my constituency of Gloucester. According to the Children’s Commissioner, two thirds of parents are concerned about the content that their children can access online, and the impact that it is having on their children’s mental health. Those concerns are felt by not only parents but children themselves. Research shows that nearly three quarters of teenagers have encountered potential harms online and three in five secondary-school age children have been contacted online in ways that have made them feel uncomfortable. As social media becomes increasingly embedded in our children’s lives, it is clear that urgent action is needed. While the Online Safety Act 2023 was a long-awaited start to protecting children online, we must go further and I was pleased to hear my right hon. Friend the Secretary of State for Science, Innovation and Technology say that he is “open-minded” about introducing new legislation. I encourage him to consider introducing Jools’ law.
My remarks will not particularly focus on online content, although my hon. Friend the Member for Lowestoft (Jess Asato) outlined some of the dangerous and horrific online content to which children and young people are exposed. Members of this House do not need to be reminded of the horrific content available online. Not long after being elected, I found that pictures of myself and my baby had been posted with a threatening message, anonymously, as part of an online “spotted” page. Members of this House develop a relatively thick skin in this job, but imagine the impact of that on a teenager. I think back to my teenage days, and whether I was mentally equipped to deal with cases of bullying. I am just about young enough to remember when Facebook came in—I was a teenager when Facebook started—and back then it was a very different place. Nowadays, we see, frankly, a cesspit of online content, and I do not think that I would have had the mental capacity, during my teenage years, to deal with that level of abuse.
The hon. Member, my constituency neighbour, makes an interesting point. My school years are long behind me and we sometimes look back at our youth with rose-tinted glasses, but being at school can feel like being in a war zone—there is so much pressure. If someone is being consistently bullied, I can barely imagine what it must be like for them to try to escape that at home and then have a device in front of them with such material coming in, even at night. Does the hon. Member agree?
The hon. Member is entirely correct that, in the days before social media, bullying was confined behind the school gates and in most cases, though sadly not all, the home was a place of safety and a haven in which a teenager could recover and steel themselves for the next day. That safety has been removed by social media and people are able to get someone, wherever they are.
As a teenager, I grew up with social media as it was first coming out—with Bebo, which I do not think exists anymore, and MSN Messenger. My parents had no oversight over what I was accessing or who I was talking to. Frankly, it was dangerous. That is not to question my parents’ parenting skills; they are of an age where they are still learning how to use Facebook in 2025. However, we need to do more to protect teenagers at such a vulnerable age when they are learning about themselves and about how to build the mental resilience to deal with some of the stuff that some Members of this House experience on a regular basis.
Although we could have another debate on online content, we all know why we are speaking about the petition today. It concerns me that grieving parents cannot access information that may relate to the death of their children. We know that there has been a worrying rise in dangerous pranks and trends that go around on social media and in people using social media to groom and target young people, and that dangerous information is going viral—information that may have played a role in tragic and heartbreaking deaths up and down the country. Giving grieving parents the right to find answers for themselves must be within our power as a Government. I encourage the Minister to do what we can to support parents and families in this absolutely heartbreaking position.
It is a pleasure to serve under your chairmanship, Mr Twigg. I would like to start by thanking my hon. Friend the Member for Sunderland Central (Lewis Atkinson). I am sure all my constituents in Edinburgh South West who signed the petition will be grateful for the time he invested in crafting his introductory speech. Like others, I pay tribute to Ellen Roome. She has shown herself to be a fantastic campaigner and I am sure her family is proud of her. However, the situation she finds herself in is absolutely shameful. The hon. Member for South Devon (Caroline Voaden) is right to question the motives of the social media companies and to ask who would oppose them doing the right thing. Hopefully we will hear from them sooner rather than later on this issue. However, we have to accept that the framework within which these companies are operating was set by either action or inaction within this building, so we have a duty to fix it.
My children are all adults now, at least in theory—I hope they are not listening. However, when they were younger, it was always difficult to get the balance right between respecting their privacy and ensuring that they were safe in all aspects of life, but particularly online. We have to accept that the internet is a dangerous place. All of us are concerned about material online relating to eating disorders, self-harm and even suicide and, we should remind ourselves, have a duty to do more about it.
The hon. Member for Leeds East (Richard Burgon) is not in his place today, but before Christmas he had an Adjournment debate on the role of internet service providers in blocking the most harmful content, which had been linked to the deaths of vulnerable young people. I went along to that debate just to learn more about the subject, but was utterly ashamed and frustrated at what is happening. There are companies that seem to be looking for reasons not to do the right thing rather than find a way to support vulnerable people across our country.
I hope the Minister can support as much as possible what Ellen Roome and her campaign are asking for, but we all, including the Minister, have to go beyond that. It is not just about the law: it is about creating a culture online for our young people in which service providers and social media companies feel that they have a social responsibility to do all they can to support the most vulnerable people in our society.
It is a pleasure to serve under your chairmanship, Mr Twigg. I pay tribute to Ellen Roome in the most genuine and heartfelt way for what she has achieved and what she does. The pain she has been through is utterly unimaginable. What we can do today, as I hope the Minister’s response will, is make her bravery worthwhile.
I will focus my brief remarks on the Online Safety Act 2023, because so many of our hopes as parents, campaigners and elected representatives were pinned on that legislation. It is a step forward, but only a first step. I believe that more should be done. The Act was the product of a weak Conservative Government, with many Ministers and Back Benchers who shared the then Opposition’s conviction that strong regulation of social media companies was essential but were being held to ransom by extreme libertarians who had dressed up their ability to monetise hatred and extreme content as a free speech issue. Government Ministers gave in to an alliance of social media companies that were not willing to dilute profits to spend on effective moderation and that had a financial and political interest in driving engagement with extreme content. That was a deplorable outcome, as many hon. Members said at the time.
Plainly, as a first task, the new Government must make that legislation work as best they can. I understand why my right hon. and hon. Friends are pressing on with implementing it as effectively as they can. However, my request to them is to heed the petition and recognise that what was good enough for the then Government under those circumstances cannot be good enough for this new Government in the medium and long term. It is certainly not good enough for our children.
The immediate concern for me, for the people I speak to at the school gates and for many of my hon. Friends is that the proposed regulatory regime will let some of the most dangerous and extreme websites escape the proper regulation that the vast majority of people in this country expect them to receive. It cannot be right that we allow some sites to escape accountability for their failure to remove certain promotional material speedily simply because they are small. Of course they are small—such content is so vile that the chances of the promoters getting bigger audiences will always be limited—but the need for the firmest regulation in these cases is driven by content, not by size.
The failure of Ofcom to regulate the small but risky platforms seems to mean that a site such as LinkedIn is being regulated to a greater extent than platforms such as Telegram or Discord, which are overrun with far-right activism, self-harm, misinformation, homophobia and antisemitic content. Does my hon. Friend agree that that needs to be rectified and that Ofcom needs to raise its game?
I thank my hon. Friend for that intervention. I believe that more needs to be done. I do not believe that the Government have ruled that out: they are collecting evidence, so I believe that in future iterations of the code, if that argument is accepted by Ofcom, the appropriate changes will be made. It is up to us to continue to submit the evidence and to call for those changes to be made.
The main point in the debate is about the balance between regulation and innovation and about where we draw the line between the obligations of site users and those of content providers, so that we do not discourage new services and investment. However, I believe that that is not the issue that the petition we are debating addresses. Hatred and the data to which we are calling for access are not drivers of economic growth. Nor is the inclusion of high-harm sites in category 1 a barrier to investment plans for the frontline market leaders. This is about doing the right thing. I hope that all the voices will be heard today.
It is a pleasure to serve under your chairship, Mr Twigg, and to debate such a powerful and moving topic.
I want to start, as more or less everyone in this room has done, by paying tribute to Ellen for her tireless campaigning work. I cannot come close to comprehending the pain that the death of Jools must have caused you, even before it was exacerbated and extended by the lack of closure when social media companies refused to give you access to data that could help to explain what happened to him and when collectively, as a society, we showed ourselves impotent to compel them to do so. Your ability to turn that pain and that love into such a powerful petition and call for action, which motivated so many parents in my Hitchin constituency to sign up to it too, is a true inspiration to so many of us in the room. Just as we are inspired by you, we are inspired by the many families here with you today as part of the Bereaved Families for Online Safety support group, which has been doing so much important campaign work on the issue.
It is pretty clear to all of us who have spoken today that for far too long we have tolerated the belief that online harm is too tricky and too impractical to regulate in the way that we would regulate every other form of harm to which a young person could be exposed, often in contexts in which, as legislators, we have historically been more comfortable getting involved. We cannot tolerate that state of affairs any longer. Although there have been some steps forward, which are to be welcomed and which I will touch on shortly, the petition highlights several areas in which it is clear that we need to consider going further to ensure that we are all living up to our duty to do everything we can to protect young people right across society.
Both as a teacher and as a children’s social lead, I got to see at first hand some of the very real ways in which young people can be exposed to harm by our failure to act on the issue of social media and online harm over the past decade. As an MP now, I am always struck by the fact that, heartbreakingly, whenever I do an assembly, even in a lower school or a primary school, almost without fail there will be one young person, and often several, who will raise their hand and talk to me about an example where they have been made to feel unsafe or at risk online and ask what I, as their MP, am going to do about it. I know that urgency to act is felt by so many colleagues across party boundaries, both in this Hall today and across the House.
The petition focuses on particular aspects where we could do more to ensure that parents have both the ability to provide appropriate oversight of children on social media and access to really important data after bereavement. I know that, following the petition, there has been some important progress through the Online Safety Act in giving coroners and Ofcom powers to ensure that in certain circumstances they can support parents’ ability to access that data, but as colleagues have pointed out, there are clear areas in which it does not go far enough.
As my hon. Friend the Member for Sunderland Central (Lewis Atkinson) set out, it is pretty clear that at the moment some ambiguity in the legal position is being used as an excuse by social media companies not to act. We should not tolerate that; we should do all we can, hopefully as a Government, to clarify the position. Collectively, we should not be letting social media companies off the hook for not doing everything in their power to give the families access to the data where no right-minded individual could see any reason not to and where no right-minded individual or agency is likely to seek recourse against them for doing so. So many Members have rightly pointed out that that feels like an excuse, not a reason, to fail to disclose data. We should not tolerate that excuse from companies that we come into contact with in our work as representatives.
However, as has rightly been pointed out, it is important that we do not just look at this issue in the context of the existing legislation. We know that there are very real risks that the data could be deleted after bereavement. As my hon. Friend the Member for Darlington (Lola McEvoy) pointed out, compelling earlier notification to Ofcom and to the social media companies and online platforms that hold that data is surely a common-sense way of closing off, at source, that risk to getting justice and closure.
Moreover, the petitioner and many others rightly point out and ask us to reflect on the fact that if we are really interested in child safeguarding and keeping young people safe, it simply cannot be only after a tragedy, after bereavement, that the opportunity for parental oversight and involvement takes place. My hon. Friend the Member for Sunderland Central rightly set out some of the very real and justified concerns that children’s charities and advocates have about unfettered and complete access, but as many other Members have set out, that should not be the case. It should not be beyond us to reason and think through how, in exactly the same way that parents provide oversight over every other aspect of a young person’s life, they can have access to the best and most sensible ways to do so on these platforms, too. As the right hon. Member for East Hampshire (Damian Hinds) pointed out, thinking about the age at which young people can meaningfully and rightfully consent to opting out of parental oversight has to be part of that process.
It is fair to say that a few hon. Members are concerned that the implementation of the Online Safety Act, as it is currently envisaged and as some of Ofcom’s recent publications show, may fall short of doing justice to the importance of the issues. Whether it is considering what more we can do to protect young people from imagery and content relating to suicide and self-harm on social media, ensuring that we do not tolerate technical infeasibility as an excuse for tech companies not to act on the most egregious forms of harm, or having well-intentioned and important conversations about the right age of consent, to which many Members have alluded today, there is clearly a lot more that we can do collectively.
Will the hon. Member go further and say that Ofcom’s implementation so far has been weak, overly cautious and fundamentally disappointing? Does he concur that it is unfair to put parents in the intolerably pressured situation of being the policemen of their children’s social media activity?
Some aspects of how Ofcom has said it will take these matters forward are to be welcomed, but I absolutely agree with the underlying sentiment of the hon. Lady’s comment. Currently, what has been set out does not go anywhere near far enough. As representatives of our communities and of the families who want to do everything possible to keep young people safe from online harm, it is our responsibility to ensure that we are holding Ofcom accountable for being far more ambitious about how it can most creatively and robustly deploy the powers that we are giving it to keep young people safe.
I thank my hon. Friend for his impressive and articulate outlining of the debate so far. Will he join my calls for Ofcom to strengthen the upcoming children’s code and, as the code is not yet published, to use this opportunity to include functionality, a stronger dynamic risk assessment—a live document that will be constantly updated—and the measures that my hon. Friend has laid out for the smaller and riskier platforms?
I concur wholeheartedly. My hon. Friend has been a tireless campaigner on this issue, both in our debate today and throughout the time I have known her—a very short time, but an impressive one none the less. As she rightly points out, the children’s code is a real opportunity to do right by the intentions of the legislation and by the collective ambition that we are discussing today. From my hon. Friend to the children’s commissioner, campaigners on the issue are pretty united about the opportunity that a more ambitious code could deliver for safeguarding young people.
For far too long, we have allowed young people to be exposed to a level of harm online that we would not tolerate in any other aspect of life. It is potentially understandable, but not excusable, that as legislators we are sometimes more comfortable imposing restrictions or acting in areas where we have more direct lived experience, as in the Children’s Wellbeing and Schools Bill or the Tobacco and Vapes Bill. Those are tangible things that we are comfortable and used to voting and making laws on, whereas online harm can sometimes feel a bit more nebulous and a bit tougher. However, that is no excuse not to act. The failure to act is written across the tragedies experienced by so many families across the country and so many campaigners in the room today. We must do better, and we have to make sure that this is the Parliament in which we do.
It is a pleasure to serve under your chairmanship, Mr Twigg. I pay tribute to Ellen and all the families for their tireless campaigning and for this petition, which 119 people in my Bournemouth West constituency signed. Like my hon. Friend the Member for Worcester (Tom Collins), I want to talk about some of the broader harms of social media and smartphone use that students and parents have raised with me. This is particularly salient in the context of announcements this week about the removal of independent fact-checking from Meta and other platforms.
Last week, I visited Bourne academy, a secondary school in my constituency that is taking part in the Dorset boys’ impact hub project. The project aims to champion the experiences of young boys who face inequalities, and provide a platform for their voices. The boys I met raised a range of issues, from knife crime to the desire for safe and legal spaces to practise graffiti art, but the thing I found most interesting was that we got into a long conversation about the impact of social media in their lives and its uses in their socialising. Frankly, I was shocked by some of the things they told me about the content they are pushed on these platforms.
The boys mentioned Reddit and Instagram as particularly vile examples: as soon as they log on to those platforms, they are pushed distressing content such as videos of people being decapitated and other forms of serious violence. They said that Snapchat is a platform where group chats are regularly used to conduct cyber-bullying and to spread rumours, with very little accountability because all the messages disappear within 24 hours. Clearly, what we have at the moment is not working to protect young people. The boys also reported that age restrictions, whatever they are set at, are incredibly easy to get around.
[Sir Desmond Swayne in the Chair]
It is worth noting that Bourne academy has a no-phones policy. Kids are not even allowed to use them during breaks, which caused consternation among the young people. It is clear that smartphones and social media are changing how young people interact and socialise. This can expose them, with very little oversight, to high volumes of violent, inappropriate and harmful content of a kind that many of us have never encountered in our own lives. Social media is linked to wellbeing and self-esteem issues, which can have a direct impact on young people’s behaviour and the views they hold, and in some cases it is linked to significant increases in young people reporting anxiety, depression, self-harm and suicidal ideation.
Will the Minister outline what further steps the Government are taking to protect young people from harmful content and to prevent young people from circumventing age restrictions? We have discussed potentially raising those restrictions, but that is also key.
I have been contacted by parents about this issue, and there is growing support for the Smartphone Free Childhood movement in Bournemouth and throughout the country. One parent of a five-year-old daughter got in touch with me. He shared some very misogynistic content that he came across and expressed worry about it being so easy for her to access that content. Other schools in my constituency have spoken specifically about the impact of misogynistic content on school behaviour, and the extra effort they have to make to try to address it without the toolkits that they need.
It is not just about content. As my hon. Friend the Member for Lowestoft (Jess Asato) mentioned, platforms produce compulsive and addictive behaviours. Another parent, a father of two who is an active member of the Smartphone Free Childhood movement, raised with me the addictive design features, from the pull to refresh to the gamification, photo scrolling and push notifications. He has a lot to say on that because he was somebody who designed such features for smartphones and social media. He has a real concern about how they affect both children and adults. Does the Minister’s assessment of online safety include an assessment of the impacts of addictive and compulsive features? What steps is the Department taking to address them?
Among the parents I have spoken to there is a variety of views about what the answer is. Some want a ban on smartphones for under-16s; some want an Australia-style under-16 ban on social media; and some would prefer a school-led approach. Many are simply worried about how to push back on the social pressure that they get from very young children to have smartphones and access to social media. Will the Minister outline what steps her Department is taking to engage young people, parents and schools about the best way to find an appropriate solution?
As we have heard from many Members, technological change often moves faster than we can keep up with: we have the ability to regulate it, but regulation is a little too slow. I commend the Ministers and the Government for their work on this issue, but I encourage them to take heed of what we have heard today, to support the efforts of parents and teachers to help young people to grapple with these challenges, and to ensure that the measures we introduce reflect the need to protect and safeguard young people sufficiently.
It is a pleasure to serve under your chairmanship, Sir Desmond. I am delighted to contribute to this extremely important debate. I thank the hon. Member for Sunderland Central (Lewis Atkinson) for setting out in such a well-informed and balanced way the issues we are considering.
With your permission, Sir Desmond, may I take this opportunity to address Ellen Roome directly? I want to pay tribute to your great courage and bravery. I am a mother of two young children. I cannot begin to imagine what you have been through. To start this petition and push this campaign forward in the way that you have is—
No; all remarks are addressed to the Chair.
Okay. I will do so now. Thank you, Sir Desmond.
I pay tribute to Ellen for her campaign. I also want to say how cross I am that Ellen is having to push this campaign to get access to Jools’ online data. A number of us who were here during the passage of the Online Safety Act in the previous Parliament attended a meeting organised by Baroness Kidron, at which she brought together Ian Russell and some of the lawyers who supported him in Molly’s case. They talked powerfully about the battles they had to go through to access data. Baroness Kidron led a really strong campaign to change the law but, sadly, it has still not happened, which is why we are here today.
The use of social media accounts is now prolific across society, especially for young people. Ofcom’s 2023 Online Nation report highlighted the fact that children aged eight to 10 spent an average of two hours and 23 minutes a day online. That rose gradually to an average of four hours and 35 minutes online daily—the equivalent of 66 days online per year—for 15 to 17-year-olds. That is just an average; we all know that a number of young people spend far more time than that online.
The digital age has transformed the way we live, communicate and interact, and social media in particular has become an integral part of our daily lives, especially for children. Although the platforms offer numerous benefits, they also pose significant risks. As Liberal Democrats, we advocate a balanced approach that respects the privacy of our young people while ensuring their safety and wellbeing.
The right hon. Member for East Hampshire (Damian Hinds) touched on the point that of those aged eight to 17 with profiles on social media, video sharing platforms or messaging platforms, nearly six in 10 have said they use more than one profile on any particular social media app or site. When asked why, just under a quarter said it was because one account was just for parents and families to see, while a similar proportion said that one account was for close friends and one was for everyone else. Meanwhile, 13% of eight to 17-year-olds who had more than one profile said that one account was for the “real me” and another contained edited, filtered posts or photos. Those statistics tell us an awful lot.
Children themselves are concerned about their time online. An Ofcom report last year showed that young adults were less likely than older people to think they had a good balance between their online and offline lives. Another Ofcom survey showed that children’s concern about their time online increases with age. Indeed, last year the Children’s Commissioner published a brilliant report on the “Big Ambition” survey, in which she spoke to more than 367,000 children. The survey found that staying safe online was a huge issue and priority for many young people.
We must remember that young people want to be consulted and involved in the discussion and solutions. It is not just about us telling them what is right, and it is not about the tech companies telling them what is right: it is about involving young people in the solutions. That is why the ongoing inquiry into youth violence and social media by the Youth Parliament is so important. I urge young people throughout the country to participate in the inquiry by sharing their experiences on social media, and I keenly await the publication of the inquiry’s findings.
Behind each of the statistics I have cited are young people, their peers and their families. I have heard from some of those parents and young people in my constituency, and I thank the 465 people in Twickenham who signed the petition. I also speak as the mother of two young children. I have a six-year-old and a 10-year-old, and the 10-year-old is desperately begging her parents every single day for a smartphone. Some of her friends already have their own YouTube channel. We are trying to delay as long as we can—hopefully until some time into her secondary schooling—before we give her a phone. I know, as a parent and from hearing from other parents and young people in my constituency, that we as legislators have a responsibility to act.
I am afraid it was after many years of delay that the Conservative Government introduced the Online Safety Act. The Liberal Democrats welcomed a number of the measures in the Act as an important step forward, and we support its swift implementation. Empowering coroners to obtain information from online services about a deceased child’s online activity was a significant step in the right direction but, as we have heard so powerfully today, there is a strong case for parents to be able to access data after their child has died. That provision should be made retrospective as well. As others have pointed out, the data Bill provides an ideal opportunity to explore such a change and how it could work.
However, measures often come too late, and too many young people’s lives have been tragically lost already. We cannot afford to delay before we take some sort of action, and there is much more we can do to protect our children and young people online by putting in place more guardrails, as others have described them. Social media companies must do more to enforce their existing minimum-age requirements, using the latest age-verification technology. They must do more to create age-appropriate digital environments and increase transparency in their data practices. Ofcom should do more to use the full powers of the Online Safety Act, including looking at the harms caused by the functionality and design of social media, as well as the content.
After meeting organisations such as the Internet Watch Foundation, 5Rights and the Molly Rose Foundation, it is clear to me that we must push for not just strong regulation but safety by design. We must recognise that children and teenagers are particularly vulnerable to the dangers of the online world. Cyber-bullying, exposure to harmful content and online prejudices are just a few of the threats they face. Both the Government and the social media companies must do much more to protect children from harmful content and activity online. I would like to hear what the Government are doing to work with Ofcom to ensure that children are protected during the transition period.
We must also be mindful of the importance of privacy and trust. There are good reasons why parents cannot access children’s data while they are alive. That is an important safeguard, and we have heard some of the reasons for having it. Adolescence is a time of exploration and self-discovery, and young people need space to express themselves freely. However, that safeguard relies on children being kept safe online, which is patently not currently the case, so Ofcom and social media companies need to do much more on that front. Any measures that we implement must strike a delicate balance between safeguarding children and respecting their right to privacy.
Education is crucial to achieving that balance. Schools need to teach children about online dangers and how to use the internet and social media safely and responsibly. Parents must also be empowered to protect our children online—I say that as a parent who feels like I am way behind my younger children—including through digital literacy education and advice and support on best practice. Dare I say, although this is not necessarily a politically expedient thing to say, that we parents also have a lot of responsibility over how much time we allow our children to spend online and what devices we give them access to. It is hard when our children face so much peer pressure, but we need to take responsibility too.
The Education Committee report last year, “Screen time: impacts on education and wellbeing”, also called for education, as well as a cross-Government, holistic approach. It said:
“Government should work across departments including DHSC, DSIT, Education and the Home Office to produce guidance for parents on how to best manage and understand the impact of screen time on their children.”
I look forward to what the Minister has to say on that point. That is why the Liberal Democrats are also calling on the Government to create an independent children’s online safety advocate, as called for by the NSPCC, which would act like a consumer watchdog to promote and protect children’s interests. We must ensure that proper safeguards are in place and that children are not just protected from online harms but empowered to exercise their digital rights.
This petition on parental access to children’s social media accounts highlights a critical issue that demands our attention. As we navigate the complexities of the digital age, we must prioritise the safety and wellbeing of our children. By implementing thoughtful and balanced measures, we can protect our young people from the dangers of the online world while respecting their right to privacy. Let us move forward with compassion, determination and a commitment to creating a safer digital future for our children. Thank you, and with apologies, Sir Desmond.
It is a pleasure to serve under your chairmanship, Sir Desmond, and I thank the hon. Member for Sunderland Central (Lewis Atkinson) for introducing this debate. I would like to start by thanking Ellen Roome for her determined work in fighting to highlight this issue. Her courage and her stoicism in pursuing this cause have been hugely impressive, and Parliament would not be debating this today were it not for her impassioned commitment.
This e-petition has garnered some 126,000 signatures in support of calls to give parents and guardians the right to access the social media accounts of their children. We have heard many important contributions from Members this afternoon, and I am sure that parents across their constituencies will be grateful to them for doing so. The hon. Members for Cheltenham (Max Wilkinson) and for Darlington (Lola McEvoy) paid tribute to Ellen Roome and have shared her own words. The hon. Members for Sunderland Central and for South Devon (Caroline Voaden) spoke about the refusal of social media companies to release data, citing legal restrictions. The hon. Members for Worcester (Tom Collins) and for Lowestoft (Jess Asato) spoke of the impact of harmful content on children’s development, and my right hon. Friend the Member for East Hampshire (Damian Hinds) spoke about how current legislation gives control to children as young as 13.
With the vast majority of children now having access to a phone or tablet by the age of 12, children are exposed to an enormous range of content online. Many children are being exposed to social media content that is inappropriate and dangerous and poses substantial risks to safety and development. There has been a growing crisis in children’s mental health, with recent research highlighting that 32% of eight to 17-year-olds state that they have viewed worrying or upsetting online content in the last 12 months, yet only 20% of parents with children and teenagers in that age group report their child telling them they had seen something online that scared or upset them during the same timeframe. Evidence has shown that the widening of access to the internet has seen more children moving away from social interactions, with the subsequent detrimental impacts on mental health and social development.
We welcome much of the work that this Government are doing on protections for children by building on the foundations laid by the previous Government, but could I ask the Minister what is being done to increase mental health support for children? In January last year the Labour party pledged to introduce specialist mental health support for children and young people in every school, as well as open-access children and young people’s mental health hubs in every community, as part of the child health action plan. Although I appreciate that it is not part of her brief, could the Minister outline what progress the Government are making towards the delivery of those pledges, as they relate to this topic more broadly?
Keeping children safe online in the current media landscape is a challenge that will require agile and adroit legislation that simultaneously keeps pace with technological developments and reflects how media platforms are actually used. We also need to recognise the power that social media giants now hold; ensuring accountability will be a key aspect of any legislation. Parents must have the right to keep their children safe from harm on these platforms, especially in circumstances where children may be being mistreated.
I have previously heard Ellen describe how social media companies have abdicated responsibility in assisting in the disclosure of messages that could help to identify how a tragedy has occurred. In Jools’ case, TikTok has not released any of the messages on his account, and Meta, which owns Instagram, has released some but not all. Any parent should be concerned that they will not have the right to access details of their child’s online life, even if it is suspected to have contributed to their death. Parents like Ellen are currently required to take legal action to pursue the release of such information and, even if they have the financial resources to do so, why should any parent be forced to go to such lengths just to find out what may be, at best, critical information and, at worst, closure? The majority of parents do not even have access to such resources.
As a newly elected Member, I will not stand here and pretend that the previous Government got everything right, but the Online Safety Act was a crucial and positive step towards keeping more children and young people safe online, so that fewer families have to face situations like those we have heard and spoken about in this debate. Under section 101 of the Act, Ofcom has the power to support the investigation of a coroner or procurator fiscal into the death of a child via the data preservation measure. The measure came into effect under the previous Government in April last year, and it is under this section that the amendment that would be Jools’ law would sit.
Although the current iteration of section 101 is a step in the right direction, it is not an easily accessible outcome and it can only be put into effect following a tragedy. In many instances, parental access to social media accounts could prevent tragic outcomes. Do the Government plan to introduce legislation to give parents and guardians the right to access their child’s social media accounts and the messages contained within them? If they do, would that build on the Online Safety Act?
There are further considerations that must be taken into account, such as safeguarding. Though parental access to children’s social media accounts may sound like a simple and prudent solution, not every child has parental figures who have their best interests at heart, and that includes vulnerable children in a family with an abusive parent. A child who is seeking help in communicating domestic abuse to friends or organisations may find their only avenue of escape is compromised. There may also be instances in which a parent could use their child’s social media account to gain access to information about other children and teenagers. There are therefore wider implications to granting parents unrestricted access to the information of children other than their own, as that could unintentionally make unsolicited and inappropriate contact easier. Will the Minister consider how parental access rights could be designed to give parents the ability to monitor their children’s safety while preserving the privacy that children may need for their own protection, and how such measures could be designed so that they cannot be exploited by any of the parties subject to them?
I was reassured to see the Secretary of State for Science, Innovation and Technology meeting bereaved parents who have lost children after being influenced by harmful content online. I also welcome the publishing of the Secretary of State’s “Draft Statement of Strategic Priorities for online safety” in November last year, which provided clarity on the framework that the Government will expect the independent regulator to work within. The Secretary of State has stated that the Government will be
“implementing safety by design to stop…harm occurring in the first place”,
and we should consider whether the expectation should fall on users themselves to take precautionary steps to avoid severely harmful content. Given how instrumental algorithms are in pushing themed content to users’ feeds, what plans do the Government have to give users the ability to opt out or reset these algorithms?
We support parents in raising concerns about content they do not want their children to see by requiring sites to take measures to remove content as soon as it is flagged. Since the introduction of the 2023 Act, we have seen many cases in which the response from platforms has been far quicker than before, and we would welcome a detailed plan that lays out how the Government will ensure that all companies act quickly and the consequences for those that do not.
It is right that services must assess any risk to children from using their platforms and set appropriate age restrictions to ensure that child users have age-appropriate experiences and are shielded from harmful content, such as pornography or content relating to violence, self-harm, eating disorders or even suicide. That is why the last Government tightened up age restrictions by requiring social media companies to enforce their age limits consistently and protect their child users, but many parents still believe that these age limits are too easily circumvented by children lying about their age. The Government talk of ensuring that age-assurance technology to protect children is being effectively deployed, but how do the Government intend to ensure this? How do they intend to ensure that companies are investing in the most up-to-date technology to facilitate that? Will the Government proactively stress-test that capability and, if so, how?
For all of this, Ofcom plays a vital role. As an evidence-based regulator, its task is to regulate the trust and safety systems and processes. Its role is not necessarily to police individual pieces of content; it is to ensure companies have the correct measures in place to minimise harms to users. At the end of last year, we heard about how the Government had informed Ofcom that it would need to build more safety measures into these systems. I would welcome the Minister’s outlining how the Government will aid Ofcom in its aims and ensure that any Government support needed will be supplied. These regulations would mean little without empowering Ofcom to take action, which is why we gave it powers to issue fines of up to £18 million or 10% of global revenue, whichever is higher, or to pursue criminal investigations into senior managers if they fail to comply with enforcement notices. Will the Minister outline what steps the Government are taking to make sure that Ofcom brings forward its children’s safety codes and guidance in April?
As we have all seen, technology keeps moving and advancements are constantly made, so the risk of digital progress outstripping the pace of legislation is an all too real prospect. We must embrace technology and understand that the internet and social media, embedded in our daily lives, can be a force for good, but we must also understand that checks and balances are essential if we are to ensure a safe online environment not only for today’s users but for those newly entering the online world. It is for the Government not only to guarantee an environment conducive to users of all ages, but to ensure that parents have the confidence that the online environment can be made as safe as they strive to make the home environment.
It is a pleasure to serve under your chairmanship, Sir Desmond. I start by paying tribute to Ellen Roome both for launching this petition and for all the campaigning she has done in this area. Let us take a moment to remember her son, Jools. As a parent, I know that we do everything to keep our children safe. We teach them how to cross a road and why it matters not to talk to strangers—we do all we can, but it can still be terrifying to think about what our children are exposed to, even in the safety of our own homes. I can only imagine how it would feel for a parent not to know how or why their child lost their life. I know that parents across the country feel the same way.
As we have heard, Ellen’s petition received over 120,000 signatures between 10 May and the dissolution of Parliament on 30 May. That shows the strength of feeling on this issue, and I am grateful to the brave parents, including Ellen, Ian and others who campaigned on this issue during the passage of the Online Safety Act, who continue to shine a light on it. The Secretary of State has met them a number of times, and their views are absolutely crucial to the work we are doing in this area. Finally, I thank my hon. Friend the Member for Sunderland Central (Lewis Atkinson) for securing a debate on this e-petition on behalf of the Petitions Committee, along with other hon. and right hon. Members for their powerful contributions.
I know how long it has taken to get the Online Safety Act across the line. It is not a perfect piece of legislation, and the delay in delivering it has come at a heartbreaking human cost. As the Secretary of State has set out numerous times, we are working to implement the Act as quickly as we possibly can so that the protections it puts in place can begin to change the online world that our children experience.
The Act has two provisions relevant to this debate. First, section 101 seeks to address problems faced when there is uncertainty over the circumstances leading to the death of a child. The provision supports coroners and procurators fiscal in their investigations by giving Ofcom the power to require information about a child’s online activity following a request from the investigating coroner. It is already in force, and coroners have begun to make use of the powers available to them.
Secondly, section 75 imposes additional duties on categorised services to be transparent with parents regarding a company’s data disclosure processes following the death of a child. We have been clear that we plan to build on the Online Safety Act where it does not go far enough, and the Secretary of State only yesterday set out how the Online Safety Act is uneven and, in some cases, unsatisfactory. He also set out the need for Parliament to learn to legislate much faster—we cannot wait another 10 years to make changes to the legislation.
At the end of last year, the Secretary of State decided to use his powers to issue a statement of strategic priorities to Ofcom, asking it to ensure that safety is embedded in our online world from the very start. That is why the Government will also seek to establish a data preservation process through clause 122 of the Data (Use and Access) Bill. The proposed clause will require Ofcom to issue a data preservation notice to specified companies at the request of the coroner or, in Scotland, the procurator fiscal. That will require these companies to preserve information relating to the use of their services by the child who has died. This proposal fulfils a manifesto commitment to further strengthen powers, and will help coroners understand the tragic circumstances surrounding a child’s death.
Let me turn to the matter of coroners sharing information with families. Interested persons, including bereaved families, have the right to receive evidence from coroners, subject to their judicial discretion. The chief coroner has provided detailed guidance on this. Coroners have a statutory duty to issue a prevention of future deaths report if their investigation reveals that future deaths could be prevented by one or more measures. Evidence accessed via Ofcom powers will help to inform a decision on whether a report should be issued.
I know from parents and children just how complex this issue is. The Secretary of State recently visited the NSPCC, where he met a group of young people to understand more about their lives online. The NSPCC was concerned that giving parents complete access to their children’s social media accounts could raise complex issues around children’s rights to privacy and, in extreme cases—as we have heard today—safeguarding. For example, as raised earlier, if a child is exploring their sexuality online, they may not want their parents to know and they would be right to expect that privacy.
All Members raised the retrospective application of section 101 of the Act. Ofcom’s powers to require information from companies on behalf of coroners can still be used where a second coroner’s inquest is ordered. Ofcom can use these powers on the instruction of a coroner. Ofcom will also be able to use data preservation notices in the event that a second coroner’s inquest is ordered. Any personal data that is captured by the data preservation notice, and held by the online service at the time of issue, will still be in scope and must be retained upon receipt of the notice. However, I have heard very powerfully from all Members today about the lengths parents have to go to in order to request a second inquest and about the associated costs. As I have said, the legislation is not perfect and there is room for improvement, and I would like to meet Members and parents to explore this matter further. We need to continue to review the legislation.
When it comes to age limits, a smartphone and social media ban for under-16s has been raised. We are aware of the ongoing debate as to what age children should have smartphones or access to social media. As the Secretary of State for Science, Innovation and Technology has previously said, there are no current plans to implement a smartphone or social media ban for children. We will continue to do what is necessary to keep our children safe online.
On that note, we have heard from several Members today about their concerns for children’s mental health, when their expectations are often measured against heavily doctored images they see online. Will the Minister commit to use and/or amend legislation that commits hosts, as is common with regulated news outlets, to clearly identify doctored imagery, and the accounts and pages that spread it?
I will come to that point.
On the issue of a ban on smartphones and social media for under-16s, we are focused on building the evidence base to inform any future action. We have launched a research project looking at the links between social media and children’s wellbeing. I heard from the hon. Member for Esher and Walton (Monica Harding) that that needs to come forward and I will pass that on to my colleagues in the Department.
My hon. Friend the Member for Lowestoft (Jess Asato) mentioned the private Member’s Bill in the name of my hon. Friend the Member for Whitehaven and Workington (Josh MacAlister). We are aware of his Bill and share his commitment to keeping children safe online. We are aware of the ongoing discussion around children’s social media and smartphone use, and it is important that we allocate sufficient time to properly debate the issue. We are focused on implementing the Online Safety Act and building the evidence base to inform any future action. Of course, we look forward to seeing the detail of my hon. Friend’s proposal and the Government will set out their position on that in line with the parliamentary process.
My hon. Friend the Member for Darlington (Lola McEvoy) raised the issue of Ofcom’s ambitions. Ofcom has said that its codes will be iterative, and the Secretary of State’s statement will outline clear objectives for it to require services to improve safety for their users.
The hon. Member for Twickenham (Munira Wilson) and my hon. Friend the Member for Bournemouth West (Jessica Toale) mentioned engagement with children, and we know how important that is. Ofcom engaged with thousands of children when developing its codes, and the Children’s Commissioner is a statutory consultee on those codes, but of course we must do more.
The hon. Member for Huntingdon (Ben Obese-Jecty) raised the matter of mental health services and our commitment in that regard. He is right that the Government’s manifesto commits to rolling out Young Futures hubs. That national network is expected to bring local services together to deliver support for not only teenagers at risk of being drawn into crime, but those facing mental health challenges, and, where appropriate, to deliver universal youth provision. As he rightly said, that is within the health portfolio, but I am happy to write to him with more detail on where the programme is.
We want to empower parents to keep their children safe online. We must also protect children’s right to express themselves freely, and safeguard their dignity and autonomy online.
The Minister spoke earlier about age limits. I was not sure if she had finished responding to Members’ comments and questions, and whether she would be able to comment on not only what the various age thresholds should be, but what they mean. In particular, if the GDPR age is 13, does that mean that parental controls can effectively be switched off by somebody of age 13, 14 or 15?
I am sure the right hon. Gentleman’s party would have discussed the issue of the age limit and why it was 13 during the passage of the Online Safety Act.
I am more than happy to write to him in detail on why the age limit has been set at 13. As I said, there is currently a live discussion about raising the age and evidence is being collated.
The challenge of keeping our children safe in a fast-moving world is one that we all—Government, social media platforms, parents and society at large—share. As we try to find the solutions, we are committed to working together and continuing conversations around access to data in the event of the tragic death of a child.
I will finish by again thanking Ellen for her tireless campaigning. I also thank all the speakers for their thoughtful contributions. I know that Ellen has waited a long time for change and we still have a long way to go. Working with Ellen, the Bereaved Families for Online Safety group, other parents and civil society organisations, we will build a better online world for our children.
I thank Members from across the House for their contributions to this well attended and powerful debate. We have demonstrated significant consensus across the House about the need to continue to strengthen and evolve online regulation, which, as the petition shows, is a matter of significant public concern. I think it is fair to say that the public expect Parliament to work on a cross-party basis on these issues, and I hope that in future months we can continue the debate in the spirit in which it has been conducted today.
I thank the Minister for her response. I particularly welcome her indication that the Government will give consideration to any potential changes to the forthcoming data Bill to put it beyond doubt that, in historic cases such as Jools’, parents have rights to access data following bereavement, and that the powers given to coroners can be applied without undue financial burden on, or campaigning by, parents like Ellen.
Of course, the people that we have not heard from today are the social media companies, although I am sure they are listening to us all. I invite them to demonstrate that they genuinely want to do the right thing—to get in touch and do the right thing for bereaved parents like Ellen; not to hide behind data protection regulations; and to actively engage with policymakers and families to get this right, to allow children to be protected and, in tragic cases when a child has died, to give parents the opportunity to grieve properly.
I thank the Petitions Committee staff for their assistance in organising and preparing the debate, which it has been an honour to lead. It is right that I use the final words of the debate to pay tribute to the 126,000 petitioners, and in particular to Ellen Roome: may Jools’ memory always be a blessing, and I sincerely hope that today’s debate helps you and other bereaved families get the data and the answers that you need.
Question put and agreed to.
Resolved,
That this House has considered e-petition 661407 relating to children’s social media accounts.