Children’s Social Media Accounts

Damian Hinds Excerpts
Monday 13th January 2025

Westminster Hall

Damian Hinds (East Hampshire) (Con)

It is a pleasure to see you in the Chair, Mr Twigg.

I join colleagues in thanking the petitioners, and Ellen Roome in particular, for initiating the petition and enabling this Westminster Hall debate. We were all deeply affected by hearing the statement that was just read out. Ellen, you have the sympathies of everybody here on the loss of Jools aged just 14. We think also of other bereaved families and other campaigners—in the last few days we have been reminded of Ian Russell and the work he has done since the tragic death of Molly—and all those who take the most unimaginably awful situation for a parent and a family and use it to try to make something better for others for the future.

The Government’s response to the petition notes not only that, under the Online Safety Act, platforms have to set out their policy for dealing with such tragic situations, but that the Act

“introduces measures to strengthen coroners’ ability to obtain information”

from platforms via Ofcom, thereby providing a route for parents. We will have to see how that works in practice and how timely it is. What we must not do is put a new, onerous layer on top of parents at the most difficult time imaginable, as they are grieving.

As has been mentioned, there is also the question of historic cases. There will be future historic cases, because not in every case will the inquest have covered this question. I hope the Minister will be able to say a word about whether the data Bill is the opportunity to put it beyond doubt that, ultimately, the parent has an absolute right, with the right safeguards and verifications, to see the information related to their child.

Let me turn from the most tragic of cases to all families and all children. I start with the most important point, which is that trust, support and love within families are the most effective things. Most of the time it is irrelevant what the law is because, within families, we set our own rules. Generally, it is clear that even if our rules are, at times, a pain for our children, they are well-intentioned. We must also note that not quite all families are loving families. Some parents are abusive, and children must always have ways confidentially to seek help from child protection services, the police, the health service and bona fide charities. That applies at any age.

It is also true that everyone needs a degree of privacy, but there have always been different degrees of privacy, and how private something is should be proportionate to the level of risk involved. In discussing accessing online services, we are talking about things that can have very serious consequences. We want and need to be able to protect our children from harm—from bullying, from unwanted contact, including from adults, and from being drawn to dangerous interests, which can become dangerous obsessions. We also have a responsibility, and should be held responsible, for ensuring that they do not perpetrate harms on others. Although we trust our children, we know that children do sometimes get into trouble and can come under pressure, and in some cases severe coercion, from others. Of course, they potentially have ready access to material of all sorts that is much more harmful than we had as children. They can go deeper and deeper down rabbit holes.

Parents are not the only ones who can help children, but they have a unique position in children’s lives and are uniquely placed to help and support them. That is why I agree in principle with the petitioner that parents should have a right to see what their child is subjected to or is doing for as long as they are a child and we, as the parents, are responsible for them—and that means at least until age 16. There is a separate debate to be had about the extent of that, and what the threshold and process should be. I understand entirely what the hon. Member for Sunderland Central (Lewis Atkinson) was saying. I do not think anybody is proposing constant, ongoing monitoring, but there are situations that a child could find themselves in that I believe warrant the availability of that access.

There is also a problem, or a hurdle, with the principle: we can only request access to something that we know exists. It is common for children to have multiple social media accounts on a single platform. They probably have different names these days, but people used to call their fake and real accounts finsta and rinsta. The account their mum sees is not necessarily the real one—ironically, the one that was called “fake” was the one where their real lives were actually happening. Of course, they could also be on lots of other platforms that parents and others do not necessarily know about.

I agree with the hon. Member for Sunderland Central, who opened the debate on behalf of the Petitions Committee, that it is of paramount importance that we are able to put some guardrails around what children can access. That is one of the reasons we have parental controls. How those controls work, and the limits of them, are what I want to talk about this afternoon.

I will read out a short note sent by Microsoft, which is not a company that people normally worry about—it is a very responsible operator—to a constituent ahead of their child’s 13th birthday. It says:

“Congratulations on Fred’s birthday. At this age, certain laws allow them to have more control and choices over their own account settings. This means that they’ll be able to change a number of family safety settings, even if you already have them set up. Fred will also need to allow you to continue receiving data about their activities to guide their digital journey. They can turn off your ability to see their activity on Windows, Xbox, and Android devices. They can turn off your ability to see their devices and check on updates…safety settings like firewall and antivirus…They can stop sharing their location through their mobile phone.”

That was for a child approaching their 13th birthday, which leads me to question what “certain laws” are being cited. I can only assume it is the Data Protection Act 2018, which sets out that

“a child aged 13 years or older”

can

“consent to his or her personal data being processed by providers of information society services.”

The genesis of that was European law, and Parliament was debating and voting on it in parallel with, but before actually completing, exit from the European Union. The age 13 is not universal. EU law specified a range between 13 and 16, and multiple countries did select 13, but not all. France set the age at 15, with some limited non-contractual consents for data processing allowed between 13 and 15. Germany and the Netherlands set the age at 16. There is that question of what is the appropriate age, but the other big question is what that age actually means.

The 2018 Act was passed before we considered the Online Safety Bill, which became the Online Safety Act 2023, but we were already concerned in this House about online safety, and I am fairly sure that it was not Parliament’s intent to reduce parental oversight. In particular, I do not think saying that a service can have a child sign up to it at 13 is the same as saying that the parent cannot stop them. Still less is it the same as saying that the parent should not be able to know what their child is signed up to.

In setting out why the age was set at 13, the explanatory notes to the 2018 Act say, quite rightly, that that is in line with the minimum age that popular services such as Facebook, WhatsApp and Instagram set, but they go on to say, slightly unrelatedly:

“This means children aged 13 and above would not need to seek consent from a guardian when accessing, for example…services which provide educational websites and research resources to complete their homework.”

I think that sentence might have a lot to answer for. It sounds very sensible—we would not want children having to get over hurdles to finish their homework—but if we think about it, it is not necessary to sign up to research something on the internet for homework anyway, and educational websites are generally exempt from consent requirements. But the big question is, what else might it allow—or, crucially, what else might it be interpreted to allow?

I repeat that I do not believe that it was Parliament’s intent in effect to disable parental safety controls for 13, 14 and 15-year-olds. There is a whole other question about those safety controls themselves and how they work, and how difficult it can be for parents—and even all of us, who tend to think we are quite good at this sort of thing—to keep on top of them, particularly if they have multiple children, different operating systems and multiple platforms. There really should be a single industry standard entry system that can cover all of screen time and basic, entry-level approvals with a default “safety on” version of the different platforms.

We talk about age thresholds and age limits; there is a whole other set of questions about how those apply and how we make age assurance or age verification work properly. Those are both debates for another day. Today, I simply ask the Minister: is it the Government’s understanding of the existing legislation that children under 16 should be able to switch off parental controls? If not, what could be done to clarify the situation? Is a change needed in primary legislation?

--- Later in debate ---
Feryal Clark

I will come to that point.

On the issue of a ban on smartphones and social media for under-16s, we are focused on building the evidence base to inform any future action. We have launched a research project looking at the links between social media and children’s wellbeing. I heard from the hon. Member for Esher and Walton (Monica Harding) that that needs to come forward and I will pass that on to my colleagues in the Department.

My hon. Friend the Member for Lowestoft (Jess Asato) mentioned the private Member’s Bill in the name of my hon. Friend the Member for Whitehaven and Workington (Josh MacAlister). We are aware of his Bill and share his commitment to keeping children safe online. We are aware of the ongoing discussion around children’s social media and smartphone use, and it is important that we allocate sufficient time to properly debate the issue. We are focused on implementing the Online Safety Act and building the evidence base to inform any future action. Of course, we look forward to seeing the detail of my hon. Friend’s proposal and the Government will set out their position on that in line with the parliamentary process.

My hon. Friend the Member for Darlington (Lola McEvoy) raised the issue of Ofcom’s ambitions. Ofcom has said that its codes will be iterative, and the Secretary of State’s statement will outline clear objectives for it to require services to improve safety for their users.

The hon. Member for Twickenham (Munira Wilson) and my hon. Friend the Member for Bournemouth West (Jessica Toale) mentioned engagement with children, and we know how important that is. Ofcom engaged with thousands of children when developing its codes, and the Children’s Commissioner is a statutory consultee on those codes, but of course we must do more.

The hon. Member for Huntingdon (Ben Obese-Jecty) raised the matter of mental health services and our commitment in that regard. He is right that the Government’s manifesto commits to rolling out Young Futures hubs. That national network is expected to bring local services together to deliver support for not only teenagers at risk of being drawn into crime, but those facing mental health challenges, and, where appropriate, to deliver universal youth provision. As he rightly said, that is within the health portfolio, but I am happy to write to him with more detail on where the programme is.

We want to empower parents to keep their children safe online. We must also protect children’s right to express themselves freely, and safeguard their dignity and autonomy online.

Damian Hinds

The Minister spoke earlier about age limits. I was not sure whether she had finished responding to Members’ comments and questions, and whether she would be able to comment not only on what the various age thresholds should be, but on what they mean. In particular, if the GDPR age is 13, does that mean that parental controls can effectively be switched off by somebody aged 13, 14 or 15?

Feryal Clark

I am sure the right hon. Gentleman’s party would have discussed the issue of the age limit and why it was 13 during the passage of the Online Safety Act.

--- Later in debate ---
Damian Hinds

Will the hon. Lady write to me?

Feryal Clark

I am more than happy to write to him in detail on why the age limit has been set at 13. As I said, there is currently a live discussion about raising the age and evidence is being collated.

The challenge of keeping our children safe in a fast-moving world is one that we all—Government, social media platforms, parents and society at large—share. As we try to find the solutions, we are committed to working together and continuing conversations around access to data in the event of the tragic death of a child.

I will finish by again thanking Ellen for her tireless campaigning. I also thank all the speakers for their thoughtful contributions. I know that Ellen has waited a long time for change and we still have a long way to go. Working with Ellen, the Bereaved Families for Online Safety group, other parents and civil society organisations, we will build a better online world for our children.