(2 weeks, 2 days ago)
Commons Chamber
Our children’s use of phones and social media gives us many things to worry about, but broadly speaking they fall into three categories.
The first is about content: from pornography and violence, and the insidious effects of curated lives, influencers and celebs on our children and their sense of self-worth, their body image and so on, through to dodgy news and views propagated across the internet not on merit, let alone veracity, but on engagement and likes. All of those things have vortexes that children can get sucked deeper and deeper into.
The second is about contact. Contact includes, in the worst cases, child abuse and the generation of child sexual abuse material, and, at a lower level, contact from other children, such as what we call in this House cyber-bullying, although no child ever uses that phrase; they just talk about people being very mean to each other online.
The third is about the sheer amount of children’s time that gets sucked into these activities. It is the compounding factor, because it makes the other two, content and contact, worse and more risky. It also affects children’s sleep, their concentration and even their physical development, and it crowds out the other things that we want children to be doing and that children themselves want to be doing, when they actually get to do them. If we ever do get a child away from their phone for a full weekend, they talk about how wonderful the experience with their friends was.
The Online Safety Act 2023 did some good things on content and on contact. There was more to do, but it made some good progress. We have a lot more to do, in particular on the topic of time and the addictiveness of social media, and that is where I think the work of the hon. Member for Whitehaven and Workington (Josh MacAlister) has been incredibly valuable. I commend him on all his work in the lead-up to this point and his use of convening power to bring together so many individuals and organisations. Those conversations, some of which I had the opportunity to attend, covered a huge range. Obviously the Bill we have in front of us today is, shall we say, somewhat thinner than the Bill envisaged.
I also attended events that the hon. Member for Whitehaven and Workington pulled together. Does my right hon. Friend agree that the strong characteristic that came out of all of them was deep and profound anger among parents about what has been allowed to develop?
I think that is right. The other thing I was struck by in some of the sessions was the great unity of views. Whether it was trade unions, charities, parent groups, doctors or parents, there was a great commonality of view about what needed to be done.
I understand what happens sometimes with private Members’ Bills and the need to make progress and to have Government support, but I say to the Government that this is a huge missed opportunity. If the Minister looks behind him, he will see all his colleagues who have rearranged their Fridays and rearranged their surgeries and all their appointments because they believe in this subject. He should heed the list that his hon. Friend the Member for Whitehaven and Workington read out of all the organisations that came together in support of action in this area. It is so worth doing, and we have made good progress with the Online Safety Act, but there is further to go.
There are things we can do with a private Member’s Bill that it is sometimes harder to do with Government legislation, because of the party political controversies that come in. This is a missed opportunity, because, being at the top of the ballot, this may well be the only private Member’s Bill in this area with a good chance of success in this entire Parliament.
The Bill as drafted is unlikely to require this House to divide, because there is not much in it that anyone could disagree on. I will, if I may, focus my comments on the things that the Bill envisages, such as the chief medical officer’s advice for parents on the use of smartphones and social media, and the plan for research that the Secretary of State will prepare on the effect of the use of social media on children and the appropriateness and effectiveness of the so-called digital age of consent. I will say one very simple thing to the Minister about that research: the evidence is not perfect today; it will not be perfect in one year; it will never, ever be perfect. If we hang around waiting for perfect evidence, we will never act in the way that we should. Why is it not perfect? Because this is a phenomenon that has happened across the entire world at the same time. There is no control group.
Given that this is such a huge topic, the studies that there are, which try to narrow it down to something manageable, tend to end up looking at either Facebook or Twitter, neither of which is particularly relevant for teenagers. When we have proxy studies, they are generally inadequate. For phone use in schools, studies tend to look at a school that has a phone ban and a school that does not. That is a totally invalid scientific comparison, because there could be all sorts of other things going on, and the sort of school that is likely to do well in GCSEs is also likely to bring in a phone ban, so we cannot prove the direction of causality.
People will also tell us that there has not been enough time, because the technology is constantly developing. It may have been around for 20 years or so, but the current version of it has only been around for 18 months, so there has not been time to say conclusively what the effects are. None of that is about to change. The evidence will continue to be imperfect.
However, the evidence that we do have is pretty clear. We know, as the hon. Member for Dulwich and West Norwood (Helen Hayes), who chairs the Education Committee, mentioned, that there can be some benefit from relatively small amounts of screen time. The 2019 programme for international student assessment—PISA—study covered this in some detail, looking at multiple countries. It talked about a “Goldilocks” effect, whereby about an hour of screen time a day seemed to be correlated with increased wellbeing. But the same study found that in almost every country studied, with the fascinating exception of the Dominican Republic, high levels of internet usage were associated with lower levels of life satisfaction. There are lots of other studies, which colleagues have referred to, that look at happiness, quality of relationships, eyesight, sleep, concentration and so on.
Then there is the rising prevalence of mental ill health in young people. Often, when people look at the numbers on mental ill health, particularly in teenagers, they reach immediately for their preferred explanation for why teenagers are having these difficulties, and sometimes it gets quite political. It is important to note that the rise in teenage mental ill health is not a uniquely British phenomenon. On the two main measures of mental wellbeing used in the 2021 UNICEF-Gallup “Changing Childhood” study—“How often do you experience feeling worried, nervous or anxious?” and “How often do you experience feeling depressed or having little interest in doing things?”—the UK was broadly in line with the average of 21 countries, including France, Germany and the US. Actually, it was slightly better on most of the measures.
There are ample other studies from around the world, including the World Health Organisation’s multi-country “Health Behaviour in School-aged Children” study, France’s EDC—I will not attempt the language—study, which is quite a long time series, and the shorter time series in the United States, “Trends in Mental Wellbeing”. The best study of all is the NHS’s “Mental Health of Children and Young People in England”. I say in passing to the Minister that I do not think we have yet had a commitment from the Department of Health and Social Care to carry on with that time series. It is incredibly valuable, and that is a relatively simple thing that the Government could do.
I have said that the rise in teenage mental ill health is not a uniquely British phenomenon. It is also not only about covid. A lot of the studies in recent years have set out to answer the question, “What happened to children’s mental health during covid?” That is a perfectly legitimate question, but if we look at the shape of the curve, it looks very unlikely that it started in covid, and in the NHS study, it carries on growing long after covid, up until the most recent wave.
The Minister said this in a debate in Westminster Hall the other day, and he is right that it is entirely invalid to infer causality from correlation, but the Bradford Hill criteria, which his hon. Friend the Member for Whitehaven and Workington mentioned, are relevant, particularly the criteria of consistency, strength, plausibility, coherence and analogy, as well as temporality. In any event, it seems odd that we allow something to happen to our children because we cannot 100% prove that it causes harm, rather than because we can prove that it is safe. That is not the way in which we deal with children’s toys, food or medicine.
I turn the question around and say to people who query the direction of causality: with something like self-harm, are you honestly trying to tell me that incidents of self-harm in our country are nothing to do with the prevalence and normalisation of imagery around self-harm on social media? As I say, I worry that if we continue to seek perfect information, we simply will not act as we should. I have pages more to say, but I will not say them, because I know that many colleagues wish to speak.
(3 weeks, 6 days ago)
Westminster Hall
I thank Kim Campbell and the petitioners, including almost 400 from East Hampshire, for bringing this debate to Parliament. There has been a lot of interest of late in Australia’s upcoming ban on social media for under-16s, and I was interested in how the Australians are going to implement it, considering some of the complexities and definitional difficulties. I recommend to colleagues a very good interview on American National Public Radio with Australia’s eSafety Commissioner, in which she said that it is not about flicking some big switch. She said that there was a possibility that some social media functionality could be removed, rather than an entire app being blocked; that
“messaging and gaming sites and anything that delivers education or health care information”
would be exempt; and that, ultimately, it would be for the Minister for Communications to
“decide which platforms are in and which are out.”
Well, I hope they have invested in their legal defence budgets.
It is true that parents vary widely in what they think is good or acceptable. Everybody agrees that their child should be able to call or text home to let mum or dad know that they are delayed or feeling worried, or that their club has been cancelled. Some also value things that can be done only on a smartphone—such as using a map to find the way home—and there is a whole other debate about education technology and the use of Show My Homework and all the rest of it. Some parents are totally happy with the entirety of the electronic world—smartphones and social media. Let us be honest: it is parents who often help children get around the minimum age limit to be on these platforms. Sometimes, we say that they do that only for fear of the child missing out, and that may be true, but we do not know that it is so in the majority of cases.
In addressing these questions as legislators, we often fall back on saying, “Hang on, we’re not talking about banning all phones; we’re talking specifically about smartphones. And we’re not talking about getting rid of the good stuff; we’re only talking about getting rid of the bad stuff.” This, of course, is the easy stage in the legislative process, and things become much harder later, when we have to define precisely what we mean. I am about to recommit that sin: I am going to talk about an ill-defined “it” that we may in some way want to restrict. That “it” is something about smartphones and social media that I will today fail to define, but I hope to come back at the end to say a little about more precisely what I mean.
I am not in the business of trying to put new restrictions on how parents manage their families or of trying to do things to them that they could do for themselves. There is already a minimum age for using social media; it just happens to be an arbitrary age that is based on some legislation—not even from this country—from the 1990s. When the GDPR came in through the European Union, which we were in then, countries could choose an age anywhere between 13 and 16. Different countries chose different ages; we happen to have settled on 13. Most people would say that we have to set the bar somewhere, so the question becomes, where? Of course, we could, alternatively, say that the Government or a regulator have no role in setting an age at all. However, if that is not our view, and we accept that there should be an age, we have to ask the secondary question: what should it be? There is no ancient right to be on TikTok at age 13. These are novel technologies, and we are facing these questions now for the first time.
In this country, there are two main thresholds for the transition from childhood to adulthood, and they are 16 and 18. Those are not the only ones, but they are the main ones. In English law, there has never been a concept of an age of digital consent and nor, to my knowledge, was there a non-digital concept of consent in contract law previously for somebody under the age of majority. I grant that it is arguable, but it seems that 16 is the most appropriate threshold.
I have met parents from Smartphone Free Childhood, but also young people. This is a big issue in Brighton Pavilion. Has the right hon. Member thought about pushing for the Minister and Members to talk more with young people about where the age limit should lie, rather than trying to come up with a number in the middle of a debate? It is clear from talking to young people that they feel that parts of social media are very toxic, but I also think they are best placed to judge where the limit should lie.
To be fair to the Minister and previous Ministers, I think they do make efforts to hear from young people. An interesting survey by the Youth Endowment Fund, which I commend to the hon. Lady and others, put an extreme proposition to 13 to 17-year-olds: “If you could turn off social media forever for you and everybody else, would you do it?” While a majority did not say yes to that extreme proposition, something like a third did. We also have various other surveys.
It is true that when we talk to children, as I am sure many colleagues have done in schools across their constituencies, we get a variety of views. In particular, children do not want to be left out, and as parents we do not want that for our children either. If everybody else is in a certain group or has a certain means of communication, we tend to want our children to have that too.
The evidence is not perfect. There is even evidence that some screen time is a positive good. A programme for international student assessment study in 2019 talked about a Goldilocks effect, where about an hour of screen time was beneficial for mental wellbeing, after which the benefit declined. That same study found wide differences in life satisfaction between what it called “extreme internet users” and others. There are now plenty of other studies on everything from happiness, the quality of relationships and eyesight to the effect on sleep and concentration.
There is also the rising incidence of mental ill health among teenagers, which—for the avoidance of doubt and to take politics out of it—is not unique to this country and not uniquely a post-covid effect. Causality is still hard to prove, but it seems extraordinary that, when we are talking about children, we allow something to happen because we cannot prove 100% that it causes harm, rather than allowing it to happen only if we can prove that it is safe. That is not the way we deal, for example, with children’s food or toys. I would turn the question around: are people really suggesting that the prevalence of self-harm is nothing to do with the prevalence and normalisation of certain imagery on social media?
The Online Safety Act was a landmark piece of legislation, and we will debate it again in Westminster Hall on Wednesday. Everybody who worked on it—including myself—was always clear that it would not be the last time we had to come back to this subject in legislation. It is inevitable that there will be further regulation and restrictions in the interests of greater child protection. I therefore urge the Government to move from working out whether there will be further protections to working out what those will be. Of course, to write legislation—to return to where I started—one needs to be able to define things precisely and, in reality, there is no bright line between a smartphone and a brick phone, and no slam-dunk definition of social media either.
It can be instructive to think about individual platforms and services. One of the things we worry about is TikTok. Do we worry about Snapchat? Yes, we probably do, because of the association with bullying and the disappearing messages. But some families like the snap friends function, because they can see where different family members are. Do we worry about Instagram? Yes, we probably do, and it has a particular association with issues around body image. But it is also a way for people to share lovely family photos, and for extended families to keep in touch.
A lot of families allow children to have WhatsApp, when they would not allow them to have TikTok, and up until quite recently, some would not even have called it a social media platform. Where we think we have problems with disinformation on TikTok and Facebook, other countries have them with WhatsApp. What about YouTube? For many people, YouTube is not social media; it is a place where they go to watch videos or for music. But because it has user-generated content, it is also social media; it is certainly capable of sucking up a lot of young people’s time, and it has potential rabbit holes that people can fall down.
What about gaming? Gaming is different from social media, but modern gaming also has quite a lot of social media-like functions, such as lists of friends. Certainly, it is a way of trying to create communities of people with common interests. It is also often linked to the use of Discord or to streaming on Twitch. And, again, it certainly takes up a lot of time—unless, of course, someone is in China, where the Government will allow them to do it for only one hour a day, on Fridays, Saturdays and Sundays.
All of the above have risks attached, and they all have negatives, but we are unlikely to say that we want to ban them all—far from it. There is also a different risk: if we take one thing and ban it based on its specific features—its specific definition—we just push people to other places. Other things will then get more social-media characteristics, and children may end up in darker places on the internet. All of that is probably why the Australians ended up where they did: saying that it is probably more about specific functionality and that, at the end, it might be about having to make case-by-case judgments.
We worry about content; unwanted, inappropriate contact, as others have said; the excessive time children spend on platforms; potential addiction; the effects on sleep and concentration; and myopia. Crucially—my hon. Friend the Member for Reigate (Rebecca Paul) covered this very well—these technologies can also crowd out other things. Whether they, in and of themselves, are good or bad, there are only 24 hours in a day, and we want children, in the time they are not at school and not asleep, to be able to access the full range of things that childhood should be all about.
There is no defined time limit at the moment, but I did suggest that people take about six minutes. I presume that the right hon. Gentleman is bringing his remarks to a conclusion.
I confessed myself a sinner at the start, Mr Stringer, and I will now come to a close.
In the Online Safety Act, we covered a lot regarding content and contact, but we need to do more on the issues of time and addiction, and I am pleased to see some of that in the work of the hon. Member for Whitehaven and Workington (Josh MacAlister). In the meantime, as others have said, we also need to do more on parental controls. I would like to see NHS advice to parents, which can be very powerful, on what an appropriate amount of time would be for children. We also need to enforce the existing age limits, particularly the one at age 13, and to recognise that some people who falsely claimed they were 13 when they were eight, nine or 10 are now showing up on social media lists as being over 18, when, in fact, they are still in their much earlier teens.
There are four or five different areas where the legislation is not sufficient for the task. Both codes require parliamentary approval, but that process will happen in the next few weeks, with the powers coming into effect this spring. As a Government, we have to decide whether it is better to make that happen now and bed it in, or say that we will have another piece of legislation. I am not allowed to make commitments on behalf of the Government, but I would be absolutely amazed if they did not bring forward further legislation in this field in the next few years. All these issues—and the others that will come along—will definitely need to be addressed, not least because, as my hon. and learned Friend the Member for Folkestone and Hythe said at the beginning of the debate, we need to make sure that the legislation is up to date.
My hon. Friend the Member for Whitehaven and Workington (Josh MacAlister) talked about the burden of proof, and he is quite right. Of course there should not be a one-way burden of proof. We have to bear in mind two things about proof—perhaps evidence is a better word, because it is not about criminality; it is about evidence-based policy. The first is that, as everybody has said, correlation is not causation. I apologise for the slightly flippant way of putting this, but Marathon became Snickers at the same time as Mrs Thatcher gave way to John Major. I am not aware of any causal relationship between those two events. Many people understand that, but it is often very difficult to weed out what is causation and what is correlation in a specific set of events. For instance, we have all laid out the problems in relation to mental health for children, but only one Member mentioned covid. I would argue that covid is quite a significant player. It was shocking that we strove hard as a Parliament to open pubs again before we opened schools, and that children, who were at the least risk, bore the heaviest burden and made that sacrifice on behalf of others. I think we need to factor that in.
The second point is something that I have campaigned on for quite a long time: acquired brain injury. Children from poorer backgrounds are four times more likely to suffer a brain injury under the age of five than kids from wealthier backgrounds, and again in their teenage years. Acquired brain injury in schools is barely recognised. Some schools respond to it remarkably well, but it is likely that there are somewhere between one and three children with a brain injury in every single primary class in this land. Nobody has yet done sufficient work on how much that has contributed to the mental health problems that children have today. We certainly know that the use of phones and screens after brain injury is a significant added factor, but we need to look at all the factors that affect the mental health of children to ensure that we target the specific things that really will work in a combination of policy changes.
The Minister speaks with a great deal of knowledge and authority, particularly on acquired brain injury, but I want to come back to the covid point. Obviously, a Westminster Hall debate is not the place to establish correlation versus causality in any sense. However, if we look at a graph of what has happened with children’s and young people’s mental ill health in this country, France, Germany and the United States, while the data are not perfectly comparable, the shape of the line is not consistent with the hypothesis that it is mainly the result of covid. It predates covid, and it carries on going up afterwards.
I am sorry if I indicated that it was mainly covid; I was not trying to say that at all. I am simply saying that that is one factor, and there may be many others—social factors, personal factors and the structure of education. One could argue, as one of my hon. Friends did, that there are other things that kids could do in society. We might, for instance, want to intervene by having a creative education option. We hardly have a youth service in most of the country any more.
The Minister would be making perfectly adequate points if we were talking only about this country. We could make all sorts of points about what Government policy was and what happened to Sure Start, the curriculum and youth clubs, but those things did not happen in France, Germany or the United States.
I have not seen any of the statistics for what has happened to youth services and the cultural education offer in schools in France and other countries.
No, I do not. I am trying to make a very simple point: many factors have contributed to the mental health problems that many young people have, and social media is undoubtedly one. The question is, how do we rate and address all those different factors? As the hon. Member for Harpenden and Berkhamsted (Victoria Collins) said, we must address this from a public health angle, and that is essential. But then, when we have the whole bag of evidence, rather than just individual bits of evidence, the question is, what is the most useful intervention that we can make?
I want to come on to the definition issues. Several Members raised the issue of what social media is. That is partially addressed by the Online Safety Act, but we may want to go further. As to the reason why the previous Government landed on 13 rather than 16—which was an option available to them—the consultation at the time came back with 13. It is interesting that Members referred to content availability and to there being two ages in the UK that are generally reckoned to be part of the age of majority: 16 and 18. Actually, for film classification, it is 12 and 15. There is an argument for saying that we ought to look at film classification because it is long established and—although the issues are different in many regards—some of them are similar. We might want to learn from that—I say this from my Department for Culture, Media and Sport angle—to inform the debate on this matter.
On enforcement, several Members referred to the fact that there is no point in just changing the law; if we do that but have no form of enforcement, that is worse than useless. That is one of the Government’s anxieties, and we need to make sure that the enforcement process works properly. I take the point that there are two areas where Ofcom feels it is unable to act, because the law does not allow it to do so, and we will need to look at that. That is why we are keen to get to the moment in April when the two codes will be voted on in Parliament. We will then make sure that Ofcom has not just the powers but the ability to enforce. As my hon. Friend the Member for Congleton (Mrs Russell) said, Ofcom has the power to fine up to £18 million, or 10% of qualifying worldwide revenue in the relevant year, which could be a substantial amount, but it needs to ensure it is in a legally effective position to do so.
My final point is that the Secretary of State has made it clear that nothing is off the table. We are keen to act in this space. The question is, how do we act most proportionately and effectively in a way that tackles the real problem? Some of that is about how the evidence stacks up, and some of it is about when the right time to legislate is. But, as I said earlier, I do not think for a single instant that this debate or the Online Safety Act will be the end of the story. I would be amazed if there were not further legislation, in some shape or other, in this field in the next two or three years.
With that, I once again thank Kim Campbell for bringing the petition to us, and I thank my hon. and learned Friend the Member for Folkestone and Hythe for introducing the debate on behalf of the Petitions Committee.
(2 months, 1 week ago)
Westminster Hall
It is a pleasure to see you in the Chair, Mr Twigg.
I join colleagues in thanking the petitioners, and Ellen Roome in particular, for initiating the petition and enabling this Westminster Hall debate. We were all deeply affected by hearing the statement that was just read out. Ellen, you have the sympathies of everybody here on the loss of Jools aged just 14. We think also of other bereaved families and other campaigners—in the last few days we have been reminded of Ian Russell and the work he has done since the tragic death of Molly—and all those who take the most unimaginably awful situation for a parent and a family and use it to try to make something better for others for the future.
The Government’s response to the petition notes not only that, under the Online Safety Act, platforms have to set out their policy for dealing with such tragic situations, but that the Act
“introduces measures to strengthen coroners’ ability to obtain information”
from platforms via Ofcom, thereby providing a route for parents. We will have to see how that works in practice and how timely it is. What we must not do is put a new, onerous layer on top of parents at the most difficult time imaginable, as they are grieving.
As has been mentioned, there is also the question of historic cases. There will be future historic cases, because not in every case will the inquest have covered this question. I hope the Minister will be able to say a word about whether the data Bill is the opportunity to put it beyond doubt that, ultimately, the parent has an absolute right, with the right safeguards and verifications, to see the information related to their child.
Let me turn from the most tragic of cases to all families and all children. I start with the most important point, which is that trust, support and love within families are the most effective things. Most of the time it is irrelevant what the law is because, within families, we set our own rules. Generally, it is clear that even if our rules are, at times, a pain for our children, they are well-intentioned. We must also note that not quite all families are loving families. Some parents are abusive, and children must always have ways to seek help confidentially from child protection services, the police, the health service and bona fide charities. That applies at any age.
It is also true that everyone needs a degree of privacy, but there have always been different degrees of privacy, and how private something is should be proportionate to the level of risk involved. In discussing accessing online services, we are talking about things that can have very serious consequences. We want and need to be able to protect our children from harm—from bullying, from unwanted contact, including from adults, and from being drawn to dangerous interests, which can become dangerous obsessions. We also have a responsibility—and should be held responsible—for ensuring that they do not perpetrate harms on others. Although we trust our children, we know that children do sometimes get into trouble and can come under pressure, and in some cases severe coercion, from others. Of course, they potentially have ready access to material of all sorts that is much more harmful than we had as children. They can go deeper and deeper down rabbit holes.
Parents are not the only ones who can help children, but they have a unique position in children’s lives and are uniquely placed to help and support them. That is why I agree in principle with the petitioner that parents should have a right to see what their child is subjected to or is doing for as long as they are a child and we, as the parents, are responsible for them—and that means at least until age 16. There is a separate debate to be had about the extent of that, and what the threshold and process should be. I understand entirely what the hon. Member for Sunderland Central (Lewis Atkinson) was saying. I do not think anybody is proposing constant, ongoing monitoring, but there are situations that a child could find themselves in that I believe warrant the availability of that access.
There is also a problem, or a hurdle, with the principle: we can only request access to something that we know exists. It is common for children to have multiple social media accounts on a single platform. They probably have different names these days, but people used to call their fake and real accounts finsta and rinsta. The account their mum sees is not necessarily the real one—ironically, the one that was called “fake” was the one where their real lives were actually happening. Of course, they could also be on lots of other platforms that parents and others do not necessarily know about.
I agree with the hon. Member for Sunderland Central, who opened the debate on behalf of the Petitions Committee, that it is of paramount importance that we are able to put some guardrails around what children can access. That is one of the reasons we have parental controls. How those controls work, and the limits of them, are what I want to talk about this afternoon.
I will read out a short note from Microsoft, which is not a company that people normally worry about—it is a very responsible operator—to a constituent ahead of their child’s 13th birthday. It says:
“Congratulations on Fred’s birthday. At this age, certain laws allow them to have more control and choices over their own account settings. This means that they’ll be able to change a number of family safety settings, even if you already have them set up. Fred will also need to allow you to continue receiving data about their activities to guide their digital journey. They can turn off your ability to see their activity on Windows, Xbox, and Android devices. They can turn off your ability to see their devices and check on updates…safety settings like firewall and antivirus…They can stop sharing their location through their mobile phone.”
That was for a child approaching their 13th birthday, which leads me to question what “certain laws” are being cited. I can only assume it is the Data Protection Act 2018, which sets out that
“a child aged 13 years or older”
can
“consent to his or her personal data being processed by providers of information society services.”
The genesis of that was European law, and Parliament was debating and voting on it in parallel with, but before actually completing, exit from the European Union. The age 13 is not universal. EU law specified a range between 13 and 16, and multiple countries did select 13, but not all. France set the age at 15, with some limited non-contractual consents for data processing allowed between 13 and 15. Germany and the Netherlands set the age at 16. There is that question of what is the appropriate age, but the other big question is what that age actually means.
The 2018 Act was passed before we considered the Online Safety Bill, which became the Online Safety Act 2023, but we were already concerned in this House about online safety, and I am fairly sure that it was not Parliament’s intent to reduce parental oversight. In particular, I do not think saying that a service can have a child sign up to it at 13 is the same as saying that the parent cannot stop them. Still less, it is not the same as saying that the parent should not be able to know what their child is signed up to.
In setting out why the age was set at 13, the explanatory notes to the 2018 Act say, quite rightly, that that is in line with the minimum age that popular services such as Facebook, WhatsApp and Instagram set, but they go on to say, slightly unrelatedly:
“This means children aged 13 and above would not need to seek consent from a guardian when accessing, for example…services which provide educational websites and research resources to complete their homework.”
I think that sentence might have a lot to answer for. It sounds very sensible—we would not want children having to get over hurdles to finish their homework—but if we think about it, it is not necessary to sign up to research something on the internet for homework anyway, and educational websites are generally exempt from consent requirements. But the big question is, what else might it allow—or, crucially, what else might it be interpreted to allow?
I repeat that I do not believe that it was Parliament’s intent in effect to disable parental safety controls for 13, 14 and 15-year-olds. There is a whole other question about those safety controls themselves and how they work, and how difficult it can be for parents—and even all of us, who tend to think we are quite good at this sort of thing—to keep on top of them, particularly if they have multiple children, different operating systems and multiple platforms. There really should be a single industry standard entry system that can cover all of screen time and basic, entry-level approvals with a default “safety on” version of the different platforms.
We talk about age thresholds and age limits; there is a whole other set of questions about how those apply and how we make age assurance or age verification work properly. Those are both debates for another day. Today, I simply ask the Minister: is it the Government’s understanding of the existing legislation that children under 16 should be able to switch off parental controls? If not, what could be done to clarify the situation? Is a change needed in primary legislation?
I will come to that point.
On the issue of a ban on smartphones and social media for under-16s, we are focused on building the evidence base to inform any future action. We have launched a research project looking at the links between social media and children’s wellbeing. I heard from the hon. Member for Esher and Walton (Monica Harding) that that needs to come forward and I will pass that on to my colleagues in the Department.
My hon. Friend the Member for Lowestoft (Jess Asato) mentioned the private Member’s Bill in the name of my hon. Friend the Member for Whitehaven and Workington (Josh MacAlister). We are aware of his Bill and share his commitment to keeping children safe online. We are aware of the ongoing discussion around children’s social media and smartphone use, and it is important that we allocate sufficient time to properly debate the issue. We are focused on implementing the Online Safety Act and building the evidence base to inform any future action. Of course, we look forward to seeing the detail of my hon. Friend’s proposal and the Government will set out their position on that in line with the parliamentary process.
My hon. Friend the Member for Darlington (Lola McEvoy) raised the issue of Ofcom’s ambitions. Ofcom has said that its codes will be iterative, and the Secretary of State’s statement will outline clear objectives for it to require services to improve safety for their users.
The hon. Member for Twickenham (Munira Wilson) and my hon. Friend the Member for Bournemouth West (Jessica Toale) mentioned engagement with children, and we know how important that is. Ofcom engaged with thousands of children when developing its codes, and the Children’s Commissioner is a statutory consultee on those codes, but of course we must do more.
The hon. Member for Huntingdon (Ben Obese-Jecty) raised the matter of mental health services and our commitment in that regard. He is right that the Government’s manifesto commits to rolling out Young Futures hubs. That national network is expected to bring local services together to deliver support for not only teenagers at risk of being drawn into crime, but those facing mental health challenges, and, where appropriate, to deliver universal youth provision. As he rightly said, that is within the health portfolio, but I am happy to write to him with more detail on where the programme is.
We want to empower parents to keep their children safe online. We must also protect children’s right to express themselves freely, and safeguard their dignity and autonomy online.
The Minister spoke earlier about age limits. I was not sure if she had finished responding to Members’ comments and questions, and whether she would be able to comment on not only what the various age thresholds should be, but what they mean. In particular, if the GDPR age is 13, does that mean that parental controls can effectively be switched off by somebody of age 13, 14 or 15?
I am sure the right hon. Gentleman’s party would have discussed the issue of the age limit and why it was 13 during the passage of the Online Safety Act.
I am more than happy to write to him in detail on why the age limit has been set at 13. As I said, there is currently a live discussion about raising the age and evidence is being collated.
The challenge of keeping our children safe in a fast-moving world is one that we all—Government, social media platforms, parents and society at large—share. As we try to find the solutions, we are committed to working together and continuing conversations around access to data in the event of the tragic death of a child.
I will finish by again thanking Ellen for her tireless campaigning. I also thank all the speakers for their thoughtful contributions. I know that Ellen has waited a long time for change and we still have a long way to go. Working with Ellen, the Bereaved Families for Online Safety group, other parents and civil society organisations, we will build a better online world for our children.
(3 months, 3 weeks ago)
Westminster Hall
I give way to my right hon. Friend the Member for East Hampshire (Damian Hinds).
My hon. Friend is right to identify the progress made in constituencies like his, Farnham and Bordon, or mine, East Hampshire. Does he agree, however, that improvement is all the more urgent and important in the most rural areas, where there is already very poor or no mobile signal and very poor broadband speed? They are not on the list for the commercial gigabit roll-out and some are not on the list for the second tier of gigabit roll-out. On top of all that, they hear the announcement that the PSTN—the public switched telephone network—is going to be switched off. In the event of an emergency, in the event of a power cut, they are in danger of being marooned.
My right hon. Friend makes an extraordinarily prescient point. That is a combination of factors that will leave many in rural areas, especially those who are elderly or have other caring needs, at a real disadvantage. That is why it is so essential to turbocharge this roll-out going forward.
It is a pleasure, as always, to serve under your chairmanship, Mr Dowd.
I congratulate the hon. Member for Farnham and Bordon (Gregory Stafford) on securing this debate—although the mention of Liphook in his speech confused me, as I was always under the impression that my aunt’s MP was the right hon. Member for East Hampshire (Damian Hinds)—
Clearly, I have got something wrong.
I want to address a few of the issues experienced in my constituency. As many Members here today will be well aware, it is the largest constituency in England, taking in large parts of Northumberland, going all the way up to the Scottish border and all the way across to the border with Cumbria. I am regularly contacted by constituents who are trying to enjoy the dream situation of living in England’s most beautiful county, but who are unable properly to work, attend meetings with clients or generate the economic growth that this country so sorely needs.
I speak to people in villages such as Stocksfield, Riding Mill, Hedley on the Hill or even Darras Hall, who I know have had frequent issues with getting the appropriate broadband speeds delivered to them. Residents of those villages are continuously working to try to get the broadband speeds that they deserve.
There is a real feeling that for the past 14 years many rural communities were left to sit in splendid isolation, abandoned by the Conservative party. As the hon. Member for Farnham and Bordon eloquently said in his opening remarks, they were left to fend for themselves.
I am absolutely proud to be part of the Labour party that won a swathe of rural seats at the last general election, that is committed to ensuring our rural businesses can grow, and that can ensure that the world-class businesses across my constituency are able not only to access high-speed internet, but to do so in the very smallest communities. When I go out and meet constituents across the north Tyne area, internet is one of the bugbears most commonly raised with me on the doorstep, alongside a lack of housing and the state of the NHS. I hope the Minister will consider how we can get high-speed internet to those most rural constituencies and the hill farms that the hon. Member for Westmorland and Lonsdale (Tim Farron) mentioned, to ensure that they are given the opportunity to benefit from Project Gigabit.