We believe social media companies should be banned from letting children under 16 create social media accounts.
We think this would help:
1. Stop online bullying
2. Stop children being influenced by false posts
3. Stop children seeing content that encourages violence or could otherwise be harmful to their future.
We believe social media is having more of a negative impact on children than a positive one. We think people should be of an age where they can make decisions about their own lives before accessing social media applications. We believe we need to introduce a minimum age of 16 to access social media for the sake of our children’s future, along with their mental and physical health.
Tuesday 17th December 2024
The government is aware of the ongoing debate as to what age children should have smartphones and access to social media. The government is not currently minded to support a ban for children under 16.
I would like to thank all those who signed the petition on this incredibly important issue. It is right that tech companies should take responsibility to ensure their products are safe for UK children, and the Online Safety Act 2023 is a crucial tool in holding them to account for this.
The government is aware of the ongoing debate as to what age children should have smartphones and access to social media; however, the government is not currently minded to support a ban for children under 16. Children face a significant risk of harm online and we understand that families are concerned about their children experiencing online bullying, encountering content that encourages violence, or other content which may be harmful. We will continue to do what is needed to keep children safe online. However, this is a complicated issue. We live in a digital age and must strike the right balance so that children can access the benefits of being online and using smartphones while we continue to put their safety first. Furthermore, we must also protect the right of parents to make decisions about their child’s upbringing.
The current evidence on screen time is mixed: a 2019 systematic review by the UK Chief Medical Officers did not find a causal link between screen-based activities and mental health problems, though some studies have found associations with increased anxiety or depression. Therefore, the government is focused on building the evidence base to inform any future action. Last month, the government commissioned a feasibility study into future research to understand the ongoing impact of smartphones and social media on children, to grow the evidence base in this area.
The government’s priority is working with Ofcom to effectively implement the Online Safety Act 2023, so social media users, especially children, can benefit from the Act’s protections as soon as possible. Additionally, the DSIT Secretary of State has outlined the government’s five online safety priorities through a draft Statement of Strategic Priorities. These priorities, which include safety by design, focus on delivering the safest online experiences for all users, in particular children, through the Act.
The Act puts a range of new duties on social media companies and search services, making them responsible for their users’ safety, with the strongest provisions in the Act for children.
Social media platforms, other user-to-user services and search services that are likely to be accessed by children will have a duty to take steps to prevent children from encountering the most harmful content that has been designated as ‘primary priority’ content. This includes pornography and content that encourages, promotes, or provides instructions for self-harm, eating disorders, or suicide. Online services must also put in place age-appropriate measures to protect children from ‘priority’ content that is harmful to children, including bullying, abusive or hateful content and content that encourages or depicts serious violence. Under the Act, where services have minimum age limits, they must specify how these are enforced and do so consistently. Ofcom’s draft proposals would mean that user-to-user services which do not ban primary priority or priority harmful content should introduce highly effective age checks to prevent children from accessing the entire site or app, or age-restrict areas of the service hosting such harmful content.
Finally, companies in scope of the Act must take steps to protect all users from illegal content and criminal activity on their services.
In her letter to parliamentarians on 17 October, Ofcom’s CEO, Melanie Dawes, stressed that Ofcom’s codes are iterative. The letter also noted that Ofcom is seeking to strike a balance between speed and comprehensiveness with its initial codes and stated that Ofcom is seeking to do more work on minimum age limits in the future.
In addition, the Act updated Ofcom’s statutory duty to promote media literacy. Media literacy can help tackle a wide variety of online safety issues for all internet users, including children. Ofcom is also required to raise awareness of the nature and impact of misinformation and disinformation, and to take steps to build the public's capability in establishing the reliability, accuracy, and authenticity of content found on regulated services. These duties are already in force.
The steps Ofcom has set out will represent a positive shift for how children and young people experience the online world. We expect that Ofcom's finalised Children’s Safety Codes of Practice will come into effect by the summer of 2025. However, the Act is designed so it can keep up with evolving areas, and Ofcom has been clear that the Children’s Codes of Practice will be updated as the evidence base of new and existing online harms grows.
We will continue to work with stakeholders to balance important considerations regarding the safety and privacy of children.
Department for Science, Innovation and Technology
This is a revised response. The Petitions Committee requested a response which more directly addressed the request of the petition. You can find the original response towards the bottom of the petition page (https://petition.parliament.uk/petitions/700086)