Ban smartphones and camera phones for under 16s

I have many concerns regarding the use by children under 16 of mobile phones that can access photographs and social media. I believe there are too many safeguarding risks, including exploitation, cyberbullying and group trolling, that can damage children's mental wellbeing.

This petition closed on 30 May 2024 with 25,981 signatures


I would like an age restriction on phones for children up to the age of 16. Phones that can only text or call can be allowed for children but smartphones and camera phones should be age restricted to avoid harming children.

[Chart: Petition signatures over time]

Government Response

Tuesday 30th April 2024

The government is committed to making the UK the safest place to be a child online, as evidenced by the Online Safety Act. We are focused on implementing the regime as soon as possible.

I would like to thank all those who signed the petition and share our commitment to keeping children safe online. While most children have a positive experience online, using the internet to connect with peers and to access educational resources, information, and entertainment, I share concerns about the impact of harmful and age-inappropriate content and activity online, which can be particularly damaging for children.

It is important that we strike the right balance by protecting children from harm whilst also allowing them to benefit from safe internet use. We also recognise that parents will decide what is appropriate for their children. We were proud to pass our landmark Online Safety Act last year which will protect children from harmful content. The government keeps all options under review to keep children safe online and build on the progress of the Online Safety Act. However, we are not considering a complete ban on children under 16 having smartphones as that may impact the ability of children to access some of the benefits of the Internet.

The Online Safety Act instead takes a safety-by-design approach, enabling children to access the benefits of the Internet with far fewer of the dangers. The Act received Royal Assent on 26 October 2023 and we are working closely with Ofcom to ensure the regime is operational as soon as possible.

The Online Safety Act places robust, much-needed responsibilities on technology companies – including social media platforms, search services and other services which host user-generated content – to keep all users, but particularly children, safe online.

All companies in scope will need to take robust steps to protect children from illegal content and criminal behaviour on their services. This includes removing and limiting the spread of illegal content and taking steps to prevent similar material from appearing. Additionally, all services which are likely to be accessed by children will be required to provide safety measures to protect child users from content that is legal but nonetheless presents a risk of harm to children.

User-to-user services, including social media platforms, must prevent children of all ages from encountering 'primary priority' content. Pornographic content, and content that encourages or promotes self-harm, eating disorders or suicide, have all been designated as kinds of 'primary priority' content. Where these services allow 'primary priority' content, they will need to use highly effective age verification or age estimation to ensure children are not able to encounter it. The Act also includes a standalone provision which requires providers who publish pornographic content on their services to prevent children from accessing that content.

In addition, user-to-user and search service providers must provide age-appropriate protections to children from 'priority' content that is harmful to children, such as bullying and content that depicts or encourages serious violence. Finally, providers which have age restrictions need to specify in their terms of service what measures they use to prevent underage access, and apply these terms consistently. This ensures providers can be held to account for what they say in their terms of service and can no longer do nothing to prevent underage access.

The Act will be overseen and enforced by Ofcom. As the independent regulator, Ofcom will set out in codes of practice the steps that providers can take to comply with their duties. Ofcom will also have a range of enforcement powers, including substantial fines and, where appropriate, business disruption measures (such as blocking).

In addition to the work on the Online Safety Act, the government has also recently published guidance on the use of mobile phones in schools. We know that mobile phones are a distraction to learning for pupils and, if unregulated in classroom settings, lead to significant loss of learning time. That is why the Department for Education is acting on this challenge by strengthening the position on mobile phones – making clear that use should be prohibited throughout the school day. The guidance will provide headteachers with support and advice on how to successfully prohibit mobile phone use, including at break times, to tackle disruptive behaviour and online bullying whilst boosting attention during lessons. If schools fail to implement the new guidance, the government will consider legislating in the future to make the guidance statutory.

The government continues to look at ways that children and other internet users can be kept safe online, to further build on the protections of the Online Safety Act. Our current priority is ensuring that the Act is operational as soon as possible to ensure all children in the UK are provided with a safer online experience.

Department for Science, Innovation and Technology

This is a revised response. The Petitions Committee requested a response which more directly addressed the request of the petition. The original response can be found towards the bottom of the petition page.
