Commons Chamber
Emily Darlington
I absolutely agree. Young people, particularly those in their mid-teenage years, understand this issue in a way that sometimes we do not because, quite frankly, our online experience is completely different from theirs. If Members want to test that, they should open an app such as Pinterest and compare what is fed into their Pinterest boards with their child’s Pinterest boards. It is a completely different experience. If Members do not have children, they should ask a younger member of staff to open the same app on their different phones, and they will see a completely different world.
A local organisation in my constituency, CyberSafe Scotland, surveyed children about what they were being fed on TikTok. There is a road in my constituency called North Anderson Drive, and children on one side of North Anderson Drive were being fed different content to the children on the other side of it. It is not just an age thing; it is really specific, and we cannot understand what each individual person is seeing because it is different for everybody.
Emily Darlington
That is a very important point about how sophisticated the technology has become. When we ask companies to take action to stop harmful outcomes, the technology exists to do that. We are not asking them to reinvent the wheel or come up with new technology. It already exists, because they are even microtargeting two different sides of the same road.
Having discussed this with experts, parents and—most importantly—young people, what do I think we need to consider? First, we need to fully and properly implement the Online Safety Act 2023. That must be done at speed, and it requires nothing from the House. It has been a request of the Secretary of State and the Minister, and I recommend that Ofcom gets on and does that as quickly as possible. We must make safe spaces for children online. How do we do that? Part of the answer is ensuring that content is tied to ratings that we already understand as parents, such as those from the British Board of Film Classification. I have been asking YouTube for about a year now what rating YouTube Kids has. Is it rated U? Is it 12A? Is it 15? It cannot tell me, because it does not do things on that basis.
As a parent, I want to know the rating before allowing my children on an app, because parents have a role in this as well. All apps should be rated like video games. Roblox has a 5+ rating, which does not exist in video game ratings. We see ratings such as 4+ or 9+, but those are made up. At the parents’ forum that I held after the survey, one parent said that she walked in on her nine-year-old playing “guns versus knives”—on an app that is rated 5+. The ratings on apps mean nothing, yet we have video game ratings that we as parents understand, so why are they not used? Should in-app purchases ever be allowed for young children? At what age should in-app purchases be allowed in a game?
We must consider the time limits for the different stages of brain development. We have guides on fruit and vegetables that recommend five a day to parents. We all know that. Schools use the same language, we use the same language, yet we have nothing to support parents in deciding how long a child should be online at different stages of brain development. I hope that the evidence that the Science, Innovation and Technology Committee collects will help inform that.
We need to change addictive and radicalising platform algorithms. To protect children from child sexual abuse images, we need to talk to those behind iOS and Android to stop the creation of self-generated child sexual abuse images—some 70% to 80% of such images are self-generated—and we need to stop end-to-end encrypted services from sharing them. We have technology that can do that. We should always keep the ability to ban in our back pocket, but any ban should be for particular apps. We should not ban our children and young people from having an online experience that is good.