Online Safety Bill Debate
Lord Storey (Liberal Democrat - Life peer)
(1 year, 10 months ago)
Lords Chamber

My Lords, I speak in this Second Reading debate with little detailed knowledge of the digital world. I will probably be taking up my noble friend Lord Allan’s offer. I am not on Facebook, TikTok, Instagram or Snapchat; I have occasionally dabbled on Twitter. What I do have is 40-plus years’ experience as a teacher and head teacher. I have seen first-hand how children can have their lives turned upside down and how they have been physically and emotionally scarred by the effects of social media and the online world.
Yesterday, we heard from a study by the Children’s Commissioner for England how children as young as nine are being exposed to online pornography; how a quarter of 16 to 21 year-olds saw pornography while still at primary school; and how, by the age of 13, 50% had been exposed to it. You might say, “So what?” Do we want to hear that 79% of 18 to 21 year-olds have seen pornography involving sexual violence while they were still children? Do we want to hear that a 12 year-old boy had strangled a girl during a kiss because he thought that was normal? Do we want to hear that half of young people say girls expect sex to involve physical aggression? This all comes, by the way, from the Children’s Commissioner’s report.
The Online Safety Bill, as we have heard, has been a long time coming. The Government’s aim in introducing the Bill is to make Britain the best place in the world to set up and run a digital business, while simultaneously ensuring that Britain is the safest place in the world to be online. But does the Bill really achieve that for children? Childhood is about loving and learning. It is about innocence and enjoying the wonders of life. It is not about having that innocence and wonder shattered by some perverse online content.
My interest in this Bill is how we as a society can restore childhood to our children. The Bill, as the noble Baroness, Lady Kidron, said, must cite the UN Convention on the Rights of the Child, and General Comment 25 on children’s rights in relation to the digital environment. Citing this in the Bill would mean that regulated services would have regard to children’s existing rights. The limited scope of the Bill means that, as the 5Rights Foundation points out, children will still be exposed to harmful systems and processes, including blogs and websites that promote and encourage disordered eating, online games which promote violence, financial harms such as gambling, and parts of the metaverse which have yet to be developed. The Bill will not be future-proofed. Regulating only certain services means that online environments and services which are not yet built or developed are likely not to be subject to safety duties, which will quickly make the Bill out of date.
Turning to age verification, as a teacher it always worries me that children as young as seven or eight are on Facebook. In fact, 60% of UK children aged eight to 12 have a profile on at least one social media service. Almost half of children aged eight to 15 with a social media profile have a user age of 16 plus, and 32% of children aged eight to 17 have a user age of 18. Without age assurance, children cannot be given the protections needed to have an age-appropriate experience online. Some 90% of parents think that social media platforms should enforce minimum age requirements. We should do whatever we can to protect children from harm. The Bill will establish different types of content which could be harmful to children:
“primary priority content that is harmful to children … ‘priority content that is harmful to children’ and ‘content that is harmful to children’”.
I say that any content that is harmful to children should be dealt with.
As the noble Lord, Lord Hastings, has said, media literacy is hugely important to this Bill and should be included. Media literacy allows children to question the intent of media and protect themselves from negative impacts, be it fake news, media bias, mental health concerns or internet and media access. Media literacy helps children and young people safely consume the digital world. I was a bit disappointed that the noble Lord, Lord Hastings, did not ask what a Liberal Government would do, but I can tell him that we would be dealing with this issue.
Yesterday, the Princess of Wales launched a campaign to highlight the importance of childhood. Children need to enjoy their childhood and grow up in a supportive, caring environment. They need good role models, not influencers. Children are very vulnerable, innocent and susceptible. We must do all in our power to ensure that online is a safe place for them, and to be able to say to the daughter of the noble Baroness, Lady Harding, that we did finally do something about it.
Online Safety Bill Debate
(1 year, 8 months ago)
Lords Chamber

Tim Cook, the CEO of Apple, put it very well:
“Kids are born digital, they’re digital kids now … And it is, I think, really important to set some hard rails around it”.
The truth is that in the area of app stores, Google and Apple, which, as we have heard, have a more than 95% share of the market, are just not voluntarily upholding their responsibilities in making the UK a safe place for children online. There is an air of exceptionalism about the way they behave that suggests they think the digital world is somehow different from the real world. I do not accept that, which is why I support the amendments in the name of my noble friend Lady Harding and others—Amendments 19, 22, 298, 299 and other connected amendments.
There are major holes in the app stores’ child safety measures, which mean that young teens can access adult apps that offer dating, random chats, casual sex and gambling, even when Apple and Google emphatically know that the user is a minor. I will give an example. Using an Apple ID for a simulated 14 year-old, the Tech Transparency Project looked at 80 apps in the App Store that are theoretically limited to 17 and older. It found that underage users could very easily evade age restrictions in the vast majority of cases. There is a dating app that opens directly into pornography before ever asking the user’s age; adult chat apps filled with explicit images that never ask the user’s age; and a gambling app that lets the minor account deposit and withdraw money.
What kind of apps are we talking about here? We are talking about apps such as UberHoney; Eros, the hook-up and adult chat app; Hahanono—Chat & Get Naughty; and Cash Clash Games: Win Money. The investigation found that Apple and the apps essentially pass the buck to each other when it comes to blocking underage users, making it easy for young teens to slip through the system. My day-to-day experience as a parent of four children completely echoes that investigation, and it is clear to me that Apple and Google just do not share age data with the apps in their app stores, or else children would not be able to download those apps.
There is a wilful blindness to minors tweaking their age. Parental controls on mobile phones are, to put it politely, a joke. It takes a child a matter of minutes to circumvent them—I know from my experience—and I have wasted many hours fruitlessly trying to control these arrangements. That is just not good enough for any business. It is not good enough because so many teenagers have mobile phones, as we discussed—two-thirds of children have a smartphone by the age of 10. Moreover, it is not good enough because they are accessing huge amounts of filthy content, dodgy services and predatory adults, things that would never be allowed in the real world. The Office of the Children’s Commissioner for England revealed that one in 10 children had viewed pornography by the time they were nine years old. The impact on their lives is profound: just read the testimony on the recent Mumsnet forums about the awful impact of pornography on their children’s lives.
To prevent minors from accessing adult-only apps, the most efficient measure would be, as my noble friend Lady Harding pointed out, to check users’ ages during the distribution step, which means directly in the app store or on the web browser, prior to the app store or the internet browser initiating the app or the platform download. This can be done without the developer knowing the user’s specific age. Developing a reliable age-verification regime applied at that “distribution layer” of the internet supply chain would significantly advance the UK’s objective of creating a safer online experience and set a precedent that Governments around the world could follow. It would apply real-world principles to the internet.
This would not absolve any developer, app or platform of their responsibilities under existing legislation—not at all: it would build on that. Instead, it would simply mandate that every player in the ecosystem, right from the app store distribution layer, was legally obliged to promote a safer experience online. That is completely consistent with the principles and aims of the Online Safety Bill.
These amendments would subject two of the biggest tech corporations to the same duties regarding their app stores as we do the wider digital ecosystem and the real world. It is all about age assurance and protecting children. To the noble Lord, Lord Allan, I say that I cannot understand why my corner shop requires proof of age to buy cigarettes, pornography or booze, but Apple and Google think it is okay to sell apps with inappropriate content and services without proper age-verification measures and with systems that are wilfully unreliable.
There is a tremendous amount that is very good about Tim Cook’s commitment to privacy and his objections to the data industrial complex; but in this matter of the app stores, the big tech companies have had a blind spot to child safety for decades and a feeling of exceptionalism that is just no longer relevant. These amendments are an important step in requiring that app store owners step up to their responsibilities and that we apply the same standards to shopkeepers in the digital world as we would to shopkeepers in the real world.
My Lords, I enter this Committee debate with great trepidation. I do not have the knowledge and expertise of many of your Lordships, who I have listened to with great interest. What I do have is experience of working with children for over 40 years, and as a parent myself. I want to make what are perhaps some innocent remarks.
I was glad that the right reverend Prelate the Bishop of Oxford raised the issue of online gaming. I should perhaps declare an interest, in that I think Liverpool is the third-largest centre of online gaming in terms of developing those games. It is interesting to note that over 40% of the entertainment industry’s global revenue comes from gaming, and it is steadily growing year on year.
If I am an innocent or struggle with some of these issues, imagine how parents must feel when they try to cope every single day. I suppose that the only support they currently have, other than their own common sense of course, is rating verifications or parental controls. Even the age ratings confuse them, because there are different ratings for different situations. We know that films are rated by the British Board of Film Classification, which also rates Netflix and now Amazon. But it does not rate Disney, which has its own ratings system.
We also know that the gaming industry has a different ratings system: the PEGI system, which has a number linked to an age. For example, a PEGI 16 rating, if a parent knew this, is required when the depiction of violence or sexual activity reaches a stage where it looks realistic. The PEGI system also has pictures showing this.
Thanks to the Video Recordings Act 1984, the PEGI 12, PEGI 16 and PEGI 18 ratings became legally enforceable in the UK, meaning that retailers cannot sell those video games to those below those ages. If a child or young person goes in, they could not be sold those games. However, the Video Recordings Act does not currently apply to online games, meaning that children’s safety in online gaming relies primarily on parents setting up parental controls.
I will listen with great interest to the tussles between various learned Lords, as all these issues show to me that perhaps the most important issue will come several Committee days down the path, when we talk about media literacy. That is because it is not just about enforcement, regulation or ratings; it is about making sure that parents have the understanding and the capacity. Let us not forget this about young people: noble Lords have talked about them all having a phone and wanting to go on pornographic sites, but I do not think that is the case at all. Often, young people, because of peer pressure and because of their innocence, are drawn into unwise situations. Then there are the risks that gaming can lead to: for example, gaming addiction was mentioned by the right reverend Prelate the Bishop of Oxford. There is also the health impact and maybe a link with violent behaviour. There is the interactive nature of video games, cyberbullying and the lack of a feeling of well-being. All these things can happen, which is why we need media literacy to ensure that young people know of those risks and how to cope with them.
The other thing that we perhaps need to look at is standardising some of the simple gateposts that we currently have, hence the amendment.
My Lords, it is a pleasure to follow the noble Lord, Lord Storey. I support Amendments 19, 22 and so on in the name of my noble friend Lady Harding, on app stores. She set it out so comprehensively that I am not sure there is much I can add. I simply want to thank her for her patience as she led me through the technical arguments.