Online Safety Bill Debate
Lord Russell of Liverpool (Crossbench - Excepted Hereditary)
Debate with the Department for Digital, Culture, Media & Sport
Lords Chamber
My Lords, as I listen to the words echoing around the Chamber, I try to put myself in the shoes of parents or children who, in one way or another, have suffered as a result of exposure to things happening online. Essentially, the world that we are talking about has been allowed to grow like Topsy, largely unregulated, at a global level and at a furious pace, and that is still happening as we do this. The horses have not just bolted the stable; they are out of sight and across the ocean. We are talking about controlling and understanding an environment that is moving so quickly that, however fast we move, we will be behind it. Whatever mousetraps we put in place to try to protect children, we know there are going to be loopholes, not least because children individually are probably smarter than we are collectively at knowing how to get around well-meaning safeguards.
There are ways of testing what is happening. Certain organisations have used what they term avatars: you create fictional profiles of children, clearly labelled as such, let them loose in the online world in various directions on various platforms, and observe what happens. The results of the tests that have been done—we will go into this in more detail on Thursday when we talk about safety by design—are pretty eye-watering. The speed with which these avatars, despite being openly labelled as children's profiles, are deluged with a variety of content that should be nowhere near children is dramatic, and the technique is incredibly effective at exposing it.
I put it to the Minister and the Bill team that one of the challenges for Ofcom will be not to be so far behind the curve that it is always trying to catch up. It is like being a surfer: if you are going to keep going then you have to keep on the front side of the wave. The minute you fall behind it, you are never going to catch up. I fear that, however well-intentioned so much of the Bill is, unless and until His Majesty’s Government and Ofcom recognise that we are probably already slightly behind the crest of the wave, whatever we try to do and whatever safeguards we put in place are not necessarily going to work.
One way we can try to make what we do more effective is the clever, forensic use of approaches such as avatars, not least because I suspect their efficacy will be dramatically increased by the advent and use of AI.
Tim Cook, the CEO of Apple, put it very well:
“Kids are born digital, they’re digital kids now … And it is, I think, really important to set some hard rails around it”.
The truth is that in the area of app stores, Google and Apple, which, as we have heard, have a more than 95% share of the market, are just not voluntarily upholding their responsibilities in making the UK a safe place for children online. There is an air of exceptionalism about the way they behave that suggests they think the digital world is somehow different from the real world. I do not accept that, which is why I support the amendments in the name of my noble friend Lady Harding and others—Amendments 19, 22, 298, 299 and other connected amendments.
There are major holes in the app stores’ child safety measures, which mean that young teens can access adult apps offering dating, random chats, casual sex and gambling, even when Apple and Google know full well that the user is a minor. I will give an example. Using an Apple ID for a simulated 14 year-old, the Tech Transparency Project looked at 80 apps in the App Store that are theoretically limited to those aged 17 and older. It found that underage users could very easily evade the age restrictions in the vast majority of cases: a dating app that opens directly into pornography before ever asking the user’s age; adult chat apps filled with explicit images that never ask the user’s age; and a gambling app that lets the minor’s account deposit and withdraw money.
What kind of apps are we talking about here? We are talking about apps such as UberHoney; Eros, the hook-up and adult chat app; Hahanono—Chat & Get Naughty; and Cash Clash Games: Win Money. The investigation found that Apple and the app developers essentially pass the buck to each other when it comes to blocking underage users, making it easy for young teens to slip through the system. My day-to-day experience as a parent of four children completely echoes that investigation, and it is clear to me that Apple and Google just do not share age data with the apps in their app stores, or else children would not be able to download those apps.
There is a wilful blindness to minors tweaking their age. Parental controls on mobile phones are, to put it politely, a joke. It takes a child a matter of minutes to circumvent them—I know from my own experience—and I have wasted many hours fruitlessly trying to manage these arrangements. That is just not good enough for any business. It is not good enough because so many teenagers have mobile phones, as we discussed—two-thirds of children have a smartphone by the age of 10. Moreover, it is not good enough because they are accessing huge amounts of filthy content, dodgy services and predatory adults—things that would never be allowed in the real world. The Office of the Children’s Commissioner for England revealed that one in 10 children had viewed pornography by the time they were nine years old. The impact on their lives is profound: just read the testimony of parents on recent Mumsnet forums about the awful impact of pornography on their children’s lives.
To prevent minors from accessing adult-only apps, the most efficient measure would be, as my noble friend Lady Harding pointed out, to check users’ ages during the distribution step, which means directly in the app store or on the web browser, prior to the app store or the internet browser initiating the app or the platform download. This can be done without the developer knowing the user’s specific age. Developing a reliable age-verification regime applied at that “distribution layer” of the internet supply chain would significantly advance the UK’s objective of creating a safer online experience and set a precedent that Governments around the world could follow. It would apply real-world principles to the internet.
This would not absolve any developer, app or platform of their responsibilities under existing legislation—not at all; it would build on them. It would simply mandate that every player in the ecosystem, right from the app store distribution layer, was legally obliged to promote a safer experience online. That is completely consistent with the principles and aims of the Online Safety Bill.
These amendments would subject two of the biggest tech corporations to the same duties regarding their app stores as we impose on the wider digital ecosystem and the real world. It is all about age assurance and protecting children. To the noble Lord, Lord Allan, I say that I cannot understand why my corner shop requires proof of age to sell cigarettes, pornography or booze, but Apple and Google think it is okay to sell apps with inappropriate content and services without proper age-verification measures and with systems that are wilfully unreliable.
There is a tremendous amount that is very good about Tim Cook’s commitment to privacy and his objections to the data industrial complex; but in this matter of the app stores, the big tech companies have had a blind spot to child safety for decades and a feeling of exceptionalism that is just no longer relevant. These amendments are an important step in requiring that app store owners step up to their responsibilities and that we apply the same standards to shopkeepers in the digital world as we would to shopkeepers in the real world.