Baroness Kidron (CB)

My Lords, I refer the Committee to my interests as put in the register and declared in full at Second Reading. I will speak to Amendment 2 in my name and those of the right reverend Prelate the Bishop of Oxford and the noble Baroness, Lady Harding, to Amendments 3 and 5 in my name, and briefly to Amendments 19, 22, 298 and 299 in the name of the noble Baroness, Lady Harding.

The digital world does not have boundaries in the way that the Bill does. It is an ecosystem of services and products that are interdependent. A user journey is made up of incremental signals, nudges and enticements that mean that, when we use our devices, very often we do not end up where we intended to go. The current scope covers user-to-user, search and commercial porn services, but a blog or website that valorises self-harm and depression or suggests starving yourself to death is still exempt because it has limited functionality. So too are games without a user-to-user function, in spite of the known harm associated with game addiction highlighted recently by Professor Henrietta Bowden-Jones, national expert adviser on gambling harms, and by the World Health Organization in 2019 when it designated gaming disorder as a behavioural addiction.

There is also an open question about immersive technologies, whose protocols are still very much in flux. I am concerned that the Government are willing to assert that these environments will meet the bar of user-to-user when those that are still building immersive environments make quite clear that that is not a given. Indeed, later in Committee I will be able to demonstrate that already the very worst harms are happening in environments that are not clearly covered by the Bill.

Another unintended consequence of the current drafting is that the task of working out whether you are on a regulated or unregulated service is left entirely to children. That is not what we had been promised. In December the Secretary of State wrote in a public letter to parents,

“I want to reassure every person reading this letter that the onus for keeping young people safe online will sit squarely on the tech companies’ shoulders”.

It is likely that the Minister will suggest that the limited-functionality services will be caught by the gatekeepers. But, as in the case of immersive technology, it is dangerous to suggest that, just because search and user-to-user are the primary access points in 2023, that will remain the case. We must be more forward-thinking and ensure that services that are likely to be accessed by children and that promote harm are in scope by default.

Amendments 3 and 5 are consequential, so I will not debate them now. I have listened to the Government and come back with a reasonable and implementable amendment that applies only to services that are likely to be accessed by children and that enable harm. I now ask the Government to listen and do likewise.

Amendments 92 and 193 cover the child user condition. The phrase “likely to be accessed”, introduced in this House into what became the Data Protection Act 2018, is one of the most unlikely successful British exports. Both the phrase and its definition, set out by the ICO, have been embedded in regulations in countries the world over—yet the Bill replaces this established language while significantly watering down the definition.

The Bill requires

“a significant number of children”

to use the service, or for the service to be

“likely to attract a significant number of users who are children”.

“Significant” in the Bill is defined relative to the overall UK user base, which means that extremely large platforms could deem a few thousand child users not significant compared with the several million-strong user base. Since only services that cross this threshold need comply with the child safety duties, thousands of children will not benefit from the safety duties that the Minister told us last week were at the heart of the Bill.

Amendment 92 would put the ICO’s existing and much-copied definition into the Bill. It says a service is

“likely to be accessed by children”

if

“the service is designed or intended for use by children … children form a substantive and identifiable user group … the possibility of a child accessing the service is more probable than not, taking into consideration … the nature and content of the service and whether that has particular appeal for children … the way in which the service is accessed and any measures in place to prevent children gaining access … market research, current evidence on user behaviour, the user base of similar or existing services”

that are likely to be accessed.

Having two phrases and definitions is bad for business and even worse for regulators. The ICO has first-mover advantage and a more robust test. It is my contention that parents, media and perhaps even our own colleagues would be very shocked to know that the definition in the Bill has the potential for many thousands, and possibly tens of thousands, of children to be left without the protections that the Bill brings forward. Perhaps the Minister could explain why the Government have not chosen regulatory alignment, which is good practice.

Finally, I will speak briefly in support of Amendments 19, 22, 298 and 299. I am certain that the noble Baroness, Lady Harding, will spell out how the app stores of Google and Apple are simply a subset of “search”, in that they are gatekeepers to accessing more than 5 million apps worldwide and the first page of each is indeed a search function. Their inclusion should be obvious, but I will add a specific issue about which I have spoken directly with both companies and about which the 5Rights Foundation, of which I am chair, has written to the ICO.

When we looked at the age ratings of apps across the Google Play Store and Apple’s App Store, four things emerged. First, apps are routinely given an age rating much lower than their own terms and conditions suggest: for example, Amazon Shopping says 18 but has an age rating of 4 on Apple. This pattern holds across both platforms, covering social sites, gaming, shopping, et cetera.

Secondly, the same apps and services did not have the same age rating across both services, which, between them, are gatekeepers for more than 95% of the app market. In one extreme case, an app rated 4 on one of them was rated 16 on the other, with other significant anomalies being extremely frequent.

Thirdly, almost none of the apps considered their data protection duties in coming to a decision on their age rating, which is a problem, since privacy and safety are inextricably linked.

Finally, in the case of Apple, using a device registered to a 15-year-old, we were able to download age-restricted apps including a dozen or more 18-plus dating sites. In fairness, I give a shoutout to Google, which, because of the age-appropriate design code, chose more than a year ago not to show 18-plus content to children in its Play Store. So this is indeed a political and business choice and not a question of technology. Millions of services are accessed via the App Store. Given the Government’s position—that gatekeepers have specific responsibilities in relation to harmful content and activity—surely the amendments in the name of the noble Baroness, Lady Harding, are necessary.

My preference was for a less complicated Bill based on principles and judged on outcomes. I understand that that ship has sailed, but it is not acceptable for the Government now to use the length and complexity of the Bill as a reason not to accept amendments that would fill loopholes where harm has been proven. It is time to deliver on the promises made to parents and children, and to put the onus for keeping young people safe online squarely on tech companies’ shoulders. I beg to move.

Baroness Harding of Winscombe (Con)

My Lords, I rise to speak to Amendments 19, 22, 298 and 299 in my name and those of the noble Baroness, Lady Stowell, and the noble Lords, Lord Knight and Lord Clement-Jones. I will also briefly add at the end of my speech my support for the amendments in the name of my friend, the noble Baroness, Lady Kidron. It has been a huge privilege to be her support act all the way from the beginnings of the age-appropriate design code; it feels comfortable to speak after her.

I want briefly to set out what my amendments would do. Their purpose is to bring app stores into the child protection elements of the Bill. Amendment 19 would require app stores to prepare

“risk assessments equal to user-to-user services due to their role in distributing online content through apps to children and as a primary facilitator of user-to-user”

services reaching children. Amendment 22 would mandate app stores

“to use proportionate and proactive measures, such as age assurance, to prevent children”

coming into contact with

“primary priority content that is harmful to children”.

Amendments 298 and 299 would simply define “app” and “app stores”.

Let us be clear what app stores do. They enable customers to buy apps and user-to-user services. They enable customers to download free apps. They offer up curated content in the app store itself and decide what apps someone would like to see. They enable customers to search for apps for user-to-user content. They provide age ratings; as the noble Baroness, Lady Kidron, said, there may be different age ratings in different app stores for the same app. They sometimes block the download of apps based on the age rating and their assessment of someone’s age, but not always, and it is different for different app stores.

Why should they be included in this Bill—if it is not obvious from what I have already said? First, two companies are profiting from selling user-to-user products to children. Two app stores account for some 98%-plus of all downloads of user-to-user services, with no requirements to assess the risk of selling those products to children or to mitigate those risks. We do not allow that in the physical world so we should not allow it in the digital world.

Secondly, parents and teenagers tell us that this measure would help. A number of different studies have been done; I will reference just two. One was by FOSI, the Family Online Safety Institute, which conducted an international research project in which parents consistently said that having age assurance at the app store level would make things simpler and more effective for them; ironically, the FOSI research was conducted with Google.

--- Later in debate ---
Lord Allan of Hallam (LD)

The noble Lord makes a good point. I certainly think we are heading into a world where there will be more regulation of app stores. Google and Apple are commercial competitors with some of the people who are present in their stores. A lot of the people in their stores are in dispute with them over things such as the fees that they have to pay. It is precisely for that reason that I do not think we should be throwing online safety into the mix.

There is a role for regulating app stores, which primarily focuses on these commercial considerations and their position in the market. There may be something to be done around age-rating; the noble Baroness made a very good point about how age-rating works in app stores. However, if we look at the range of responsibilities that we are describing in this Bill and the tools that we are giving to intermediaries, we see that they are the wrong, or inappropriate, set of tools.

Baroness Harding of Winscombe (Con)

Would the noble Lord acknowledge that app stores are already undertaking these age-rating and blocking decisions? Google has unilaterally decided that, if it assesses that you are under 18, it will not serve up over-18 apps. My concern is that this is already happening but it is happening indiscriminately. How would the noble Lord address that?

Lord Allan of Hallam (LD)

The noble Baroness makes a very good point; they are making efforts. There is a role for app stores to play but I hope she would accept that it is qualitatively different from that played by a search engine or a user-to-user service. If we were to decide, in both instances, that we want app stores to have a greater role in online safety and a framework that allows us to look at blogs and other forms of content, we should go ahead and do that. All I am arguing is that we have a Bill that is carefully constructed around two particular concepts, a user-to-user service and a search engine, and I am not sure it will stretch that far.

--- Later in debate ---
Lord Vaizey of Didcot (Con)

My Lords, as my name is on Amendment 9, I speak to support these amendments and say that they are worthy of debate. As your Lordships know, I am extremely supportive of the Bill and hope that it will be passed in short order. It is much needed and overdue; it gives us the opportunity to put in place a regulator that is able to hold platforms to account, protect users where it can and enhance child safety online. I can think of no better regulator for that role than Ofcom.

I have listened to the debate with great interest. Although I support the intentions of my noble friend Lord Moylan’s amendment, I am not sure I agree with him that there are two cultures in this House, as far as the Bill is concerned; I think everybody is concerned about child safety. However, these amendments are right to draw attention to the huge regulatory burden that this legislation can potentially bring, and to the inadvertent bad consequences it will bring for many of the sites that we all depend upon and use.

I have not signed many amendments that have been tabled in this Committee because I have grown increasingly concerned, as has been said by many others, that the Bill has become a bit like the proverbial Christmas tree where everyone hangs their own specific concern on to the legislation, turning it into something increasingly unwieldy and difficult to navigate. I thought the noble Baroness, Lady Fox, put it extremely well when she effectively brought to life what it would be like to run a small website and have to comply with this legislation. That is not to say that certain elements of micro-tweaking are not welcome—for example, the amendment by the noble Baroness, Lady Kidron, on giving coroners access to data—but we should be concerned about the scope of the Bill and the burden that it may well put on individual websites.

This is in effect the Wikipedia amendment, put forward and written in a sort of wiki way by this House—a probing amendment in Committee to explore how we can find the right balance between giving Ofcom the powers it needs to hold platforms to account and not unduly burdening websites that all of us agree present a very low risk and whose provenance, if you like, does not fit easily within the scope of the Bill.

I keep saying that I disagree with my noble friend Lord Moylan. I do not—I think he is one of the finest Members of this House—but, while it is our job to provide legislation to set the framework for how Ofcom regulates, we in this House should also recognise that in the real world, as I have also said before, this legislation is simply going to be the end of the beginning. Ofcom will have to find its way forward in how it exercises the powers that Parliament gives it, and I suspect it will have its own list of priorities in how it approaches these issues, who it decides to hold to account and who it decides to enforce against. A lot of its powers will rest not simply on the legislation that we give it but on the relationship that it builds with the platforms it is seeking to regulate.

For example, I have hosted a number of lunches for Google in this House with interested Peers, and it has been interesting to get that company’s insight into its working relationship with Ofcom. By the way, I am by no means suggesting that that is a cosy relationship, but it is at least a relationship where the two sides are talking to each other, and that is how the effectiveness of these powers will be explored.

I urge noble Lords to take these amendments seriously and take what the spirit of the amendments is seeking to put forward, which is to be mindful of the regulatory burden that the Bill imposes; to be aware that the Bill will not, simply by being passed, solve the kinds of issues that we are seeking to tackle in terms of the most egregious content that we find on the internet; and that, effectively, Ofcom’s task once this legislation is passed will be the language of priorities.

Baroness Harding of Winscombe (Con)

My Lords, this is not the first time in this Committee, and I suspect it will not be the last, when I rise to stand somewhere between my noble friend Lord Vaizey and the noble Baroness, Lady Kidron. I am very taken by her focus on risk assessments and by the passionate defences of Wikipedia that we have heard, which really are grounded in a sort of commoner’s risk assessment that we can all understand.

Although I have sympathy with the concerns of the noble Baroness, Lady Fox, about small and medium-sized businesses being overburdened by regulation, I am less taken with the amendments on that subject precisely because small tech businesses become big tech businesses extremely quickly. It is worth pointing out that TikTok did not even exist when Parliament began debating this Bill. I wonder what our social media landscape would have been like if the Bill had existed in law before social media started. We as a country should want global tech companies to be born in the UK, but we want their founders—who, sadly, even today, are predominantly young white men who do not yet have children—to think carefully about the risks inherent in the services they are creating, and we know we need to do that at the beginning of those tech companies’ journeys, not once they have reached 1 million users a month.

While I have sympathy with the desire of the noble Baroness, Lady Fox, not to overburden, just as my noble friend Lord Vaizey has said, we should take our lead from the intervention of the noble Baroness, Lady Kidron: we need a risk assessment even for small and medium-sized businesses. It just needs to be a risk assessment that is fit for their size.

--- Later in debate ---
Lord Moylan (Con)

Everything the noble Baroness has said is absolutely right, and I completely agree with her. The point I simply want to make is that no form of risk-based assessment will achieve a zero-tolerance outcome, but—

Baroness Harding of Winscombe (Con)

I am so sorry, but may I offer just one final thought from the health sector? While the noble Lord is right that where there are human beings there will be error, there is a concept in health of the “never event”—that when that error occurs, we should not tolerate it, and we should expect the people involved in creating that error to do a deep inspection and review to understand how it occurred, because it is considered intolerable. I think the same exists in the digital world in a risk assessment framework, and it would be a mistake to ignore it.

Lord Moylan (Con)

My Lords, I am now going to attempt for the third time to beg the House’s leave to withdraw my amendment. I hope for the sake of us all, our dinner and the dinner break business, for which I see people assembling, that I will be granted that leave.