Baroness Kidron debates involving the Department for Digital, Culture, Media & Sport

Tue 25th Apr 2023 · Online Safety Bill · Lords Chamber · Committee stage: Part 2
Tue 25th Apr 2023 · Online Safety Bill · Lords Chamber · Committee stage: Part 1
Wed 19th Apr 2023 · Online Safety Bill · Lords Chamber · Committee stage
Wed 1st Feb 2023
Mon 20th Jun 2022
Fri 10th Dec 2021

Online Safety Bill

Baroness Kidron Excerpts
Lord Allan of Hallam (LD)

The Twitter scenario, and other scenarios of mixed sites, are some of the most challenging that we have to deal with. But I would say, straightforwardly, “Look, 13% is a big chunk, but the primary purpose of Twitter is not the delivery of pornography”. I use Twitter on a daily basis and I have never seen pornography on it. I understand that it is there and that people can go for it, and that is an issue, but I think people out there would say that for most people, most of the time, the primary purpose of Twitter is not pornography.

What we want to do—in answer to the noble Lord’s second point—is create an incentive for people to be recategorised in the right direction. There is an assumption here that it is all going to be about gaming the system. I actually think that there is an opportunity here for genuine changes. There will be a conversation with Twitter. It will be interesting, given Twitter’s current management—apparently it is run by a dog, so there will be a conversation with the dog that runs Twitter. In that conversation, the regulator, Ofcom, on our behalf, will be saying, “You could change your terms of service and get rid of pornography”. Twitter will say yes or no. If it says no, Ofcom will say, “Well, here are all the things we expect you to do in order to wall off that part of the site”.

That is a really healthy and helpful conversation to have with Twitter. I expect it is listening now and already thinking about how it will respond. But it would expect that kind of treatment and conversation to be different; and I think the public would expect that conversation to be a different and better conversation than just saying “Twitter, you’re Pornhub. We’re just going to treat you like Pornhub”.

That is the distinction. As I say, we have an opportunity to get people to be more robust about either limiting or removing pornography, and I fear that the amendments we have in front of us would actually undermine rather than enhance that effort.

Baroness Kidron (CB)

At the centre of this is the question of whether we are trying to block the entire service or block at the level of porn content. It is the purpose of a set of amendments in the names of the noble Lord, Lord Bethell, myself and a number of other noble Lords to do exactly the latter. But I have to say to the noble Baroness that I am very much in sympathy with, first, putting porn behind an age gate; secondly, having a commencement clause; and, thirdly and very importantly—this has not quite come up in the conversation—saying that harms must be on the face of the Bill and that porn is not the only harm. I say, as a major supporter of the Bereaved Families for Online Safety, that “Porn is the only harm children face” would be a horrendous message to come from this House. But there is nothing in the noble Baroness’s amendments, apart from where the action happens, that I disagree with.

I also felt that the noble Baroness made an incredibly important point when she went into detail on Amendment 125A. I will have to read her speech in order to follow it, because it was so detailed, but the main point she made is salient and relates to an earlier conversation: the reason we have Part 5 is that the Government have insisted on this ridiculous thing about user-to-user and search, instead of doing it where harm is. The idea that you have Part 5, which is to stop the loophole of sites that do not have user-to-user, only to find that they can add user-to-user functionality and be another type of site, is quite ludicrous. I say to the Committee and the Minister, who I am sure does not want me to say it, “If you accept Amendment 2, you’d be out of that problem”—because, if a site was likely to be accessed by children and it had harm and we could see the harm, it would be in scope. That is the very common-sense approach. We are where we are, but let us be sensible about making sure the system cannot be gamed, because that would be ludicrous and would undermine everybody’s efforts—those of the Government and of all the campaigners here.

I just want to say one more thing because I see that the noble Lord, Lord Moylan, is back in his place. I want to put on the record that age assurance and identity are two very separate things. I hope that, when we come to debate the package of harms—unfortunately, we are not debating them all together; we are debating harms first, then AV—we get to the bottom of that issue because I am very much in the corner of the noble Lord and the noble Baroness, Lady Fox, on this. Identity and age assurance must not be considered the same thing by the House, and definitely not by the legislation.

Online Safety Bill

Baroness Kidron Excerpts
Moved by
2: Clause 3, page 3, line 14, at end insert—
“(d) an internet service, other than a regulated user-to-user service or search service, that meets the child user condition and enables or promotes harmful activity and content as set out in Schedule (Online harms to children).”

Member’s explanatory statement
This amendment would mean any service that meets the ‘child user condition’ and enables or promotes harmful activity and content to children, as per a new Schedule, would be in scope of the regulation of the Bill.
Baroness Kidron (CB)

My Lords, I refer the Committee to my interests as put in the register and declared in full at Second Reading. I will speak to Amendment 2 in my name and those of the right reverend Prelate the Bishop of Oxford and the noble Baroness, Lady Harding, to Amendments 3 and 5 in my name, and briefly to Amendments 19, 22, 298 and 299 in the name of the noble Baroness, Lady Harding.

The digital world does not have boundaries in the way that the Bill does. It is an ecosystem of services and products that are interdependent. A user journey is made up of incremental signals, nudges and enticements that mean that, when we use our devices, very often we do not end up where we intended to start. The current scope covers user-to-user, search and commercial porn services, but a blog or website that valorises self-harm and depression or suggests starving yourself to death is still exempt because it has limited functionality. So too are games without a user-to-user function, in spite of the known harm associated with game addiction highlighted recently by Professor Henrietta Bowden-Jones, national expert adviser on gambling harms, and the World Health Organization in 2019 when it designated gaming disorder as a behavioural addiction.

There is also an open question about immersive technologies, whose protocols are still very much in flux. I am concerned that the Government are willing to assert that these environments will meet the bar of user-to-user when those that are still building immersive environments make quite clear that that is not a given. Indeed, later in Committee I will be able to demonstrate that already the very worst harms are happening in environments that are not clearly covered by the Bill.

Another unintended consequence of the current drafting is that the task of working out whether you are on a regulated or unregulated service is left entirely to children. That is not what we had been promised. In December the Secretary of State wrote in a public letter to parents,

“I want to reassure every person reading this letter that the onus for keeping young people safe online will sit squarely on the tech companies’ shoulders”.

It is likely that the Minister will suggest that limited-functionality services will be caught by the gatekeepers. But, as in the case of immersive technology, it is dangerous to suggest that, just because search and user-to-user are the primary access points in 2023, that will remain the case. We must be more forward-thinking and ensure that services that are likely to be accessed by children and that promote harm are in scope by default.

Amendments 3 and 5 are consequential, so I will not debate them now. I have listened to the Government and come back with a reasonable and implementable amendment that applies only to services that are likely to be accessed by children and that enable harm. I now ask the Government to listen and do likewise.

Amendments 92 and 193 cover the child user condition. The phrase “likely to be accessed”, introduced in this House into what became the Data Protection Act 2018, is one of the most unlikely successful British exports. Both the phrase and its definition, set out by the ICO, have been embedded in regulations in countries the world over—yet the Bill replaces this established language while significantly watering down the definition.

The Bill requires

“a significant number of children”

to use the service, or for the service to be

“likely to attract a significant number of users who are children”.

“Significant” in the Bill is defined relative to the overall UK user base, which means that extremely large platforms could deem a few thousand child users not significant compared with the several million-strong user base. Since only services that cross this threshold need comply with the child safety duties, thousands of children will not benefit from the safety duties that the Minister told us last week were at the heart of the Bill.

Amendment 92 would put the ICO’s existing and much-copied definition into the Bill. It says a service is

“likely to be accessed by children”

if

“the service is designed or intended for use by children … children form a substantive and identifiable user group … the possibility of a child accessing the service is more probable than not, taking into consideration … the nature and content of the service and whether that has particular appeal for children … the way in which the service is accessed and any measures in place to prevent children gaining access … market research, current evidence on user behaviour, the user base of similar or existing services”

that are likely to be accessed.

Having two phrases and definitions is bad for business and even worse for regulators. The ICO has first-mover advantage and a more robust test. It is my contention that parents, media and perhaps even our own colleagues would be very shocked to know that the definition in the Bill has the potential for many thousands, and possibly tens of thousands, of children to be left without the protections that the Bill brings forward. Perhaps the Minister could explain why the Government have not chosen regulatory alignment, which is good practice.

Finally, I will speak briefly in support of Amendments 19, 22, 298 and 299. I am certain that the noble Baroness, Lady Harding, will spell out how the app stores of Google and Apple are simply a subset of “search”, in that they are gatekeepers to accessing more than 5 million apps worldwide and the first page of each is indeed a search function. Their inclusion should be obvious, but I will add a specific issue about which I have spoken directly with both companies and about which the 5Rights Foundation, of which I am chair, has written to the ICO.

When we looked at the age ratings of apps across Google Play Store and Apple, four things emerged. First, apps are routinely given age ratings much lower than their own terms and conditions: for example, Amazon Shopping’s terms say 18, but it has an age rating of 4 on Apple. This pattern goes across both platforms, covering social sites, gaming, shopping, et cetera.

Secondly, the same apps and services did not have the same age rating across both services, which, between them, are gatekeepers for more than 95% of the app market. In one extreme case, an app rated 4 on one of them was rated 16 on the other, with other significant anomalies being extremely frequent.

Thirdly, almost none of the apps considered their data protection duties in coming to a decision on their age rating, which is a problem, since privacy and safety are inextricably linked.

Finally, in the case of Apple, using a device registered to a 15-year-old, we were able to download age-restricted apps, including a dozen or more 18-plus dating sites. In fairness, I give a shoutout to Google, which, because of the age-appropriate design code, chose more than a year ago not to show 18-plus content to children in its Play Store. So this is indeed a political and business choice and not a question of technology. Millions of services are accessed via app stores. Given the Government’s position—that gatekeepers have specific responsibilities in relation to harmful content and activity—surely the amendments in the name of the noble Baroness, Lady Harding, are necessary.

My preference was for a less complicated Bill based on principles and judged on outcomes. I understand that that ship has sailed, but it is not acceptable for the Government now to use the length and complexity of the Bill as a reason not to accept amendments that would fill loopholes where harm has been proven. It is time to deliver on the promises made to parents and children, and to put the onus for keeping young people safe online squarely on tech companies’ shoulders. I beg to move.

Baroness Harding of Winscombe (Con)

My Lords, I rise to speak to Amendments 19, 22, 298 and 299 in my name and those of the noble Baroness, Lady Stowell, and the noble Lords, Lord Knight and Lord Clement-Jones. I will also briefly add at the end of my speech my support for the amendments in the name of my friend, the noble Baroness, Lady Kidron. It has been a huge privilege to be her support act all the way from the beginnings of the age-appropriate design code; it feels comfortable to speak after her.

I want briefly to set out what my amendments would do. Their purpose is to bring app stores into the child protection elements of the Bill. Amendment 19 would require app stores to prepare

“risk assessments equal to user-to-user services due to their role in distributing online content through apps to children and as a primary facilitator of user-to-user”

services reaching children. Amendment 22 would mandate app stores

“to use proportionate and proactive measures, such as age assurance, to prevent children”

coming into contact with

“primary priority content that is harmful to children”.

Amendments 298 and 299 would simply define “app” and “app stores”.

Let us be clear what app stores do. They enable customers to buy apps and user-to-user services. They enable customers to download free apps. They offer up curated content in the app store itself and decide what apps someone would like to see. They enable customers to search for apps for user-to-user content. They provide age ratings; as the noble Baroness, Lady Kidron, said, there may be different age ratings in different app stores for the same app. They sometimes block the download of apps based on the age rating and their assessment of someone’s age, but not always, and it is different for different app stores.

Why should they be included in this Bill—if it is not obvious from what I have already said? First, two companies are profiting from selling user-to-user products to children. Two app stores account for some 98%-plus of all downloads of user-to-user services, with no requirements to assess the risk of selling those products to children or to mitigate those risks. We do not allow that in the physical world so we should not allow it in the digital world.

Secondly, parents and teenagers tell us that this measure would help. A number of different studies have been done; I will reference just two. One was by FOSI, the Family Online Safety Institute, which conducted an international research project in which parents consistently said that having age assurance at the app store level would make things simpler and more effective for them; ironically, the FOSI research was conducted with Google.

--- Later in debate ---
Lord Allan of Hallam (LD)

The noble Baroness makes a very good point; they are making efforts. There is a role for app stores to play but I hope she would accept that it is qualitatively different from that played by a search engine or a user-to-user service. If we were to decide, in both instances, that we want app stores to have a greater role in online safety and a framework that allows us to look at blogs and other forms of content, we should go ahead and do that. All I am arguing is that we have a Bill that is carefully constructed around two particular concepts, a user-to-user service and a search engine, and I am not sure it will stretch that far.

Baroness Kidron (CB)

I want to reassure the noble Lord: I have his blog in front of me and he was quite right—there were not a lot of children on that site. It is a very good blog, which I read frequently.

I want to make two points. First, age-rating and age-gating are two different things, and I think the noble Lord has conflated them. There is a lot of age-rating going on, and it is false information. We need good information, and we have not managed to get it by asking nicely. Secondly, I slightly dispute his idea that we have a very structured Bill regarding user-to-user and so on. We have a very structured Bill from a harms perspective that describes the harms that must be prevented—and then we got to commercial porn, and we can also get to these other things.

I agree with the noble Lord’s point about freedom of speech, but we are talking about a fixed set of harms that will, I hope, be in the Bill by the end. We can then say that if a service is likely to be accessed by children under this test, and known harm is there, that is what we are looking at. We are certainly not looking at the noble Lord’s blog.

Lord Allan of Hallam (LD)

I appreciate the intervention by the noble Baroness; I hope through this grit we may conjure up a pearl of some sort. The original concept of the Bill, as championed by the noble Baroness, would have been a generalised set of duties of care which could have stretched much more broadly. It has evolved in a particular direction and become ever more specific and tailored to those three services: user-to-user, search, and pornography services. Having arrived at that point, it is difficult to then open it back up and stretch it to reach other forms of service.

My intention in intervening in this debate is to raise some of those concerns because I think they are legitimate. I may be at the more sceptical end of the political world, but I am at the more regulation-friendly end of the tech community. This is said in a spirit of trying to create a Bill that will actually work. I have done the work, and I know how hard Ofcom’s job will be. That sums up what I am trying to say: my concern is that we should not give Ofcom an impossible job. We have defined something quite tight—many people still object to it, think it is too loose and do not agree with it—but I think we have something reasonably workable. I am concerned that, however tempting it is, by re-opening Pandora’s box we may end up creating something less workable.

That does not mean we should forget about app stores and non-user-to-user content, but we need to think of a way of dealing with those which does not necessarily just roll over the mechanism we have created in the Online Safety Bill to other forms of application.

--- Later in debate ---
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, I share noble Lords’ determination to deliver the strongest protections for children and to develop a robust and future-proofed regulatory regime. However, it will not be possible to solve every problem on the internet through this Bill, nor through any piece of legislation, flagship or otherwise. The Bill has been designed to confer duties on the services that pose the greatest risk of harm—user-to-user services and search services—and where there are proportionate measures that companies can take to protect their users.

As the noble Baroness, Lady Kidron, and others anticipated, I must say that these services act as a gateway for users to discover and access other online content through search results and links shared on social media. Conferring duties on these services will therefore significantly reduce the risk of users going on to access illegal or harmful content on non-regulated services, while keeping the scope of the Bill manageable and enforceable.

As noble Lords anticipated, there is also a practical consideration for Ofcom in all this. I know that many noble Lords are extremely keen to see this Bill implemented as swiftly as possible; so am I. However, as the noble Lord, Lord Allan, rightly pointed out, making major changes to the Bill’s scope at this stage would have significant implications for Ofcom’s implementation timelines. I say this at the outset because I want to make sure that noble Lords are aware of those implications as we look at these issues.

I turn first to Amendments 2, 3, 5, 92 and 193, tabled by the noble Baroness, Lady Kidron. These aim to expand the number of services covered by the Bill to incorporate a broader range of services accessed by children and a broader range of harms. I will cover the broader range of harms more fully in a separate debate when we come to Amendment 93, but I am very grateful to the noble Baroness for her constructive and detailed discussions on these issues over the past few weeks and months.

These amendments would bring new services into scope of the duties beyond user-to-user and search services. This could include services which enable or promote commercial harms, including consumer businesses such as online retailers. As I have just mentioned in relation to the previous amendments, bringing many more services into scope would delay the implementation of Ofcom’s priorities and risk detracting from its work overseeing existing regulated services where the greatest risk of harm exists—we are talking here about the services run by about 2.5 million businesses in the UK alone. I hope noble Lords will appreciate from the recent communications from Ofcom how challenging the implementation timelines already are, without adding further complication.

Amendment 92 seeks to change the child-user condition in the children’s access assessment to the test in the age-appropriate design code. The test in the Bill is already aligned with the test in that code, which determines whether a service is likely to be accessed by children, in order to ensure consistency for providers. The current child-user condition determines that a service is likely to be accessed by children where it has a significant number or proportion of child users, or where it is of a kind likely to attract a significant number or proportion of child users. This will already bring into scope services of the kind set out in this amendment, such as those which are designed or intended for use by children, or where children form a—

Baroness Kidron (CB)

I am sorry to interrupt. Will the Minister take the opportunity to say what “significant” means, because that is not aligned with the ICO code, which has different criteria?

Lord Parkinson of Whitley Bay (Con)

If I can finish my point, this will bring into scope services of the kind set out in the amendments, such as those designed or intended for use by children, or where children form a substantial and identifiable user group. The current condition also considers the nature and content of the service and whether it has a particular appeal for children. Ofcom will be required to consult the Information Commissioner’s Office on its guidance to providers on fulfilling this test, which will further support alignment between the Bill and the age-appropriate design code.

On the meaning of “significant”, a significant number of children means a significant number in itself or a significant proportion of the total number of UK-based users on the service. In the Bill, “significant” has its ordinary meaning, and there are many precedents for it in legislation. Ofcom will be required to produce and publish guidance for providers on how to make the children’s access assessment. Crucially, the test in the Bill provides more legal certainty and clarity for providers than the test outlined in the code. “Substantive” and “identifiable”, as suggested in this amendment, do not have such a clear legal meaning, so this amendment would give rise to the risk that the condition is more open to challenge from providers and more difficult to enforce. On the other hand, as I said, “significant” has an established precedent in legislation, making it easier for Ofcom, providers and the courts to interpret.

The noble Lord, Lord Knight, talked about the importance of future-proofing the Bill and emerging technologies. As he knows, the Bill has been designed to be technology neutral and future-proofed, to ensure that it keeps pace with emerging technologies. It will apply to companies which enable users to share content online or to interact with each other, as well as to search services. Search services using AI-powered features will be in scope of the search duties. The Bill is also clear that content generated by AI bots is in scope where it interacts with user-generated content, such as bots on Twitter. The metaverse is also in scope of the Bill. Any service which enables users to interact as the metaverse does will have to conduct a child access test and comply with the child safety duties if it is likely to be accessed by children.

--- Later in debate ---
I hope that gives the noble Lord the reassurance that the points he is exploring through his amendments are covered. I invite him not to press them and, with a promise to continue discussions on the other amendments in this group, I invite their proposers to do the same.
Baroness Kidron (CB)

I thank the Minister for an excellent debate; I will make two points. First, I think the Minister was perhaps answering on my original amendment, which I have narrowed considerably to services

“likely to be accessed by children”

and with proven harm on the basis of the harms described by the Bill. It is an “and”, not an “or”, allowing Ofcom to go after places that have proven to be harmful.

Secondly, I am not sure the Government can have it both ways—that it is the same as the age-appropriate design code but different in these ways—because it is exactly in the ways that it is different that I am suggesting the Government might improve. We will come back to both those things.

Finally, what are we asking here? We are asking for a risk assessment. If the Government are right that there is no risk, then there is no harm, no mitigation, nothing to do. This is a major principle of the conversations we will have going forward over a number of days. I also believe in proportionality. It is basic product safety; you have a look, you have standards, and if there is nothing to do, let us not make people do silly things. I think we will return to these issues, because they are clearly deeply felt, and they are very practical, and my own feeling is that we cannot risk thousands of children not benefiting from all the work that Ofcom is going to do. With that, I beg leave to withdraw.

Amendment 2 withdrawn.
--- Later in debate ---
Lord Allan of Hallam (LD)

My Lords, I rise to speak in support of Amendment 9, tabled by the noble Lord, Lord Moylan, and in particular the proposed new paragraph 10A to Schedule 1. I hope I will find myself more in tune with the mood of the Committee on this amendment than on previous ones. I would be interested to know whether any noble Lords believe that Ofcom should be spending its limited resources supervising a site like Wikipedia under the new regime, as it seems to me patently obvious that that is not what we intend; it is not the purpose of the legislation.

The noble Lord, Lord Moylan, is right to remind us that one of the joys of the internet is that you buy an internet connection, plug it in and there is a vast array of free-to-use services which are a community benefit, produced by the community for the community, with no harm within them. What we do not want to do is interfere with or somehow disrupt that ecosystem. The noble Baroness, Lady Fox, is right to remind us that there is a genuine risk of people withdrawing from the UK market. We should not sidestep that. People who try to be law-abiding will look at these requirements and ask themselves, “Can I meet them?” If the Wikimedia Foundation that runs Wikipedia does not think it can offer its service in a lawful way, it will have to withdraw from the UK market. That would be to the detriment of children in the United Kingdom, and certainly not to their benefit.

There are principle-based and practical reasons why we do not want Ofcom to be operating in this space. The principle-based one is that it makes me uncomfortable that a Government would effectively tell their regulator how to manage neutral information sites such as Wikipedia. There are Governments around the world who seek to do that; we do not want to be one of those.

The amendment attempts to define this public interest, neutral, informational service. It happens to be user-to-user but it is not like Facebook, Instagram or anything similar. I would feel much more comfortable making it clear in law that we are not asking Ofcom to interfere with those kinds of services. The practical reason is the limited time Ofcom will have available. We do not want it to be spending time on things that are not important.

Definitions are another example of how, with the internet, it can often be extremely hard to draw bright lines. Functionalities bleed into each other. That is not necessarily a problem, until you try to write something into law; then, you find that your definition unintentionally captures a service that you did not intend to capture, or unintentionally misses out a service that you did intend to be in scope. I am sure the Minister will reject the amendment because that is what Ministers do; but I hope that, if he is not willing to accept it, he will at least look at whether there is scope within the Bill to make it clear that Wikipedia is intended to be outside it.

Paragraph 4 of Schedule 1 refers to “limited functionality services”. That is a rich vein to mine. It is clear that the intention is to exclude mainstream media, for example. It refers to “provider content”. In this context, Encyclopaedia Britannica is not in scope but Wikipedia is, the difference being that Wikipedia is constructed by users, while Encyclopaedia Britannica is regarded as being constructed by a provider. The Daily Mail is outside scope; indeed, all mainstream media are outside scope. Anyone who declares themselves to be media—we will debate this later on—is likely to be outside scope.

Such provider exemption should be offered to other, similar services, even if they happen to be constructed from the good will of users as opposed to a single professional author. I hope the Minister will be able to indicate that the political intent is not that we should ask Ofcom to spend time and energy regulating Wikipedia-like services. If so, can he point to where in the legislation we might get that helpful interpretation, in order to ensure that Ofcom is focused on what we want it to be focused on and not on much lower priority issues?

Baroness Kidron (CB)

I will speak to a couple of the amendments in this group. First, small is not safe, and you cannot necessarily see these platforms in isolation. For example, there is an incel group that has only 4,000 active users, but it posts a great deal on YouTube and has 24.2 million users in that context. So we have to be clear that small and safe are not the same thing.

However, I am sympathetic to the risk-based approach. I should probably have declared an interest as someone who has given money to Wikipedia on several occasions to keep it going. I ask the Minister for some clarity on the systems and processes of the Bill, and whether the risk profile of Wikipedia—which does not entice you in and then follow you for the next six months once you have looked at something—is far lower than something very small that gets hold of you and keeps on going. I say that particularly in relation to children, but I feel it for myself also.

--- Later in debate ---
Lord Allan of Hallam (LD)

My Lords, I will speak to the amendments in the name of the noble Lord, Lord Moylan, on moderation, which I think are more important than he has given himself credit for—they go more broadly than just Wikipedia.

There is a lot of emphasis on platform moderation, but the reality is that most moderation of online content is done by users, either individually or in groups, acting as groups in the space where they operate. The typical example, which many Members of this House have experienced, is when you post something and somebody asks, “Did you mean to post that?”, and you say, “Oh gosh, no”, and then delete it. A Member in the other place has recently experienced a rather high-profile example of that through the medium of the newspaper. On a much smaller scale, it is absolutely typical that people take down content every day, either because they regret it or, quite often, because their friends, families or communities tell them that it was unwise. That is the most effective form of moderation, because it is the way that people learn to change their behaviour online, as opposed to the experience of a platform removing content, which is often experienced as the big bad hand of the platform. The person does not learn to change their behaviour, so, in some cases, it can reinforce bad behaviour.

Community moderation, not just on Wikipedia but across the internet, is an enormous public good, and the last thing that we want to do in this legislation is to discourage people from doing it. In online spaces, that is often a volunteer activity: people give up their time to try to keep a space safe and within the guidelines they have set for that space. The noble Lord, Lord Moylan, has touched on a really important area: in the Bill, we must be absolutely clear to those volunteers that we will not create all kinds of new legal operations and liabilities on them. These are responsible people, so, if they are advised that they will incur all kinds of legal risk when trying to comply with the Online Safety Bill, they will stop doing the moderation—and then we will all suffer.

On age-gating, we will move to a series of amendments where we will discuss age assurance, but I will say at the outset, as a teaser to those longer debates, that I have sympathy with the points made by the noble Lord, Lord Moylan. He mentioned pubs—we often talk about real-world analogies. In most of the public spaces we enter in the real world, nobody does any ID checking or age checking; we take it on trust, unless and until you carry out an action, such as buying alcohol, which requires an age check.

It is legitimate to raise this question, because where we fall in this debate will depend on how we see public spaces. I see a general-purpose social network as equivalent to walking into a pub or a town square, so I do not expect to have my age and ID checked at the point at which I enter that public space. I might accept that my ID is checked at a certain point where I carry out various actions. Others will disagree and will say that the space should be checked as soon as you go into it—that is the boundary of the debate we will have across a few groups. As a liberal, I am certainly on the side that says that it is incumbent on the person wanting to impose the extra checks to justify them. We should not just assume that extra checks are cost-free and beneficial; they have a cost for us all, and it should be imposed only where there is a reasonable justification.

Baroness Kidron (CB)

Far be it from me to suggest that all the amendments tabled by the noble Lord, Lord Moylan, are in the wrong place, but I think that Amendment 26 might have been better debated with the other amendments on age assurance.

On community moderation, I underscore the point that Ofcom must have a risk profile as part of its operations. When we get to that subject, let us understand what Ofcom intends to do with it—maybe we should instruct Ofcom a little about what we would like it to do with it for community moderation. I have a lot of sympathy—though I do not think it is a get-out clause—with seeing some spaces as less risky or, at least, with determining what risky looks like in online spaces, which is a different question. This issue belongs in the risk profile: it is not about taking things out; we have to build it into the Bill we have.

On age assurance and AV, I do not think that today is the day to discuss it in full. I disagree with the point that, because we are checking kids, we have to check ourselves—that is not where the technology is. Without descending into technical arguments, as the noble Lord, Lord Moylan, asked us not to, we will bring some of those issues forward.

The noble Lords, Lord Bethell and Lord Stevenson, and the right reverend Prelate the Bishop of Oxford have a package of amendments which are very widely supported across the Committee. They have put forward a schedule of age assurance that says what the rules of the road are. We must stop pretending that age assurance is something that is being invented now in this Bill. If you log into a website with your Facebook login, it shares your age—and that is used by 42% of people online. However, if you use an Apple login, it does not share your age, so I recommend using Apple—but, interestingly, it is harder to find that option on websites, because websites want to know your age.

So, first, we must not treat age assurance as if it has just been invented. Secondly, we need to start to have rules of the road, and ask what is acceptable, what is proportionate, and when we will have zero tolerance. Watching faces around the Committee, I say that I will accept zero tolerance for pornography and some other major subjects, but, for the most part, age assurance is something that we need to have regulated. Currently, it is being done to us rather than in any way that is transparent or agreed, and that is very problematic.

Baroness Finlay of Llandaff (CB)

My Lords, I hesitated to speak to the previous group of amendments, but I want to speak in support of the issue of risk that my noble friend Lady Kidron raised again in this group of amendments. I do not believe that noble Lords in the Committee want to cut down the amount of information and the ability to obtain information online. Rather, we came to the Bill wanting to avoid some of the really terrible harms promoted by some websites which hook into people’s vulnerability to becoming addicted to extremely harmful behaviours, which are harmful not only to themselves but to other people and, in particular, to children, who have no voice at all. I also have a concern about vulnerable people over the age of 18, and that may be something we will come to later in our discussions on the Bill.

--- Later in debate ---
Lord Moylan (Con)

My Lords, I am grateful to all noble Lords who have contributed to this slightly disjointed debate. I fully accept that there will be further opportunities to discuss age verification and related matters, so I shall say no more about that. I am grateful, in particular, to the noble Lord, Lord Allan of Hallam, for supplying the deficiency in my opening remarks about the importance of Amendments 10 and 11, and for explaining just how important that is too. I also thank the noble Lord, Lord Stevenson. It was good of him to say, in the open approach he took on the question of age, that there are issues still to be addressed. I do not think anybody feels that we have yet got this right and I think we are going to have to be very open in that discussion, when we get to it. That is also true about what the noble Lord, Lord Allan of Hallam, said: we have not yet got clarity as to where the age boundary is—I like his expression—for the public space. Where is the point at which, if checks are needed, those checks are to be applied? These are all matters to discuss and I hope noble Lords will forgive me if I do not address each individual contribution separately.

I would like to say something, I hope not unfairly or out of scope, about what was said by the noble Baronesses, Lady Finlay of Llandaff and Lady Kidron, when they used, for the first time this afternoon, the phrase “zero tolerance”, and, at the same time, talked about a risk-based approach. I have, from my own local government experience, a lot of experience of risk-based approaches taken in relation to things—very different, of course, from the internet—such as food safety, where local authorities grade restaurants and food shops and take enforcement action and supervisory action according to their assessment of the risk that those premises present. That is partly to do with their assessment of the management and partly to do with their experience of things that have gone wrong in the past. If you have been found with mouse droppings and you have had to clean up the shop, then you will be examined a great deal more frequently until the enforcement officers are happy; whereas if you are always very well run, you will get an inspection visit maybe only once a year. That is what a risk-based assessment consists of. The important thing to say is that it does not produce zero tolerance or zero outcomes.

Baroness Kidron (CB)

I just want to make the point that I was talking about zero tolerance at the end of a ladder of tolerance, just to be clear. Letting a seven-year-old child into an 18-plus dating app or pornographic website is where the zero tolerance is—everything else is a ladder up to that.

Lord Moylan (Con)

I beg the noble Baroness’s pardon; I took that for granted. There are certain things—access to pornography, material encouraging self-harm and things of that sort—where one has to have zero tolerance, but not everything. I am sorry I took that for granted, so I fully accept that I should have made that more explicit in my remarks. Not everything is to be zero-toleranced, so to speak, but certain things are. However, that does not mean that they will not happen. One has to accept that there will be leakage around all this, just as some of the best-run restaurants that have been managed superbly for years will turn out, on occasion, to be the source of food poisoning. One has to accept that this is never going to be as tight as some of the advocates wanted, but with that, I hope I will be given leave to withdraw—

--- Later in debate ---
Lord Allan of Hallam (LD)

My Lords, it is risky to stand between people and their dinner, but I rise very briefly to welcome these amendments. We should celebrate the good stuff that happens in Committee as well as the challenging stuff. The risk assessments are, I think, the single most positive part of this legislation. Online platforms already do a lot of work trying to understand what risks are taking place on their platforms, which never sees the light of day except when it is leaked by a whistleblower and we then have a very imperfect debate around it.

The fact that platforms will have to do a formal risk assessment and share it with a third-party regulator is huge progress; it will create a very positive dynamic. The fact that the public will be able to see those risk assessments and make their own judgments about which services to use—according to how well they have done them—is, again, a massive public benefit. We should welcome the fact that risk assessments are there and the improvements that this group of amendments makes to them. I hope that was short enough.

Baroness Kidron (CB)

I also welcome these amendments, but I have two very brief questions for the Minister. First, in Amendment 27A, it seems that the child risk assessment is limited only to category 1 services and will be published only in the terms of service. As he probably knows, 98% of people do not read terms of service, so I wondered where else we might find this, or whether there is a better way of dealing with it.

My second question is to do with Amendments 64A and 88A. It seems to me—forgive me if I am wrong—that the Bill previously stipulated that all regulated search and user services had to make and keep a written record of any measure taken in compliance with a relevant duty, but now it seems to have rowed back to only category 1 and 2A services. I may be wrong on that, but I would like to check it for the record.

Lord Clement-Jones (LD)

My Lords, the noble Baroness, Lady Kidron, put her finger exactly on the two questions that I wanted to ask: namely, why only category 1 and category 2A, and is there some rowing back involved here? Of course, none of this prejudices the fact that, when we come later in Committee to talk about widening the ambit of risk assessments to material other than that which is specified in the Bill, this kind of transparency would be extremely useful. But the rationale for why it is only category 1 and category 2A in particular would be very useful to hear.

--- Later in debate ---
My concern is that, without also asserting a clear vision of what we want to happen as a result of passing this Bill, Parliament, and the people who use the social media and search resources of the internet, will have no framework by which to judge the success or otherwise of this important Bill. This amendment, as I said, is primarily declaratory, but I hope I have proved that it is also purposive. I think that the Government should look at it very carefully and hope that they will accept it. I beg to move.
Baroness Kidron (CB)

My Lords, I draw attention to my interests in the register, which I declared in full at Second Reading. It is an absolute pleasure to follow the noble Lord, Lord Stevenson, and, indeed, to have my name on this amendment, along with those of fellow members of the pre-legislative committee. It has been so long that it almost qualifies as a reunion tour.

This is a fortuitous amendment on which to start our deliberations, as it sets out the very purpose of the Bill—a North Star. I want to make three observations, each of which underlines its importance. First, as the pre-legislative committee took evidence, it was frequently remarked by both critics and supporters that it was a complicated Bill. We have had many technical briefings from DSIT and Ofcom, and they too refer to the Bill as “complicated”. As we took advice from colleagues in the other place, expert NGOs, the tech sector, academics and, in my own case, the 5Rights young advisory group, the word “complicated” repeatedly reared its head. This is a complex and ground-breaking area of policy, but there were other, simpler structures and approaches that have been discarded.

Over the five years with ever-changing leadership and political pressures, the Bill has ballooned with caveats and a series of very specific, and in some cases peculiar, clauses—so much so that today we start with a Bill that even those of us who are paying very close attention are often told we do not understand. That should make the House very nervous.

It is a complicated Bill with intersecting and dependent clauses—grey areas from which loopholes emerge—and it is probably a big win for the deepest pockets. The more complicated the Bill is, the more it becomes a bonanza for the legal profession. As the noble Lord, Lord Stevenson, suggests, the Minister is likely to argue that the contents of the amendment are already in the Bill, but the fact that the word “complicated” is firmly stuck to its reputation and structure is the very reason to set out its purpose at the outset, simply and unequivocally.

Secondly, the OSB is a framework Bill, with vast amounts of secondary legislation and a great deal of work to be implemented by the regulator. At a later date we will discuss whether the balance between the Executive, the regulator and Parliament is exactly as it should be, but as the Bill stands it envisages a very limited future role for Parliament. If I might borrow an analogy from my previous profession, Parliament’s role is little more than that of a background extra.

I have some experience of this. In my determination to follow all stages of the age-appropriate design code, I found myself earlier this week in the Public Gallery of the other place to hear DSIT Minister Paul Scully, at Second Reading of the Data Protection and Digital Information (No. 2) Bill, pledge to uphold the AADC and its provisions. I mention this in part to embed it on the record—that is true—but primarily to make this point: over six years, there have been two Information Commissioners and double figures of Secretaries of State and Ministers. There have been many moments at which the interpretation, status and purpose of the code has been put at risk, at least once to a degree that might have undermined it altogether. At these moments, each time the issue was resolved by establishing the intention of Parliament beyond doubt. Amendment 1 moves Parliament from background extra to star of the show. It puts the intention of Parliament front and centre for the days, weeks, months and years ahead in which the work will still be ongoing—and all of us will have moved on.

The Bill has been through a long and fractured process in which the pre-legislative committee had a unique role. Many attacks on the Bill have been made by people who have not read it. Child safety was incorrectly cast as the enemy of adult freedom. While some wanted to apply the existing and known concepts and terms of public interest, protecting the vulnerable, product safety and the established rights and freedoms of UK citizens, intense lobbying has seen them replaced by untested concepts and untried language over which the tech sector has once again emerged as judge and jury. This has further divided opinion.

In spite of all the controversy, when published, the recommendations of the committee report received almost universal support from all sides of the debate. So I ask the Minister not only to accept the committee’s view that the Bill needs a statement of purpose, the shadow of which will provide shelter for the Bill long into the future, but to undertake to look again at the committee report in full. In its pages lies a landing strip of agreement for many of the things that still divide us.

This is a sector that is 100% engineered and almost all privately owned, and within it lie solutions to some of the greatest problems of our age. It does not have to be as miserable, divisive and exploitative as this era of exceptionalism has allowed it to be. As the Minister is well aware, I have quite a lot to say about proposed new subsection (1)(b),

“to provide a higher level of protection for children than for adults”,

but today I ask the Minister to tell us which of these paragraphs (a) to (g) are not the purpose of the Bill and, if they are not, what is.

Lord Allan of Hallam (LD)

My Lords, I am pleased that we are starting our Committee debate on this amendment. It is a pleasure to follow the noble Lord, Lord Stevenson, and the noble Baroness, Lady Kidron.

In this Bill, as has already been said, we are building a new and complex system and we can learn some lessons from designing information systems more generally. There are three classic mistakes that you can make. First, you can build systems to fit particular tools. Secondly, you can overcommit beyond what you can actually achieve. Thirdly, there is feature creep, through which you keep adding things on as you develop a new system. A key defence against these mistakes is to invest up front in producing a really good statement of requirements, which I see in Amendment 1.

On the first risk, as we go through the debate, there is a genuine risk that we get bogged down in the details of specific measures that the regulator might or might not include in its rules and guidance, and that we lose sight of our goals. Developing a computer system around a particular tool—for example, building everything with Excel macros or with Salesforce—invariably ends in disaster. If we can agree on the goals in Amendment 1 and on what we are trying to achieve, that will provide a sound framework for our later debates as we try to consider the right regulatory technologies that will deliver those goals.

The second cardinal error is overcommitting and underdelivering. Again, it is very tempting when building a new system to promise the customer that it will be all-singing, all-dancing and can be delivered in the blink of an eye. Of course, the reality is that in many cases, things prove to be more complex than anticipated, and features sometimes have to be removed while timescales for delivering what is left are extended. A wise developer will instead aim to undercommit and overdeliver, promising to produce a core set of realistic functions and hoping that, if things go well, they will be able to add in some extra features that will delight the customer as an unexpected bonus.

This lesson is also highly relevant to the Bill, as there is a risk of giving the impression to the public that more can be done quicker than may in fact be possible. Again, Amendment 1 helps us to stay grounded in a realistic set of goals once we put those core systems in place. The fundamental and revolutionary change here is that we will be insisting that platforms carry out risk assessments and share them with a regulator, who will then look to them to implement actions to mitigate those risks. That is fundamental. We must not lose sight of that core function and get distracted by some of the bells and whistles that are interesting, but which may take the regulator’s attention away from its core work.

We also need to consider what we mean by “safe” in the context of the Bill and the internet. An analogy that I have used in this context, which may be helpful, is to consider how we regulate travel by car and aeroplane. Our goal for air travel is zero accidents, and we regulate everything down to the nth degree: from the steps we need to take as passengers, such as passing through security and presenting identity documents, to detailed and exacting safety rules for the planes and pilots. With car travel, we have a much higher degree of freedom, being able to jump in our private vehicles and go where we want, when we want, pretty much without restrictions. Our goal for car travel is to make it incrementally safer over time; we can look back and see how regulation has evolved to make vehicles, roads and drivers safer year on year, and it continues to do so. Crucially, we do not expect car travel to be 100% safe, and we accept that there is a cost to this freedom to travel that, sadly, affects thousands of people each year, including my own family and, I am sure, many others in the House. There are lots of things we could do to make car travel even safer that we do not put into regulation, because we accept that the cost of restricting freedom to travel is too high.

Without over-labouring this analogy, I ask that we keep it in mind as we move through Committee—whether we are asking Ofcom to implement a car-like regime whereby it is expected to make continual improvements year on year as the state of online safety evolves, or we are advocating an aeroplane-like regime whereby any instance of harm will be seen as a failure by the regulator. The language in Amendment 1 points more towards a regime of incremental improvements, which I believe is the right one. It is in the public interest: people want to be safer online, but they also want the freedom to use a wide range of internet services without excessive government restriction, and they accept some risk in doing so.

I hope that the Minister will respond positively to the intent of Amendment 1 and that we can explore in this debate whether there is broad consensus on what we hope the Bill will achieve and how we expect Ofcom to go about its work. If there is not, then we should flush that out now to avoid later creating confused or contradictory rules based on different understandings of the Bill’s purpose. I will keep arguing throughout our proceedings for us to remain focused on giving the right goals to Ofcom and allowing it considerable discretion over the specific tools it needs, and for us to be realistic in our aims so that we do not overcommit and underdeliver.

Finally, the question of feature creep is very much up to us. There will be a temptation to add things to the Bill as it goes through. Some of those things are essential; I know that the noble Baroness, Lady Kidron, has some measures that I would also support. This is the right time to do that, but there will be other things that are merely “nice to have”, and putting them in might detract from those core mechanisms. I hope we are able to maintain our discipline as we go through these proceedings to ensure we deliver the right objectives, which are incredibly well set out in Amendment 1, which I support.

--- Later in debate ---
Lord Knight of Weymouth (Lab)

I strongly support my noble friend in his amendment. I clarify that, in doing so, I am occupying a guest slot on the Front Bench: I do so as a member of his team but also as a member of the former Joint Committee. As my noble friend set out, this reflects where we got to in our thinking as a Joint Committee all that time ago. My noble friend said “at last”, and I echo that and what others said. I am grateful for the many briefings and conversations that we have had in the run-up to Committee, but it is good to finally be able to get on with it and start to clear some of these things out of my head, if nothing else.

In the end, as everyone has said, this is a highly complex Bill. Like the noble Baroness, Lady Stowell, in preparation for this I had another go at trying to read the blooming thing, and it is pretty much unreadable—it is very challenging. That is right at the heart of why I think this amendment is so important. Like the noble Baroness, Lady Kidron, I worry that this will be a bonanza for the legal profession, because it is almost impenetrable when you work your way through the wiring of the Bill. I am sure that, in trying to amend it, some of us will have made errors. We have been helped by the Public Bill Office, but we will have missed things and got things the wrong way around.

It is important to have something purposive, as the Joint Committee wanted, and to have clarity of intent for Ofcom, including that this is so much more about systems than about content. Unlike the noble Baroness, Lady Stowell—clearly, we all respect her work chairing the communications committee and the insights she brings to the House—I think that a very simple statement, restricting it just to proposed new paragraph (g), is not enough. It would almost be the same as the description at the beginning of the Bill, before Clause 1. We need to go beyond that to get the most from having a clear statement of how we want Ofcom to do its job and the Secretary of State to support Ofcom.

I like what the noble Lord, Lord Allan, said about the risk of overcommitting and underdelivering. When the right reverend Prelate the Bishop of Oxford talked about being the safest place in the world to go online, which is the claim that has been made about the Bill from the beginning, I was reminded again of the difficulty of overcommitting and underdelivering. The Bill is not perfect, and I do not believe that it will be when this Committee and this House have finished their work; we will need to keep coming back and legislating and regulating in this area as we pursue the goal of being the safest place in the world to go online—but it will not be any time soon.

I say to the noble Baroness, Lady Fox, whom I respect, that I understand what she is saying about some of her concerns about a risk-free child safety regime and the unintended consequences that may come with this legislation. But at its heart, what motivates us, and makes us believe that getting the Bill right is one of the most important things we will do in all our time in this Parliament, is the unintended consequence of the algorithms that these tech companies have created, pushing content at children that they do not want to hear. I see the noble Baroness, Lady Kidron, wanting to comment.

Baroness Kidron (CB)

I just want to say to the noble Baroness, Lady Fox, that we are not looking to mollycoddle children or put them in cotton wool; we are asking for a system where they are not systematically exploited by major companies.

Lord Knight of Weymouth (Lab)

I very much agree. The core of what I want to say in supporting this amendment is that in Committee we will do what we are here to do. There are a lot of amendments to what is a very long and complicated Bill: we will test the Minister and his team on what the Government are trying to achieve and whether they have things exactly right in order to give Ofcom the best possible chance to make it work. But when push comes to shove at the end of the process, at its heart we need to build trust in Ofcom and give it the flexibility to be able to respond to the changing online world and the changing threats to children and adults in that online world. To do that, we need to ensure that we have the right amount of transparency.

I was particularly pleased to see proposed new paragraph (g) in the amendment, on transparency, as referenced by the noble Baroness, Lady Stowell. It is important that we have independence for Ofcom; we will come to that later in Committee. It is important that Parliament has a better role in terms of accountability so that we can hold Ofcom to account, having given it trust and flexibility. I see this amendment as fundamental to that, because it sets the framework for the flexibility that we then might want to be able to give Ofcom over time. I argue that this is about transparency of purpose, and it is a fundamental addition to the Bill to make it the success that we want.

Baroness Kidron (CB)

My Lords, I declare my interests as chair of 5Rights Foundation and the Digital Futures Commission, my positions at Oxford and LSE and at the UN Broadband Commission and the Institute for Ethics in AI, as deputy chair of the APPG on digital regulation and as a member of the Joint Committee on this Bill.

As has already been mentioned, on Monday I hosted the saddest of events, at which Ian Russell and Merry Varney, the Russell family’s solicitor, showed parliamentarians images and posts that had been algorithmically recommended to Molly in the lead-up to her death. These were images so horrible that they cannot be shown in the media, so numerous that we could see only a fraction, and so full of despair and violence that many of the adult professionals involved in the inquest had to seek counselling. Yet in court, much of this material was defended by two tech companies as being suitable for a 14 year-old. Something has gone terribly wrong. The question is: is this Bill sufficient to fix it?

At the heart of our debates should not be content but the power of algorithms that shape our experiences online. Those algorithms could be designed for any number of purposes, including offering a less toxic digital environment, but they are instead fixed on ranking, nudging, promoting and amplifying anything to keep our attention, whatever the societal cost. It does not need to be like that. Nothing about the digital world is a given; it is 100% engineered and almost all privately owned; it can be designed for any outcome. Now is the time to end the era of tech exceptionality and to mandate a level of product safety so that the sector, just like any other sector, does not put its users at foreseeable risk of harm. As Meta’s corporate advertising adorning bus stops across the capital says:

“The metaverse may be virtual, but the impact will be real.”


I very much welcome the Bill, but there are still matters to discuss. The Government have chosen to take out many of the protections for adults, which raises questions about the value and practicality of what remains. In Committee, it will be important to understand how enforcement of a raft of new offences will be resourced and to question the oversight and efficacy of the remaining adult provisions. Relying primarily on companies to be author, judge and jury of their own terms of service may well be a race to the bottom.

I regret that Parliament has been denied the proper opportunity to determine what kind of online world we want for adults, which, I believe, we will regret as technology enters its next phase of intelligence and automation. However, my particular concern is the fate of children, whose well-being is collateral damage to a profitable business model. Changes to the Bill will mean that child safety duties are no longer an add-on to a generally safer world; they are now the first and only line of defence. I have given the Secretary of State sight of my amendments, and I inform the House that they are not probing amendments; they are necessary to fill the gaps and loopholes in the Bill as it now stands. In short, we need to ensure that child safety duties apply to all services likely to be accessed by children. We must ensure the quality control of all age-assurance systems. Age checking must not focus on a particular harm, but on the child; it needs to be secure, privacy-preserving and proportionate, and it must work. The children’s risk assessment and the list of harms must cover each of the four Cs: content harm, conduct harm, contact harm and commercial harm, such as the recommendation loops of violence and self-hatred that push thousands of children into states of misery. Those harms must be in the Bill.

Coroners and bereaved parents must have access to data relevant to the death of a child, to end the current inhumane arrangement whereby bereaved families facing the devastating loss of their child are forced to battle, unsuccessfully, with tech behemoths for years. I hope that the Minister will reiterate commitments made in the other place to close that loophole.

Children’s rights must be in the Bill. An unintended consequence of removing protections for adults is that children will now cost companies vastly more in developer time, content moderation and legal costs than adults. The digital world is the organising technology of our society, and children need to be online for their education and information, and to participate in civic society—they must not be kicked out.

I thank all those who have indicated their support, and the Secretary of State, the Minister and officials for the considerable time they have given me. However, I ask the Minister to listen very carefully to the mood of the House this evening; the matters I have raised are desperately urgent and long-promised, and must now be delivered unequivocally.

While millions of children suffer from the negative effects of the online world, some pay with their lives. I am a proud supporter of Bereaved Families for Online Safety, and I put on the record that we remember Molly, Frankie, Olly, Breck, Sophie and all the others who have lost their lives. I hope that the whole House will join me in not resting until we have a Bill fit for their memory.

Online Safety Bill

Monday 7th November 2022
Lords Chamber

Asked by Baroness Kidron

To ask His Majesty’s Government, in light of the Prevention of Future Deaths Report published at the conclusion of the Molly Russell inquest, what plans they have to bring forward the Online Safety Bill in sufficient time to ensure its passage during this parliamentary Session.

Baroness Kidron (CB)

My Lords, in begging leave to ask the Question of which I have given private notice, I declare my interests, particularly as founder and chair of 5Rights Foundation.

The Parliamentary Under-Secretary of State, Department for Digital, Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, the arrangement of parliamentary business is, as the noble Baroness will appreciate, a matter for business managers through the usual channels. However, the Bill remains a priority. The Secretary of State committed on 20 October to bringing it back to Parliament shortly. We will continue to work with noble Lords, Members in another place and others on the passage of this important legislation.

Baroness Kidron (CB)

I thank the Minister for that reply and am happy to see him back in his place. However, after four years of waiting, I am afraid his Answer was not quite good enough.

Coroner Walker’s landmark judgment that Molly Russell died after suffering the negative effects of online content, and his Prevention of Future Deaths Report, deserve to be met with action. That action should be finally bringing forward the Online Safety Bill. Molly Russell died five years ago, the same five years in which we have been working on the Online Safety Bill, in the absence of which children suffer an aggressive bombardment of material that valorises self-harm, body dysmorphia, violent porn and, of course, suicide—real harms to real children. Does the Minister agree that it is time to stop this suffering and commit to bringing the Bill to this House before the end of this month, which is the date by which we have been told we need it to ensure correct scrutiny and its passage in this Session?

Lord Parkinson of Whitley Bay (Con)

My Lords, this important legislation has indeed been a long time coming. I was a special adviser in the Home Office when it was first proposed and was in Downing Street when it was first put in the Conservative manifesto in 2017. Like the noble Baroness, I am very keen to see the Bill in your Lordships’ House so that it can be properly scrutinised and we can deliver the protections that we all want to see for children and vulnerable people. The noble Baroness is tireless in her defence of those people; she served excellently on the Joint Committee, which has already looked at the Bill. Like her, I am very keen to get it before your Lordships’ House so that we can continue that work.

Media Literacy

Monday 20th June 2022
Lords Chamber

Lord Parkinson of Whitley Bay (Con)

Through the Online Safety Bill, we are giving Ofcom strengthened media literacy functions on transparency reporting, information gathering and the other areas I set out. However, through its strategy announced in December last year, Ofcom has set out its own expanded work programme to discharge its existing duty, which includes pilots, campaigns to promote media literacy, establishing best practice and creating guidance on evaluation, so we are pleased to see that it is using and extending the powers that it has.

Baroness Kidron (CB)

My Lords, many digital literacy programmes are provided free of charge to schools by private companies, with an emphasis on teaching children about user behaviour rather than about the risks created by those very same companies. Given the lack of provision in the Bill, perhaps the Minister could say what plans Her Majesty’s Government have to ensure that schools are not simply marketing tech products but are offering children a holistic digital literacy education that is independent of those tech companies?

Lord Parkinson of Whitley Bay (Con)

Digital literacy is a key priority in the computing national curriculum in England, which equips people with knowledge, understanding and skills to use the internet creatively and purposefully. Through citizenship education and other subjects, as I mentioned, we are making sure that schoolchildren are equipped with the skills that they need, and of course the companies themselves have a role to play in delivering and funding media literacy education. We welcome the steps that platforms have already taken, but we believe that they can go further to empower and educate their users.

Digital Technology (Democracy and Digital Technologies Committee Report)

Friday 11th March 2022
Lords Chamber

Baroness Kidron (CB)

My Lords, I declare my interests as chair of 5Rights, a member of the Joint Committee and a member of the digital democracy inquiry. I too pay tribute to Lord Puttnam, our wonderful chairman. For the record, I give thanks for his many acts of kindness and the guidance he offered me as I started to grapple with the world of legislation and politics. He represents the best of this House.

In the life of tech, 20 months is a long time. In that time, we have seen Maria Ressa receive a Nobel Peace Prize for her efforts to protect free speech, battling hostile state actors and identifying Facebook’s chilling effect on journalism and the broader information ecosystem. Her words reverberate as we witness the information conflict and resulting violence raging across the globe:

“I don’t think we have wrapped our heads around how much technology has allowed the manipulation of individuals and democracies.”


From Frances Haugen, we heard that Facebook ignored its own research showing Instagram made teens anxious and depressed. To Congress, she said:

“I am here today because I believe Facebook’s products harm children, stoke division, and weaken our democracy.”


Of course, we also had the horror of Covid misinformation, the attacks on black footballers, and a 77% increase in self-generated abuse—all fuelled by algorithms supercharged to spread at any cost and whatever the harm to people or society.

Facebook, feeling the heat, went for rebranding rather than fixing the problem. It re-emerged as Meta, with a toe-curling video of two grown men playing in the office and a $10 billion-a-year plan to make the metaverse our new home—setting off a gold rush in which even McDonald’s filed for trademarks for virtual restaurants. Within weeks, we saw headlines that read

“the metaverse—already a home to sex predators”,

and

“Metaverse app allows kids into virtual strip clubs.”

My own personal low point was reading about a New York Times journalist who arrived in the metaverse to find that another avatar immediately groped her and then ejaculated in her face. Her pleas to stop were unheard until her abuser, satisfied, walked jauntily away.

Before I continue, I will make two things abundantly clear. First, Meta is not the only tech company at fault. Indeed, it is, by some measure, not the worst. However, it epitomises the culture of a sector that fails to protect its users and spends a fortune on lobbying to make certain that none of the rest of us does. Secondly, the metaverse is not, in and of itself, a problem. My own brief forays include an extraordinary whodunit adventure with a film noir aesthetic, and a fantastic training environment for social workers that allows them to rehearse how to spot signs of abuse or neglect. None the less, Meta has done us an extraordinary favour in showing us that we cannot slice and dice the digital world. The time for picking battles or offering partial protections has passed, because the technology and its associated risks are interconnected and constantly evolving. Laws must be about principles and product safety, with radical transparency and democratic accountability. In a connected world, the risk to the user is the weakest link.

Twenty months is a long time in the life of a child. I have stood here too many times telling the Minister and his multiple predecessors about the real-time costs to children’s bodies, mental health, life chances and, in tragic cases, lives. It is not too late to fast-track privacy-preserving age-assurance regulation. The daily harms experienced by children, while they wait, must surely be on the consciences of officials and Ministers. So too, the support desperately needed by bereaved parents—in their quest to get tech companies to hand over information which may save other children from a similar fate—cannot wait until 2025.

I believe the online safety Bill will be published on Tuesday, so I will not ask questions that cannot be answered here. But while every part of my being hopes that the Bill will reflect the recommendations of both the Democracy and Digital Technologies Committee and the Joint Committee to simplify, strengthen, future-proof and make enforceable the online safety regime, I fear that we will simply get a series of eye-catching additions to the draft Bill which will fail to make the systemic changes necessary.

Last night, I was speaking to a group of teenagers, one of whom said, “The digital world is our oyster, you should assume we are there until you can prove we are not.” Another simply said, “I don’t think it’s right that the tech companies can prey on us.” They know, we know and the Government know what this Bill must do. It is the job of this House and the other place to make sure that the online safety Bill is fit for the future and, in being so, reinstates the trust that Lord Puttnam so desperately wanted to see.

Children: Online Protection

Thursday 10th February 2022
Lords Chamber

Lord Parkinson of Whitley Bay (Con)

I am grateful for the noble Baroness’s support for the newer measures the Government announced this week. Of course, we will be responding in full to the work of the Joint Committee and the DCMS Select Committee in the other place. We have looked at the draft online safety Bill to respond to the further recommendations and suggestions they have made. However, we have not been inactive in the meantime. In June last year, for example, we published safety by design guidance and a one-stop shop on child online safety, which provided guidance on steps platforms can take to design safer services and protect children. Last July, we published our Online Media Literacy Strategy, which supports the empowerment of users. So we are taking steps, as well as introducing the Bill, which will be coming soon.

Baroness Kidron (CB)

My Lords, I also welcome the new commissioner, John Edwards, to his role, and congratulate the Government on this week’s announcement that the online safety Bill will include statutory guidance for privacy-preserving age assurance. Given that, to date, many of the eye-catching changes brought about by the age-appropriate design code, such as safe search and dismantling direct messaging by unknown adults to children, have been rolled out globally, are the Government working with the new commissioner to encourage the UK’s allies and trading partners to adopt the code in other jurisdictions to better enforce its provisions? Does he agree that regulatory alignment between the online safety Bill and the code is essential if we are to keep children safe?

Lord Parkinson of Whitley Bay (Con)

I am very grateful for the noble Baroness’s welcome for the new measures. There is agreement at an international level and within the UK that much more needs to be done to create a safer online environment for children, and the noble Baroness has played a significant part in fostering that agreement. The Information Commissioner has an international team responsible for engaging with data protection and information regulators all over the world. He is himself a former privacy commissioner in New Zealand, while his predecessor worked in this area in Canada, and I think that is to the great benefit of international dialogue. The international team works to ensure that the ICO’s regulatory and other priorities are appropriately reflected in international discussions. Through its work in organisations such as the OECD, the Council of Europe and the Global Privacy Assembly, the ICO also influences work on the interoperability of global data protection regimes.

Social Media: Deaths of Children

Thursday 20th January 2022
Grand Committee

Asked by Baroness Kidron

To ask Her Majesty’s Government what assessment they have made of the role played by social media in the deaths of children in the United Kingdom, including by suicide, self-harm and murder.

Baroness Kidron (CB)

I declare my interests, particularly as chair of 5Rights and as a member of the Joint Committee on the Draft Online Safety Bill.

My Lords, many of you will have read reports of how, in 2017, 14 year-old Molly Russell took her own life after being bombarded by self-harm and pro-suicide images. In the days after her death, her father Ian tried to access her phone simply to try to understand what had happened to his daughter. The notes from his diary from that time make for grim reading. The woman at the so-called genius bar in the Apple store “could not help”. The promised follow-up call failed to materialise—despite Ian sitting grief-struck, pen in hand, waiting at the appointed hour. Even after he finally found a person enabled to deal with him, they were only allowed to send a template information request form by email, which required a great deal of information from Ian but did not result in him receiving the information he requested. Apple has never helped Ian to access Molly’s phone, and without the assistance—indeed, the persistence—of the coroner and the police, the data it contained would not be available to Molly’s inquest, which is still investigating the contributory causes to Molly’s death four years later.

Judy and Andy Thomas struggled similarly after the suicide of their 15 year-old daughter Frankie, unable to get anything more than an automated response. Their letters to Instagram’s CEO Adam Mosseri, copied to the European headquarters, went unanswered. It was only after a year of desperate letter writing to anyone who might help that I was able to arrange a call on their behalf, only for them to hear that they were not going to get the information they wanted. During Frankie’s inquest, despite evidence that her suicide was highly influenced by what she had seen online, Wattpad refused to disclose full details of Frankie’s activity on its platform, even while confirming that self-harm and suicide stories on its site should be rated mature and should not have been accessible to a user registered as a child.

Olly Stephens, who was 13 when he was murdered, had repeated problems online. He was groomed by a wannabe county lines gang, extorted by a group who stole his bike and, finally, lured to a park where he was killed, the murder having been organised online. His father Stuart says that in the hours immediately after his murder, Olly’s mother and sister had to trawl through social media sites to get evidence because they were aware that they would never get it from the tech companies.

When a child dies, parents are asked to clear out the school locker: they inherit the artefacts of a child’s life. If the authorities have access to information that may shed light on the circumstances of their death, it is shared as a matter of course—but not if that information is online. The argument made by the tech sector is that it is protecting other users, but that does not account for parents’ need for closure and evidence necessary for police and coroners, and it conveniently obscures the role of the tech companies themselves as they continue to recommend harmful material and facilitate violent abuse to other children.

In the other place two days ago, Ian Paisley MP introduced a 10-minute rule Bill to grant next of kin the right to access a smartphone and other digital devices of a person upon their death or incapacity. He made the important point that much precious material, both sentimental and relevant to understanding what happened, is withheld from the next of kin simply because people—particularly the young—do not think to leave a password in their will. Indeed, it is unlikely any child would even have a will. He also pointed out that access is eminently possible: in the US, some states have brought in legislation, such as the Revised Uniform Fiduciary Access to Digital Assets Act, to retrieve financial assets. Once again, money trumps child safety.

The Joint Committee made two recommendations on this issue: that the Government should consult on how terms and conditions of online services can be reformed, by law, to give bereaved parents access to data; and that Ofcom, the ICO and the Chief Coroner should review the powers of coroners to ensure that they have unfettered access to digital data, including data recommended to children by tech companies, following the death of a child—and that both of those should happen before the Bill reaches Royal Assent.

I ask the Minister to put on record today that the draft Bill will be amended so that other families do not suffer as the Russell, Thomas and Stephens families have done. We cannot bring their children back, but we can create a lasting legacy for their extraordinary courage in speaking out.

The purpose of today’s debate is not only to secure justice for bereaved families, but to highlight steps that should be taken to prevent tragedy. Sitting on the Secretary of State’s desk is a comprehensive set of recommendations from the Joint Committee that would fundamentally change how the sector treats children. They are: mandatory safety by design to scale back harmful algorithms, design features and business practices; a binding child safety code that sets out risks and mitigations in accordance with the Convention on the Rights of the Child; alignment with the age-appropriate design code to make sure the Bill applies to all services likely to be accessed by children, so that there is nowhere to hide; mandatory cross-platform co-operation, so that risks known by one service are routinely shared with others; statutory codes for moderation and complaints, to ensure that swift action is taken before tragedy strikes; and a regulatory focus on risk rather than size. Again and again we see that small is not safe. I refer back to the content Frankie saw on Wattpad, a service that many of you will never have heard of.

There should also, of course, be the immediate introduction of age assurance, without which we will fail to deliver any of the protections that I have set out. This list is neither aspirational nor nice to have: these are essential and interdependent elements of a proportionate and enforceable regime to make our children safe. All other business sectors apply rules of product safety, and it is tragic that it has taken the death of children to give urgency to our calls for regulation.

TikTok, Meta, Apple and Alphabet are among the most valuable and profitable companies in the world, and the tech sector is now alone responsible for 25% of global GDP. But these same companies are algorithmically promoting and spreading material that nudges children into states of despair; priming kids into gambling habits with reward features that induce dopamine hits, which cause addiction; granting unfettered access to age-restricted spaces; fuelling an epidemic of eating disorders, self-harm and radicalisation; and systematically hiding the evidence. Even in a world focused on the balance sheet of loss and profit, children’s lives should not be the collateral damage of the tech sector. It is time to bring that to a halt—and halt it we can.

The Joint Committee recommendations have unprecedented support across the political spectrum, as they do across civil society. All that is required is for the Government to act. I ask the Minister, when he answers, to acknowledge that failure to have these things in place is costing children their lives—and I ask for a commitment to all the Joint Committee’s recommendations that relate to children. This is a time not for cherry-picking headline-grabbing changes, but rather for setting out an enforceable product safety regime that will keep our children safe.

Given the tech companies’ determined efforts to frustrate basic child safety requirements, I ask the Minister again to explain to the Committee how the Government can justify delaying the introduction of age assurance. They have failed to implement Part 3 of the Digital Economy Act and rejected my Private Member’s Bill for privacy-preserving age assurance, instead putting their faith in a voluntary scheme which their own officials estimate would take a minimum of two years and would do nothing to affect those who do not volunteer. This implicitly goes against statements made last week in the other place by the Minister for Digital that self-regulation has failed. If the Government acted today, Ofcom could set out expectations of age assurance by the end of the year, unleashing an arms race of innovation to meet those expectations. Failing to act means that more families will suffer heartbreak and more children harm.

In spite of my many years on this beat, Olly’s father Stuart shocked me to the core when he said that, since Olly’s death, he has received over 300 taunting and abusive messages via social media—images of people waving knives, celebrating Olly’s death and threatening his wife and daughter with rape, along with pictures identifying where they live. This sector does not have the authority or willingness to police itself. My deepest thanks go to those noble Lords who have chosen to speak; given our sad subject matter, I anticipate their words with trepidation.

Freedom of Speech

Friday 10th December 2021
Lords Chamber

Baroness Kidron (CB)

My Lords, as a young child I had an operation that meant I was unable to speak for a year. I arrived at secondary school, aged 11, complete with a horn to attract attention, like Harpo Marx, and attached to my waist a pen and paper on which I could ask or answer questions. It was embarrassing, alienating and occasionally hilarious, but from my silence I noticed who spoke, who listened and who got ignored. Even at that young age, I understood that having a voice requires the circumstances in which to be heard, as well as the freedom to speak.

I declare my interests as set out in the register, particularly as chair of 5Rights Foundation and a member of the Joint Committee on the Draft Online Safety Bill. I am really sorry if I disappoint noble Lords by failing to offer spoilers; I am far too frightened of the chair, Damian Collins MP. I hope my words speak to the urgency of our recommendations.

Earlier this year, the Center for Countering Digital Hate found that just 12 accounts were responsible for 65% of Covid misinformation across 800,000 posts subsequently seen by 59 million people. In 2016, a Facebook internal review found that 64% of people who joined an extremist group did so only because the company’s algorithm recommended it. This year, thousands of people, including many children, have undertaken TikTok challenges that have resulted in hospitalisations, fires, dangerous driving and the death of a 10 year-old girl from accidental asphyxiation. After the 6 January assault on the Capitol, Twitter removed 70,000 accounts known for sharing QAnon content and thereby reduced the amount of QAnon content on its platform by 70% to 80%; had it done so a little earlier, it might have changed those very same events. The penalty shoot-out that ended England’s best European Championship performance in decades meant that the young men who should have been national heroes were instead subjected to sustained abuse.

In each case, speech that might in other contexts be ill-informed, frustrating, foolish or full-on hateful—but totally manageable—was supercharged and spread to epidemic proportions online. As it spreads, it mutates: disappointment turns to rage, uncertainty and suspicion; difference turns into dispute; the marginal turns into the mainstream; and the digital turns into injury and death.

I am neither a technophobe nor a tech pessimist. On the contrary, it is still possible to do anything. The digital world is synthetic, entirely human-engineered and eye-wateringly well resourced; it can set its sights on any outcome. But it is optimised for three holy grails: growth, engagement and time spent. This simply means keeping as many people as possible online, engaging as often and for as long as possible. This engagement drives the value and revenue of a sector now responsible for 25% of the world’s GDP, and it has made giants of those who, often in the name of freedom of speech, have built personal fortunes by controlling what we see, read and hear from the relative safety of Silicon Valley.

When she gave evidence to the Joint Committee, Frances Haugen, the Facebook whistleblower, said that

“engagement-based ranking does two things. One, it prioritises and amplifies divisive polarising and extreme content and, two, it concentrates it … It does not matter if you are on the left or on the right … Anger and hate is the easiest way to grow on Facebook … The … system is biased towards bad actors and biased towards those who push people to the extremes.”

Those extremes become our new normal, in which children who look for exercise videos end up with material that valorises eating disorders, in which Covid misinformation is more prevalent than advice from the WHO and in which whole peoples are set against each other in tribal or religious conflict, such as those in Ethiopia and Myanmar, in both of which Facebook has played a role. Perhaps most ironic of all, they create a new normal in which girls, women, people of colour, minorities and the oppressed can be silenced by algorithmically fuelled abuse and hate in the name of other people’s freedom to speak but, perhaps more truthfully, in the company’s freedom to monetise and whip up difference.

It is frequently said that the digital world offers great opportunities but brings terrible harms. This framing is a false binary. A car with no brakes is not an opportunity, and neither is a supermarket with a poisonous product hidden on every other shelf. They are, respectively, a case for product recall and shutting up shop. But both, with some judicious redesign, would be rather useful.

Not all the harms of the world can be attributed to one sector, however powerful, but our freedoms are being exploited by a system that allows any amount of algorithmic distortion but holds no liability. Our discourse is undermined by the monetisation of engagement, and children are being denied a childhood for profit. Three weeks ago, in this Chamber, I set the Government a series of challenges that they have yet to answer. In short—and, believe me, it was not in short on that occasion—I asked why the Government did not act immediately to introduce privacy-preserving age assurance online to give children the protections that they so desperately need. And I say it here again. Children have the right to participate, speak and assemble online, but they also have a right to protection from violent and sexual commercial exploitation.

This is not the theoretical plaything of a debating club that pits freedoms against protections but rather a matter of life and death. This is not about undermining our freedom but about finding our voice. I do not wish to be standing here in the new year reporting to the Minister more compelling evidence or a new tragedy for which his department will bear some responsibility. I am grateful to the most reverend Primate for bringing this debate forward, and I urge the House, as we go forward to the online safety Bill, the data Bill and multiple trade Bills, not to sacrifice our freedoms or those of our children on the altar of Silicon Valley. Instead, I urge that we find our voice and, with it, the circumstances in which others can both speak and be heard.