All 22 Baroness Harding of Winscombe contributions to the Online Safety Act 2023

Wed 1st Feb 2023
Wed 19th Apr 2023: Online Safety Bill, Lords Chamber (Committee stage)
Tue 25th Apr 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 1)
Tue 25th Apr 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 2)
Thu 27th Apr 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 1)
Thu 27th Apr 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 2)
Tue 2nd May 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 1)
Tue 2nd May 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 2)
Tue 9th May 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 1)
Tue 9th May 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 2)
Tue 16th May 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 1)
Tue 23rd May 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 1)
Thu 25th May 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 1)
Thu 22nd Jun 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 1)
Thu 22nd Jun 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 2)
Thu 6th Jul 2023: Online Safety Bill, Lords Chamber (Report stage: Part 1 & Minutes of Proceedings)
Thu 6th Jul 2023: Online Safety Bill, Lords Chamber (Report stage: Part 2)
Mon 10th Jul 2023: Online Safety Bill, Lords Chamber (Report stage: Part 1)
Mon 10th Jul 2023: Online Safety Bill, Lords Chamber (Report stage: Part 2)
Wed 12th Jul 2023
Mon 17th Jul 2023
Wed 19th Jul 2023

Online Safety Bill

Baroness Harding of Winscombe Excerpts
Baroness Harding of Winscombe (Con)

My Lords, it is an enormous privilege to follow so many powerful speeches. My second daughter was born in the year Facebook launched in the UK and Apple sold its first iPhone. Today she is 15; she has lived her whole life in a digitally enabled world. She has undoubtedly benefited from the great things that digital technology brings, but, throughout that life, she has had no meaningful legal protection from its harms.

A number of noble Lords have referenced the extraordinarily moving and disturbing briefing that Ian Russell and his lawyer, Merry Varney, gave us on Monday. When I went home from that briefing, first, I hugged my two teenage girls really close, and then I talked to them about it. My 15 year-old daughter said, “Mum, of course, I know about Molly Russell and all the awful content there is on social media. Didn’t you realise? When are all you adults going to realise what’s going on and do something about it?” The Bill is important, because it is the beginning of us doing something about it.

It is also a huge Bill, so we need to be careful not to let perfect be the enemy of the good. Like other noble Lords, I urge this House to focus on the critical areas where we can improve this already much debated and discussed Bill and try to resist the temptation to attach so many baubles to it that it no longer delivers on its core purpose of protecting our children online. So, like others, I will focus my remarks on three structural changes that I hope will help make the Bill more effective at driving the positive changes that, I think, everyone in this House intends: first, the consequences for senior managers of not complying with the legislation; secondly, how compliance is defined and by whom; and, finally, which services are included.

To change digital platforms and services to protect children is not impossible—but it is hard, and it will not happen by itself. Tech business models are simply too driven by other things; development road maps are always too contested with revenue-raising projects, and competition for clicks is just too intense. So we need to ask ourselves whether the incentives in the Bill to drive compliance are strong enough to counter the very strong incentives not to.

It is clear that self-regulation will not work, and relying on corporate fines is also not enough. We have learned in other safety-critical industries and sectors that have needed dramatic culture change, such as financial services, that fines alone do not drive change. However, once you name an individual as responsible for something, with serious consequences if they fail, change happens. I look forward to the government amendment that I hope will clearly set out the consequences for named senior managers who do not deliver on their overall online safety responsibilities.

The second area I highlight is how compliance is defined. Specifically, the powers that the Bill grants the Secretary of State to amend Ofcom’s proposed code of conduct are far too wide. Just as with senior tech managers, the political incentives not to focus on safety are too strong. Almost every Minister I have ever met is keen to support tech sector growth. Giving the Secretary of State the ability to change codes of conduct for economic reasons is asking them to trade off economic growth against children’s safety—the same trade-off that tech companies have failed to make over the last 15 years. That is not right, it is not fair on the Ministers themselves, and it will not deliver the child protections we are looking for.

The third area I will cover—I will be very brief—has been highlighted by the noble Baroness, Lady Kidron. It is important that we capture all the services that are accessed by children. If not, we risk creating a dangerous false sense of security. Specifically, I am worried about why app stores are not covered. In the physical world—I say this as an erstwhile retailer—retailers have long come to terms with the responsibilities they bear for ensuring that they do not sell age-restricted products to children. Why are we shying away from the same thing in the digital world?

There are many other things I would support, not least the amendments proposed by the noble Baroness, Lady Kidron. I finish by simply saying that the most important thing is that the Bill is here. We need to do this work—our children and grandchildren have waited far too long.

Online Safety Bill

Baroness Harding of Winscombe Excerpts
The Lord Bishop of Oxford

My Lords, it is a pleasure to follow other noble Lords who have spoken. I too support this key first amendment. Clarity of purpose is essential in any endeavour. The amendment overall sets out the Bill’s aims and enhances what will be vital legislation for the world, I hope, as well as for the United Kingdom. The Government have the very welcome ambition of making Britain the safest country in the world to go online. The OSB is a giant step in that direction.

As has been said, there has been remarkable consensus across the Committee on what further measures may still be needed to improve the Bill and on this first amendment, setting out these seven key purposes. Noble Lords may be aware that in the Christian tradition the number seven is significant: in the medieval period the Church taught the dangers of the seven deadly sins, the merits of the seven virtues and the seven acts of mercy. Please speak to me later if a refresher course is needed.

Amendment 1 identifies seven deadly dangers—I think they are really deadly. They are key risks which we all acknowledge are unwelcome and destructive companions of the new technologies which bring so many benefits: risks to public health or national security; the risk of serious harm to children; the risk of new developments and technologies not currently in scope; the disproportionate risk to those who manifest one or more protected characteristics; risks that occur through poor design; risks to freedom of expression and privacy; and risks that come with low transparency and low accountability. Safety and security are surely one of the primary duties of government, especially the safety and security of children and the vulnerable. There is much that is good and helpful in new technology but much that can be oppressive and destructive. These seven risks are real and present dangers. The Bill is needed because of actual and devastating harm caused to people and communities.

As we have heard, we are living through a period of rapid acceleration in the development of AI. Two days ago, CBS broadcast a remarkable documentary on the latest breakthroughs by Google and Microsoft. The legislation we craft in these weeks needs future-proofing. That can happen only through a clear articulation of purpose so that the framework provided by the Bill continues to evolve under the stewardship of the Secretary of State and of Ofcom.

I have been in dialogue over the past five years with tech companies in a variety of contexts and I have seen a variety of approaches, from the highly responsible in some companies to the frankly cavalier. Good practice, especially in design, needs stronger regulation to become uniform. I really enjoyed the analogy from the noble Lord, Lord Allan, a few minutes ago. We would not tolerate for a moment design and safety standards in aeroplanes, cars or washing machines which had the capacity to cause harm to people, least of all to children. We should not tolerate lesser standards in our algorithms and technologies.

There is no map for the future of technology and its use, even over the rest of this decade, but this amendment provides a compass—a fixed point for navigation in the future, for which future generations will thank this Government and this House. These seven deadly dangers need to be stated clearly in the Bill and, as the noble Baroness, Lady Kidron, said, to be a North Star for both the Secretary of State and Ofcom. I support the amendment.

Baroness Harding of Winscombe (Con)

My Lords, I too support this amendment. I was at a dinner last night in the City for a group of tech founders and investors—about 500 people in a big hotel ballroom, all focused on driving the sort of positive technology growth in this country that I think everyone wants to see. The guest speaker runs a large UK tech business. He commented in his speech that tech companies need to engage with government because—he said this as if it was a revelation—all Governments turned out not to speak with one voice and that understanding what was required of tech companies by Governments is not always easy. Business needs clarity, and anyone who has run a large or small business knows that it is not really the clarity in the detail that matters but the clarity of purpose that enables you to lead change, because then your people understand why they need to change, and if they understand why, then in each of the micro-decisions they take each day they can adjust those decisions to fit with the intent behind your purpose. That is why this amendment is so important.

I have worked in this space of online safety for more than a decade, both as a technology leader and in this House. I genuinely do not believe that business is wicked and evil, but what it lacks is clear direction. The Bill is so important in setting those guardrails that if we do not make its purpose clear, we should not be surprised if the very businesses which really do want Governments to be clear do not know what we intend.

I suspect that my noble friend the Minister might object to this amendment and say that it is already in the Bill. As others have already said, I actually hope it is. If it is not, we have a different problem. The point of an upfront summary of purpose is to do precisely that: to summarise what is in what a number of noble Lords have already said is a very complicated Bill. The easier and clearer we can make it for every stakeholder to engage in the Bill, the better. If alternatively my noble friend the Minister objects to the detailed wording of this amendment, I argue that that simply makes getting this amendment right even more important. If the four noble Lords, who know far more about this subject than I will ever do in a lifetime, and the joint scrutiny committee, which has done such an outstanding job at working through this, have got the purposes of the Bill wrong, then what hope for the rest of us, let alone those business leaders trying to interpret what the Government want?

That is why it is so important that we put the purposes of the Bill absolutely at the front of the Bill, as in this amendment. If we have misunderstood that in the wording, I urge my noble friend the Minister to come back with wording on Report that truly encapsulates what the Government want.

Baroness Fox of Buckley (Non-Afl)

My Lords, I welcome this opportunity to clarify the purposes of the Bill, but I am not sure that the amendment helps as my North Star. Like the Bill, it throws up as many questions as answers, and I found myself reading it and thinking “What does that word mean?”, so I am not sure that clarity was where I ended up.

It is not a matter of semantics, but in some ways you could say—and certainly this is how it is publicly understood—that the name of the Bill, the Online Safety Bill, gives it its chief purpose. Yet however well-intentioned, and whatever the press releases say or the headlines print, even a word such as “safety” is slippery, because safety as an end can be problematic in a free society. My worry about the Bill is unintended consequences, and that is not rectified by the amendment. As the Bill assumes safety as the ultimate goal, we as legislators face a dilemma. We have the responsibility of weighing up the balance between safety and freedom, but the scales in the Bill are well and truly weighted towards safety at the expense of freedom before we start, and I am again not convinced the amendment weights them back again.

Of course, freedom is a risky business, and I always like the opportunity to quote Karl Marx, who said:

“You cannot pluck the rose without its thorns!”


However, it is important to recognise that “freedom” is not a dirty word, and we should avoid saying that risk-free safety is more important than freedom. How would that conversation go with the Ukrainian people who risk their safety daily for freedom? Also, even the language of safety, or indeed what constitutes the harms that the Bill and the amendments promise to keep the public safe from, need to be considered in the cultural and social context of the norms of 2023. A new therapeutic ethos now posits safety in ever-expanding pseudo-psychological and subjective terms, and this can be a serious threat to free speech. We know that some activists often exploit that concept of safety to claim harm when they merely encounter views they disagree with. The language of safety and harm is regularly used to cancel and censor opponents—and the Government know that, so much so that they considered it necessary to introduce the Higher Education (Freedom of Speech) Bill to secure academic freedom against an escalating grievance culture that feigns harm.

Part of the triple shield is a safety duty to remove illegal content, and the amendment talks about speech within the law. That sounds unobjectionable—in my mind it is far better than “legal but harmful”, which has gone—but, while illegality might sound clear and obvious, in some circumstances it is not always clear. That is especially true in any legal limitations of speech. We all know about the debates around hate speech, for example. These things are contentious offline and even the police, in particular the College of Policing, seem to find the concept of that kind of illegality confusing and, at the moment, are in a dispute with the Home Secretary over just that.

Is it really appropriate that this Bill enlists and mandates private social media companies to judge criminality using the incredibly low bar of “reasonable grounds to infer”? It gets even murkier when the legal standard for permissible speech online will be set partly by compelling platforms to remove content that contravenes their terms and conditions, even if these terms of service restrict speech far more than domestic UK law does. Big tech is being incited to censor whatever content it wishes as long as it fits in with its Ts & Cs. Between this and determining, for example, what is in filters—a whole different issue—one huge irony here, which challenges one of the purposes of the Bill, is that despite the Government and many of us thinking that this legislation will de-fang and regulate big tech’s powers, actually the legislation could inadvertently give those same corporates more control of what UK citizens read and view.

Another related irony is that the Bill was, no doubt, designed with Facebook, YouTube, Twitter, Google, TikTok and WhatsApp in mind. However, as the Bill’s own impact assessment notes, 80% of impacted entities have fewer than 10 employees. Many sites, from Wikipedia to Mumsnet, are non-profit or empower their own users to make moderation or policy decisions. These sites, and tens of thousands of British businesses of varying sizes, perhaps unintentionally, now face an extraordinary amount of regulatory red tape. These onerous duties and requirements might be actionable if not desirable for larger platforms, but for smaller ones with limited compliance budgets they could prove a significant if not fatal burden. I do not think that is the purpose of the Bill, but it could be an unintended outcome. This also means that regulation could, inadvertently, act as a barrier to entry for new SMEs, creating an ever more monopolistic stronghold for big tech, at the expense of trialling innovations or allowing start-ups to emerge.

I want to finish with the thorny issue of child protection. I have said from the beginning—I mean over the many years since the Bill’s inception—that I would have been much happier if it was more narrowly titled as the Children’s Online Safety Bill, to indicate that protecting children was its sole purpose. That in itself would have been very challenging. Of course, I totally agree with Amendment 1’s intention

“to provide a higher level of protection for children than for adults”.

That is how we treat children and adults offline.

Online Safety Bill

Baroness Harding of Winscombe Excerpts
Baroness Kidron (CB)

My Lords, I refer the Committee to my interests as put in the register and declared in full at Second Reading. I will speak to Amendment 2 in my name and those of the right reverend Prelate the Bishop of Oxford and the noble Baroness, Lady Harding, to Amendments 3 and 5 in my name, and briefly to Amendments 19, 22, 298 and 299 in the name of the noble Baroness, Lady Harding.

The digital world does not have boundaries in the way that the Bill does. It is an ecosystem of services and products that are interdependent. A user journey is made up of incremental signals, nudges and enticements that mean that, when we use our devices, very often we do not end up where we intended to start. The current scope covers user-to-user, search and commercial porn services, but a blog or website that valorises self-harm and depression or suggests starving yourself to death is still exempt because it has limited functionality. So too are games without a user-to-user function, in spite of the known harm associated with game addiction highlighted recently by Professor Henrietta Bowden-Jones, national expert adviser on gambling harms, and the World Health Organization in 2019 when it designated gaming disorder as a behavioural addiction.

There is also an open question about immersive technologies, whose protocols are still very much in flux. I am concerned that the Government are willing to assert that these environments will meet the bar of user-to-user when those that are still building immersive environments make quite clear that that is not a given. Indeed, later in Committee I will be able to demonstrate that already the very worst harms are happening in environments that are not clearly covered by the Bill.

Another unintended consequence of the current drafting is that the task of working out whether you are on a regulated or unregulated service is left entirely to children. That is not what we had been promised. In December the Secretary of State wrote in a public letter to parents,

“I want to reassure every person reading this letter that the onus for keeping young people safe online will sit squarely on the tech companies’ shoulders”.

It is likely that the Minister will suggest that the limited-functionality services will be caught by the gatekeepers. But, as in the case of immersive technology, it is dangerous to suggest that, just because search and user-to-user are the primary access points in 2023, that will remain the case. We must be more forward thinking and ensure that services likely to be accessed that promote harm are in scope by default.

Amendments 3 and 5 are consequential, so I will not debate them now. I have listened to the Government and come back with a reasonable and implementable amendment that applies only to services that are likely to be accessed by children and that enable harm. I now ask the Government to listen and do likewise.

Amendments 92 and 193 cover the child user condition. The phrase “likely to be accessed”, introduced in this House into what became the Data Protection Act 2018, is one of the most unlikely successful British exports. Both the phrase and its definition, set out by the ICO, have been embedded in regulations in countries the world over—yet the Bill replaces this established language while significantly watering down the definition.

The Bill requires

“a significant number of children”

to use the service, or for the service to be

“likely to attract a significant number of users who are children”.

“Significant” in the Bill is defined relative to the overall UK user base, which means that extremely large platforms could deem a few thousand child users not significant compared with the several million-strong user base. Since only services that cross this threshold need comply with the child safety duties, thousands of children will not benefit from the safety duties that the Minister told us last week were at the heart of the Bill.

Amendment 92 would put the ICO’s existing and much-copied definition into the Bill. It says a service is

“likely to be accessed by children”

if

“the service is designed or intended for use by children … children form a substantive and identifiable user group … the possibility of a child accessing the service is more probable than not, taking into consideration … the nature and content of the service and whether that has particular appeal for children … the way in which the service is accessed and any measures in place to prevent children gaining access … market research, current evidence on user behaviour, the user base of similar or existing services”

that are likely to be accessed.

Having two phrases and definitions is bad for business and even worse for regulators. The ICO has first-mover advantage and a more robust test. It is my contention that parents, media and perhaps even our own colleagues would be very shocked to know that the definition in the Bill has the potential for many thousands, and possibly tens of thousands, of children to be left without the protections that the Bill brings forward. Perhaps the Minister could explain why the Government have not chosen regulatory alignment, which is good practice.

Finally, I will speak briefly in support of Amendments 19, 22, 298 and 299. I am certain that the noble Baroness, Lady Harding, will spell out how the app stores of Google and Apple are simply a subset of “search”, in that they are gatekeepers to accessing more than 5 million apps worldwide and the first page of each is indeed a search function. Their inclusion should be obvious, but I will add a specific issue about which I have spoken directly with both companies and about which the 5Rights Foundation, of which I am chair, has written to the ICO.

When we looked at the age ratings of apps across Google Play Store and Apple, four things emerged. First, apps are routinely rated much lower than their terms and conditions: for example, Amazon Shopping says 18 but has an age rating of 4 on Apple. This pattern goes across both platforms, covering social sites, gaming, shopping, et cetera.

Secondly, the same apps and services did not have the same age rating across both services, which, between them, are gatekeepers for more than 95% of the app market. In one extreme case, an app rated four on one of them was rated 16 on the other, with other significant anomalies being extremely frequent.

Thirdly, almost none of the apps considered their data protection duties in coming to a decision on their age rating, which is a problem, since privacy and safety are inextricably linked.

Finally, in the case of Apple, using a device registered to a 15 year-old, we were able to download age-restricted apps including a dozen or more 18-plus dating sites. In fairness, I give a shoutout to Google, which, because of the age-appropriate design code, chose more than a year ago not to show 18-plus content to children in its Play Store. So this is indeed a political and business choice and not a question of technology. Millions of services are accessed via the App Store. Given the Government’s position—that gatekeepers have specific responsibilities in relation to harmful content and activity—surely the amendments in the name of the noble Baroness, Lady Harding, are necessary.

My preference was for a less complicated Bill based on principles and judged on outcomes. I understand that that ship has sailed, but it is not acceptable for the Government now to use the length and complexity of the Bill as a reason not to accept amendments that would fill loopholes where harm has been proven. It is time to deliver on the promises made to parents and children, and to put the onus for keeping young people safe online squarely on tech companies’ shoulders. I beg to move.

Baroness Harding of Winscombe (Con)

My Lords, I rise to speak to Amendments 19, 22, 298 and 299 in my name and those of the noble Baroness, Lady Stowell, and the noble Lords, Lord Knight and Lord Clement-Jones. I will also briefly add at the end of my speech my support for the amendments in the name of my friend, the noble Baroness, Lady Kidron. It has been a huge privilege to be her support act all the way from the beginnings of the age-appropriate design code; it feels comfortable to speak after her.

I want briefly to set out what my amendments would do. Their purpose is to bring app stores into the child protection elements of the Bill. Amendment 19 would require app stores to prepare

“risk assessments equal to user-to-user services due to their role in distributing online content through apps to children and as a primary facilitator of user-to-user”

services reaching children. Amendment 22 would mandate app stores

“to use proportionate and proactive measures, such as age assurance, to prevent children”

coming into contact with

“primary priority content that is harmful to children”.

Amendments 298 and 299 would simply define “app” and “app stores”.

Let us be clear what app stores do. They enable customers to buy apps and user-to-user services. They enable customers to download free apps. They offer up curated content in the app store itself and decide what apps someone would like to see. They enable customers to search for apps for user-to-user content. They provide age ratings; as the noble Baroness, Lady Kidron, said, they may be different age ratings in different app stores for the same app. They sometimes block the download of apps based on the age rating and their assessment of someone’s age, but not always, and it is different for different app stores.

Why should they be included in this Bill—if it is not obvious from what I have already said? First, two companies are profiting from selling user-to-user products to children. Two app stores account for some 98%-plus of all downloads of user-to-user services, with no requirements to assess the risk of selling those products to children or to mitigate those risks. We do not allow that in the physical world so we should not allow it in the digital world.

Secondly, parents and teenagers tell us that this measure would help. A number of different studies have been done; I will reference just two. One was by FOSI, the Family Online Safety Institute, which conducted an international research project in which parents consistently said that having age assurance at the app store level would make things simpler and more effective for them; ironically, the FOSI research was conducted with Google.

--- Later in debate ---
Lord Allan of Hallam (LD)

The noble Lord makes a good point. I certainly think we are heading into a world where there will be more regulation of app stores. Google and Apple are commercial competitors with some of the people who are present in their stores. A lot of the people in their stores are in dispute with them over things such as the fees that they have to pay. It is precisely for that reason that I do not think we should be throwing online safety into the mix.

There is a role for regulating app stores, which primarily focuses on these commercial considerations and their position in the market. There may be something to be done around age-rating; the noble Baroness made a very good point about how age-rating works in app stores. However, if we look at the range of responsibilities that we are describing in this Bill and the tools that we are giving to intermediaries, we see that they are the wrong, or inappropriate, set of tools.

Baroness Harding of Winscombe (Con)

Would the noble Lord acknowledge that app stores are already undertaking these age-rating and blocking decisions? Google has unilaterally decided that, if it assesses that you are under 18, it will not serve up over-18 apps. My concern is that this is already happening but it is happening indiscriminately. How would the noble Lord address that?

Lord Allan of Hallam (LD)

The noble Baroness makes a very good point; they are making efforts. There is a role for app stores to play but I hope she would accept that it is qualitatively different from that played by a search engine or a user-to-user service. If we were to decide, in both instances, that we want app stores to have a greater role in online safety and a framework that allows us to look at blogs and other forms of content, we should go ahead and do that. All I am arguing is that we have a Bill that is carefully constructed around two particular concepts, a user-to-user service and a search engine, and I am not sure it will stretch that far.

--- Later in debate ---
Lord Vaizey of Didcot (Con)

My Lords, as my name is on Amendment 9, I speak to support these amendments and say that they are worthy of debate. As your Lordships know, I am extremely supportive of the Bill and hope that it will be passed in short order. It is much needed and long overdue that we have legislation providing us with a regulator that is able to hold platforms to account, protect users where it can and enhance child safety online. I can think of no better regulator for that role than Ofcom.

I have listened to the debate with great interest. Although I support the intentions of my noble friend Lord Moylan’s amendment, I am not sure I agree with him that there are two cultures in this House, as far as the Bill is concerned; I think everybody is concerned about child safety. However, these amendments are right to draw attention to the huge regulatory burden that this legislation can potentially bring, and to the inadvertent bad consequences it will bring for many of the sites that we all depend upon and use.

I have not signed many amendments that have been tabled in this Committee because I have grown increasingly concerned, as has been said by many others, that the Bill has become a bit like the proverbial Christmas tree where everyone hangs their own specific concern on to the legislation, turning it into something increasingly unwieldy and difficult to navigate. I thought the noble Baroness, Lady Fox, put it extremely well when she effectively brought to life what it would be like to run a small website and have to comply with this legislation. That is not to say that certain elements of micro-tweaking are not welcome—for example, the amendment by the noble Baroness, Lady Kidron, on giving coroners access to data—but we should be concerned about the scope of the Bill and the burden that it may well put on individual websites.

This is in effect the Wikipedia amendment, put forward and written in a sort of wiki way by this House—a probing amendment in Committee to explore how we can find the right balance between giving Ofcom the powers it needs to hold platforms to account and not unduly burdening websites that all of us agree present a very low risk and whose provenance, if you like, does not fit easily within the scope of the Bill.

I keep saying that I disagree with my noble friend Lord Moylan. I do not—I think he is one of the finest Members of this House—but, while it is our job to provide legislation to set the framework for how Ofcom regulates, we in this House should also recognise that in the real world, as I have also said before, this legislation is simply going to be the end of the beginning. Ofcom will have to find its way forward in how it exercises the powers that Parliament gives it, and I suspect it will have its own list of priorities in how it approaches these issues, who it decides to hold to account and who it decides to enforce against. A lot of its powers will rest not simply on the legislation that we give it but on the relationship that it builds with the platforms it is seeking to regulate.

For example, I have hosted a number of lunches for Google in this House with interested Peers, and it has been interesting to get that company’s insight into its working relationship with Ofcom. By the way, I am by no means suggesting that that is a cosy relationship, but it is at least a relationship where the two sides are talking to each other, and that is how the effectiveness of these powers will be explored.

I urge noble Lords to take these amendments seriously and take what the spirit of the amendments is seeking to put forward, which is to be mindful of the regulatory burden that the Bill imposes; to be aware that the Bill will not, simply by being passed, solve the kinds of issues that we are seeking to tackle in terms of the most egregious content that we find on the internet; and that, effectively, Ofcom’s task once this legislation is passed will be the language of priorities.

Baroness Harding of Winscombe (Con)

My Lords, this is not the first time in this Committee, and I suspect it will not be the last, when I rise to stand somewhere between my noble friend Lord Vaizey and the noble Baroness, Lady Kidron. I am very taken by her focus on risk assessments and by the passionate defences of Wikipedia that we have heard, which really are grounded in a sort of commoner’s risk assessment that we can all understand.

Although I have sympathy with the concerns of the noble Baroness, Lady Fox, about small and medium-sized businesses being overburdened by regulation, I am less taken with the amendments on that subject precisely because small tech businesses become big tech businesses extremely quickly. It is worth pointing out that TikTok did not even exist when Parliament began debating this Bill. I wonder what our social media landscape would have been like if the Bill had existed in law before social media started. We as a country should want global tech companies to be born in the UK, but we want their founders—who, sadly, even today, are predominantly young white men who do not yet have children—to think carefully about the risks inherent in the services they are creating, and we know we need to do that at the beginning of those tech companies’ journeys, not once they have reached 1 million users a month.

While I have sympathy with the desire of the noble Baroness, Lady Fox, not to overburden, just as my noble friend Lord Vaizey has said, we should take our lead from the intervention of the noble Baroness, Lady Kidron: we need a risk assessment even for small and medium-sized businesses. It just needs to be a risk assessment that is fit for their size.

--- Later in debate ---
Lord Moylan (Con)

Everything the noble Baroness has said is absolutely right, and I completely agree with her. The point I simply want to make is that no form of risk-based assessment will achieve a zero-tolerance outcome, but—

Baroness Harding of Winscombe (Con)

I am so sorry, but may I offer just one final thought from the health sector? While the noble Lord is right that where there are human beings there will be error, there is a concept in health of the “never event”—that when that error occurs, we should not tolerate it, and we should expect the people involved in creating that error to do a deep inspection and review to understand how it occurred, because it is considered intolerable. I think the same exists in the digital world in a risk assessment framework, and it would be a mistake to ignore it.

Lord Moylan (Con)

My Lords, I am now going to attempt for the third time to beg the House’s leave to withdraw my amendment. I hope for the sake of us all, our dinner and the dinner break business, for which I see people assembling, that I will be granted that leave.

Online Safety Bill

Baroness Harding of Winscombe Excerpts
Baroness Benjamin (LD)

My Lords, I add my support for all the amendments in this group. I thank the noble Baroness, Lady Ritchie, for bringing the need for the consistent regulation of pornographic content to your Lordships’ attention. Last week, I spoke about my concerns about pornography; I will not repeat them here. I said then that the Bill does not go far enough on pornography, partly because of the inconsistent regulation regimes between Part 3 services and Part 5 ones.

In February, the All-Party Parliamentary Group on Commercial Sexual Exploitation made a series of recommendations on the regulation of pornography. Its first recommendation was this:

“Make the regulation of pornography consistent across different online platforms, and between the online and offline spheres”.


It went on to say:

“The reforms currently contained in the Online Safety Bill not only fail to remedy this, they introduce further inconsistencies in how different online platforms hosting pornography are regulated”.


This is our opportunity to get it right but we are falling short. The amendments in the name of the noble Baroness, Lady Ritchie, go to the heart of the issue by ensuring that the duties that currently apply to Part 5 services will also apply to Part 3 services.

Debates about how these duties should be amended or implemented will be dealt with later on in our deliberations; I look forward to coming back to them in detail then. Today, the question is whether we are willing to have inconsistent regulation of pornographic content across the services that come into the scope of the Bill. I am quite sure that, if we asked the public in an opinion poll whether this was the outcome they expected from the Bill, they would say no.

An academic paper published in 2021 reported on the online viewing of 16 and 17 year-olds. It said that pornography was much more frequently viewed on social media, showing that the importance of the regulation of such sites remains. The impact of pornography is no different whether it is seen on a social media or pornography site with user-to-user facilities that fall within Part 3 or on a site that has only provider content that would fall within Part 5. There should not be an either/or approach to different services providing the same content, which is why I think that Amendment 125A is critical. If all pornographic content is covered by Part 5, what does and does not constitute user-generated material ceases to be our concern. Amendment 125A highlights this issue; I too look forward to hearing the Minister’s response.

There is no logic to having different regulatory approaches in the same Bill. They need to be the same and come into effect at the same time. That is the simple premise of these amendments; I fully support them.

Baroness Harding of Winscombe (Con)

My Lords, earlier today the noble Baroness, Lady Benjamin, referred to a group of us as kindred spirits. I suggest that all of us contributing to this debate are kindred spirits in our desire to see consistent outcomes. All of us would like to see a world where our children never see pornography on any digital platform, regardless of what type of service it is. At the risk of incurring the ire of my noble friend Lord Moylan, we should have zero tolerance for children seeing and accessing pornography.

I agree with the desire to be consistent, as the noble Baroness, Lady Ritchie, and the noble Lord, Lord Browne, said, but it is consistency in outcomes that we should focus on. I am very taken with the point made by the noble Lord, Lord Allan, that we must be very careful about the unintended consequences of a consistent regulatory approach that might end up with inconsistent outcomes.

When we get to it later—I am not sure when—I want to see a regulatory regime that is more like the one reflected in the amendments tabled by the noble Baroness, Lady Kidron, and my noble friend Lord Bethell. We need in the Bill a very clear definition of what age assurance and age verification are. We must be specific on the timing of introducing the regulatory constraints on pornography. We have all waited far too long for that to happen and that must be in the Bill.

I am nervous of these amendments that we are debating now because I fear other unintended consequences. Not only does this not incentivise general providers, as the noble Lord, Lord Allan, described them, to remove porn from their sites but I fear that it incentivises them to remove children from their sites. That is the real issue with Twitter. Twitter has very few child users; I do not want to live in a world where our children are removed from general internet services because we have not put hard age gates on the pornographic content within them but instead encouraged those services to put an age gate on the front door. Just as the noble Lord, Lord Allan, said earlier today, I fear that, with all the best intentions, the desire to have consistent outcomes and these current amendments would regulate the high street rather than the porn itself.

Lord Clement-Jones (LD)

My Lords, there is absolutely no doubt that across the Committee we all have the same intent; how we get there is the issue between us. It is probably about the construction of the Bill, rather than the duties that we are imposing.

It is a pleasure again to follow the noble Baroness, Lady Harding. If you take what my noble friend Lord Allan said about a graduated response and consistent outcomes, you then get effective regulation.

I thought that the noble Baroness, Lady Kidron, had it right. If we passed her amendments in the second group, and included the words “likely to be accessed”, Clause 11 would bite and we would find that there was consistency of outcomes for primary priority content and so on, and we would then find ourselves in much the same space. However, it depends on the primary purpose. The fear that we have is this. I would not want to see a Part 5 service that adds user-generated content then falling outside Part 5 and finding itself under Part 3, with a different set of duties.

I do not see a huge difference between Part 3 and Part 5, and it will be very interesting when we come to debate the later amendments tabled by the noble Lord, Lord Bethell, and the noble Baroness, Lady Kidron. Again, why do we not group these things together to have a sensible debate? We seem to be chunking-up things in a different way and so will have to come back to this and repeat some of what we have said. However, I look forward to the debate on those amendments, which may be a much more effective way of dealing with this than trying to marry Part 5 and Part 3.

I understand entirely the motives of the noble Baroness, Lady Ritchie, and that we want to ensure that we capture this. However, it must be the appropriate way of regulating and the appropriate way of capturing it. I like the language about consistent outcomes without unintended consequences.

Online Safety Bill

Baroness Harding of Winscombe Excerpts
Baroness Kidron (CB)

My Lords, I agree in part with the noble Lord, Lord Moylan. I was the person who said that small was not safe, and I still feel that. I certainly do not think that anything in the Bill will make the world online 100% safe, and I think that very few noble Lords do, so it is important to say that. When we talk about creating a high bar or having zero tolerance, we are talking about ensuring that there is a ladder within the Bill so that the most extreme cases have the greatest force of law trying to attack them. I agree with the noble Lord on that.

I also absolutely agree with the noble Lord about implementation: if it is too complex and difficult, it will be unused and exploited in certain ways, and it will have a bad reputation. The only part of his amendment that I do not agree with is that we should look at size. Through the process of Committee, if we can look at risk rather than size, we will get somewhere. I share his impatience—or his inquiry—about what categories 2A and 2B mean. If category 2A means the most risky and category 2B means those that are less risky, I am with him all the way. We need to look into the definition of what they mean.

Finally, I mentioned several times on Tuesday that we need to look carefully at Ofcom’s risk profiles. Is this the answer to dealing with where risk gets determined, rather than size?

Baroness Harding of Winscombe (Con)

My Lords, I rise to speak along similar lines to the noble Baroness, Lady Kidron. I will address my noble friend Lord Moylan’s comments. I share his concern that we must not make the perfect the enemy of the good but, like the noble Baroness, I do not think that size is the key issue here, because of how tech businesses grow. Tech businesses are rather like building a skyscraper: if you get the foundations wrong, it is almost impossible to change how safe the building is as it goes up and up. As I said earlier this week, small tech businesses can become big very quickly, and, if you design your small tech business with the risks to children in mind at the very beginning, there is a much greater chance that your skyscraper will not wobble as it gets taller. On the other hand, if your small business begins by not taking children into account at all, it is almost impossible to address the problem once it is huge. I fear that this is the problem we face with today’s social media companies.

The noble Baroness, Lady Kidron, hit the nail on the head, as she so often does, in saying that we need to think about risk, rather than size, as the means of differentiating the proportionate response. In Clause 23, which my noble friend seeks to amend, the important phrase is “use proportionate measures” in subsection (2). Provided that we start with a risk assessment and companies are then under the obligation to make proportionate adjustments, that is how you build safe technology companies—it is just like how you build safe buildings.

Lord Bethell (Con)

My Lords, I will build on my noble friend’s comments. We have what I call the Andrew Tate problem. That famous pornographer and disreputable character started a business in a shed in Romania with a dozen employees. By most people’s assessment, it would have been considered a small business but, through his content of pornography and the physical assault of women, he extremely quickly built something that served an estimated 3 billion pages, and it has had a huge impact on the children of the English-speaking world. A small business became a big, nasty business very quickly. That anecdote reinforces the point that small does not mean safe, and, although I agree with many of my noble friend’s points, the lens of size is perhaps not the right one to look through.

--- Later in debate ---
Lord Moylan (Con)

My Lords, I am grateful to all noble Lords who have spoken in this debate. I hope that the noble Baroness, Lady Deech, and the noble Lord, Lord Weir of Ballyholme, will forgive me if I do not comment on the amendment they spoke to in the name of my noble friend Lord Pickles, except to say that of course they made their case very well.

I will briefly comment on the remarks of the noble Baroness, Lady Kidron. I am glad to see a degree of common ground among us in terms of definitions and so forth—a small piece of common ground that we could perhaps expand in the course of the many days we are going to be locked up together in your Lordships’ House.

I am grateful too to the noble Lord, Lord Allan of Hallam. I am less clear on “2B or not 2B”, if that is the correct way of referring to this conundrum, than I was before. The noble Baroness, Lady Kidron, said that size does not matter and that it is all about risk, but my noble friend the Minister cunningly conflated the two and said at various points “the largest” and “the riskiest”. I do not see why the largest are necessarily the riskiest. On the whole, if I go to Marks & Spencer as opposed to going to a corner shop, I might expect rather less risk. I do not see why the two run together.

I address the question of size in my amendment because that is what the Bill focuses on. I gather that the noble Baroness, Lady Kidron, may want to explore at some stage in Committee why that is the case and whether a risk threshold might be better than a size threshold. If she does that, I will be very interested in following and maybe even contributing to that debate. However, at the moment, I do not think that any of us is terribly satisfied with conflating the two—that is the least satisfactory way of explaining and justifying the structure of the Bill.

On the remarks of my noble friend Lady Harding of Winscombe, I do not want in the slightest to sound as if there is any significant disagreement between us—but there is. She suggested that I was opening the way to businesses building business models “not taking children into account at all”. My amendment is much more modest than that. There are two ways of dealing with harm in any aspect of life. One is to wait for it to arrive and then to address it as it arises; the other is constantly to look out for it in advance and to try to prevent it arising. The amendment would leave fully in place the obligation to remove harm, which is priority illegal content or other illegal content, that the provider knows about, having been alerted to it by another person or become aware of it in any other way. That duty would remain. The duty that is removed, especially from small businesses—and really this is quite important—is the obligation constantly to be looking out for harm, because it involves a very large, and I suggest possibly ruinous, commitment to constant monitoring of what appears on a search engine. That is potentially prohibitive, and it arises in other contexts in the Bill as well.

There should be occasions when we can say that knowing that harmful stuff will be removed as soon as it appears, or very quickly afterwards, is adequate for our purposes, without requiring firms to go through a constant monitoring or risk-assessment process. The risk assessment would have to be adjudicated by Ofcom, I gather. Even if no risk was found, of course, that would not be the end of the matter, because I am sure that Ofcom would, very sensibly, require an annual renewal of that application, or after a certain period, to make sure that things had not changed. So even to escape the burden is quite a large burden for small businesses, and then to implement the burden is so onerous that it could be ruinous, whereas taking stuff down when it appears is much easier to do.

Baroness Harding of Winscombe (Con)

Perhaps I might briefly come in. My noble friend Lord Moylan may have helped explain why we disagree: our definition of harm is very different. I am most concerned that we address the cumulative harms that online services, both user-to-user services and search, are capable of inflicting. That requires us to focus on the design of the service, which we need to do at the beginning, rather than the simpler harm that my noble friend is addressing, which is specific harmful content—not in the sense in which “content” is used in the Bill but “content” as in common parlance; that is, a piece of visual or audio content. My noble friend makes the valid point that that is the simplest way to focus on removing specific pieces of video or text; I am more concerned that we should not exclude small businesses from designing and developing their services such that they do not consider the broader set of harms that are possible and that add up to the cumulative harm that we see our children suffering from today.

So I think our reason for disagreement is that we are focusing on a different harm, rather than that we violently disagree. I agree with my noble friend that I do not want complex bureaucratic processes imposed on small businesses; they need to design their services when they are small, which makes it simpler and easier for them to monitor harm as they grow, rather than waiting until they have grown. That is because the backwards re-engineering of a technology stack is nigh-on impossible.

Lord Moylan (Con)

My noble friend makes a very interesting point, and there is much to ponder in it—too much to ponder for me to respond to it immediately. Since I am confident that the issue is going to arise again during our sitting in Committee, I shall allow myself the time to reflect on it and come back later.

While I understand my noble friend’s concern about children, the clause that I propose to remove is not specific to children; it relates to individuals, so it covers adults as well. I think I understand what my noble friend is trying to achieve—I shall reflect on it—but this Bill and the clauses we are discussing are a very blunt way of going at it and probably need more refinement even than the amendments we have seen tabled so far. But that is for her to consider.

I think this debate has been very valuable. I did not mention it, but I am grateful also for the contribution from the noble Baroness, Lady Merron. I beg leave to withdraw the amendment.

Online Safety Bill

Baroness Harding of Winscombe Excerpts
Baroness Harding of Winscombe (Con)

My Lords, I support this group of amendments, so ably introduced by my noble friend and other noble Lords this afternoon.

I am not a lawyer and I would not say that I am particularly experienced in this business of legislating. I found this issue incredibly confusing. I hugely appreciate the briefings and discussions—I feel very privileged to have been included in them—with my noble friend the Minister, officials and the Secretary of State herself in their attempt to explain to a group of us why these amendments are not necessary. I was so determined to try to understand this properly that, yesterday, when I was due to travel to Surrey, I took all my papers with me. I got on the train at Waterloo and started to work my way through the main challenges that officials had presented.

The first challenge was that, fundamentally, these amendments cut across the Bill's definitions of "primary priority content" and "priority content". I tried to find those definitions in the Bill. Clause 54 does contain a definition of primary priority content, but it says, basically, that primary priority content is whatever the Secretary of State says it is, and that content that is harmful to children is primary priority content. So I was none the wiser after reading Clause 54.

One of the further challenges that officials have given us is that apparently we, as a group of noble Lords, were confusing the difference between harm and risk. I then turned to Clause 205, which comes out with the priceless statement that a risk of harm should be read as a reference to harm—so maybe they are the same thing. I am still none the wiser.

Yesterday morning, I found myself playing what I can only describe as a parliamentary game of Mornington Crescent, as I went round and round in circles. Unfortunately, it was such a confusing game of Mornington Crescent that I forgot that I needed to change trains, ended up in Richmond instead of Redhill, and missed my meeting entirely. I am telling the Committee this story because, as the debate has shown, it is so important that we put in the Bill a definition of the harms that we are intending to legislate for.

I want to address the points made by the noble Baroness, Lady Fox. She said that we might not all agree on what is genuinely harmful for children. That is precisely why Parliament needs to decide this, rather than abdicate that responsibility to a regulator which, as other noble Lords said earlier today, is then put into a political space. It is the job of Parliament to decide what is dangerous for our children and what is not. That is the approach that we take in the physical world, and it should be the approach that we take in the online world. We should do that in broad categories, which is why the four Cs is such a powerful framework. I know that we are all attempting to predict the known unknowns, which is impossible, but this framework, which gives categories of harm, can clearly be updated, developed and, as my noble friend Lord Bethell said, properly consulted on. We as parliamentarians should decide; that is the purpose of voting in Parliament.

I have a couple of questions for my noble friend the Minister. Does he agree that Parliament needs to decide what the categories of online harms are that the Bill is attempting to protect our children from? If he does, why is it not the four Cs? If he really thinks it is not the four Cs, will he bring back an alternative schedule of harms?

Lord Allan of Hallam (LD)

My Lords, I will echo the sentiments of the noble Baroness, Lady Harding, in my contribution to another very useful debate, which has brought to mind the good debate that we had on the first day in Committee, in response to the amendment tabled by the noble Lord, Lord Stevenson, in which we were seeking to get into the Bill what we are actually trying to do.

I thought that the noble Baroness, Lady Fox, was also welcoming additional clarity, specifically in the area of psychological harm, which I agree with. Certainly in its earlier incarnations, the Bill was scattered throughout with such references, some of which have been removed, but those that remain are very much open to interpretation. I hope that we will come back to that.

I was struck by the point made by the noble Lord, Lord Russell, around what took place in that coroner’s hearing. You had two different platforms with different interpretations of what they thought that their duty of care would be. That is very much the point. In my experience, platforms will follow what they are told to follow. The challenge is when each of them comes to their own individual view around what are often complex areas. There we saw platforms presenting different views about their risk assessments. If we clarify that for them through amendments such as these, we are doing everyone a favour.

I again compliment my noble friend Lady Benjamin for her work in this area. Her speech was also a model of clarity. If we can bring some of that clarity to the legislation and to explaining what we want, that will be an enormous service.

The noble Lord, Lord Knight, made some interesting points around how this would add value to the Bill, teasing out some of the specific gaps that we have there. I look forward to hearing the response on that.

I was interested in the comments from the noble Lord, Lord Bethell, on mobile phone penetration. We should all hold in common that we are not going back to a time BC—before connection. Our children will be connected, which creates the imperative for us to get this right. There has perhaps been a tendency for us to bury our heads in the sand, and occasionally you hear that still—it is almost as if we would wish this world away. However, the noble Baroness, Lady Kidron, is at the other end of the spectrum; she has come alive on this subject, precisely because she recognises that that will not happen. We are in a world where our children will be connected, so it is on us to figure out how we want those connections to work and to instruct the people who provide those connective services on what they should do. It is certainly not for us to imagine that somehow they will all go away. We will come to that in later groups when we talk about minimum ages; if younger children are online, there is a real issue around how we are going to deal with that.

The right reverend Prelate the Bishop of Oxford highlighted some really important challenges based on real experiences that families today are suffering—let us use the word as it should be—and made the case for clarity. I do not know how much we are allowed to talk in praise of EU legislation, but I am looking at the Digital Services Act—I have looked at a lot of EU legislation—and this Bill, and there is a certain clarity to EU regulation, particularly the process of adding recitals, which are attached to the law and explain what it is meant to do. That is sometimes missing here. I know that there are different legal traditions, but you can sometimes look at an EU regulation and the UK law and the former appears to be much clearer in its intent.

That brings me to the substance of my comments in response to this group, so ably introduced by the noble Baroness, Lady Kidron. I hope that the Government heed and recognise that, at present, no ordinary person can know what is happening in the Bill—other than, perhaps, the wife of the noble Lord, Lord Stevenson, who will read it for fun—and what we intend to do.

I was thinking back to the “2B or not 2B” debate we had earlier about the lack of clarity around something even as simple as the classification of services. I was also thinking that, if you ask what the Online Safety Bill does to restrict self-harm content, the answer would be this: if it is a small social media platform, it will probably be categorised as a 2B service, then we can look at Schedule 7, where it is prohibited from assisting suicide, but we might want to come back to some of the earlier clauses with the specific duties—and it will go on and on. As the noble Baroness, Lady Harding, described, you are leaping backwards and forwards in the Bill to try to understand what we are trying to do with the legislation. I think that is a genuine problem.

In effect, the Bill is Parliament setting out the terms of service for how we want Ofcom to regulate online services. We debated terms of service earlier. What is sauce for the goose is sauce for the gander. We are currently failing our own tests of simplicity and clarity on the terms of service that we will give to Ofcom.

As well as platforms, if ordinary people want to find out what is happening, then, just like those platforms with the terms of service, we are going to make them read hundreds of pages before they find out what this legislation is intended to do. We can and should make this simpler for children and parents. I was able to meet Ian Russell briefly at the end of our Second Reading debate. He has been an incredibly powerful and pragmatic voice on this. He is asking for reasonable things. I would love to be able to give a Bill to Ian Russell, and the other families that the right reverend Prelate the Bishop of Oxford referred to, that they can read and that tells them very clearly how Parliament has responded to their concerns. I think we are a long way short of that simple clarity today.

It would be extraordinarily important for service providers, as I already mentioned in response to the noble Lord, Lord Russell. They need that clarity, and we want to make sure that they have no reason to say, “I did not understand what I was being asked to do”. That should be from the biggest to the smallest, as the noble Lord, Lord Moylan, keeps rightly raising with us. Any small service provider should be able to very clearly and simply understand what we are intending to do, and putting more text into the Bill that does that would actually improve it. This is not about adding a whole load of new complications and the bells and whistles we have described but about providing clarity on our intention. Small service providers would benefit from that clarity.

The noble Baroness, Lady Ritchie, rightly raised the issue of the speed of the development of technology. Again, we do not want the small service provider in particular to think it has to go back and do a whole new legal review every time the technology changes. If we have a clear set of principles, it is much quicker and simpler for it to say, “I have developed a new feature. How does it match up against this list?”, rather than having to go to Clause 12, Clause 86, Clause 94 and backwards and forwards within the Bill.

It will be extraordinarily helpful for enforcement bodies such as Ofcom to have a yardstick—again, this takes us back to our debate on the first day—for its prioritisation, because it will have to prioritise. It will not be able to do everything, everywhere, all at once. If we put that prioritisation into the legislation, it will, frankly, save potential arguments between Parliament, the Government and Ofcom later on, when they have decided to prioritise X and we wanted them to prioritise Y. Let us all get aligned on what we are asking them to do up front.

Dare I say—the noble Baroness, Lady Harding, reminded me of this—that it may also be extraordinarily helpful for us as politicians so that we can understand the state of the law. I mean not just the people who are existing specialists or are becoming specialists in this area and taking part in this debate but the other hundreds of Members of both Houses, because this is interesting to everyone. I have experience of being in the other place, and every Member of the other place will have constituents coming to them, often with very tragic circumstances, and asking what Parliament has done. Again, if they have the Online Safety Bill as currently drafted, I think it is hard for any Member of Parliament to be able to say clearly, “This is what we have done”. With those words and that encouraging wind, I hope the Government are able to explain, if not in this way, that they have a commitment to ensuring that we have that clarity for everybody involved in this process.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

I am grateful to all noble Lords who have spoken on this group and for the clarity with which the noble Lord, Lord Stevenson, has concluded his remarks.

Amendments 20, 74, 93 and 123, tabled by the noble Baroness, Lady Kidron, would mean a significant revision of the Bill's approach to content that is harmful to children. They would set a new schedule of harmful content and risk to children—the four Cs—on the face of the Bill and revise the criteria for user-to-user and search services carrying out child safety risk assessments.

I start by thanking the noble Baroness publicly—I have done so privately in our discussions—for her extensive engagement with the Government on these issues over recent weeks, along with my noble friends Lord Bethell and Lady Harding of Winscombe. I apologise that it has involved the noble Baroness, Lady Harding, missing her stop on the train. A previous discussion we had also very nearly delayed her mounting a horse, so I can tell your Lordships how she has devoted hours to this—as they all have over recent weeks. I would like to acknowledge their campaigning and the work of all organisations that the noble Baroness, Lady Kidron, listed at the start of her speech, as well as the families of people such as Olly Stephens and the many others that the right reverend Prelate the Bishop of Oxford mentioned.

I also reassure your Lordships that, in developing this legislation, the Government carried out extensive research and engagement with a wide range of interested parties. That included reviewing international best practice, including the four Cs framework on the online risks of harm to children; we want this to be world-leading legislation. The Government share the objectives that all noble Lords have echoed in making sure that children are protected from harm online. I was grateful to the noble Baroness, Lady Benjamin, for echoing the remarks I made earlier in Committee on this. I am glad we are on the same page, even if we are still looking at points of detail, as we should be.

As the noble Baroness, Lady Kidron, knows, it is the Government’s considered opinion that the Bill’s provisions already deliver these objectives. I know that she remains to be convinced, but I am grateful to her for our continuing discussions on that point, and for continuing to kick the tyres on this to make sure that this is indeed legislation of which we can be proud.

It is also clear that there is broad agreement across the House that the Bill should tackle harm to children such as content that promotes eating disorders, illegal behaviour such as grooming, and risk factors for harm such as the method by which content is disseminated and the frequency of alerts. I am pleased to be able to put on record that the Bill as drafted already does this in the Government's opinion, and reflects the principles of the four Cs framework, covering each of those: content, conduct, contact and commercial or contract risks to children.

First, it is important to understand how the Bill defines content, because that question of definition has been a confusing factor in some of the discussions hitherto. When we talk in general terms about content, we mean the substance of a message. This has been the source of some confusion. The Bill defines “content”, for the purposes of this legislation, in Clause 207 extremely broadly as

“anything communicated by means of an internet service”.

Under this definition, in essence, all user communication and activity, including recommendations by an algorithm, interactions in the metaverse, live streams, and so on, is facilitated by "content". So, for example, unwanted and inappropriate contact from an adult to a child would be treated by the Bill as a content harm. The distinctions that the four Cs make between content, conduct and contact risks are therefore not necessary. For the purposes of the Bill, they are all content risks.

Secondly, I know that there have been concerns about whether the specific risks highlighted in the new schedule will be addressed by the Bill.

Baroness Harding of Winscombe (Con)

Where are the commercial harms? I cannot totally get my head around my noble friend’s definition of content. I can sort of understand how it extends to conduct and contact, but it does not sound as though it could extend to the algorithm itself that is driving the addictive behaviour that most of us are most worried about.

Lord Knight of Weymouth (Lab)

In that vein, will the noble Lord clarify whether that definition of content does not include paid-for content?

Online Safety Bill

Baroness Kidron (CB)

It is a great pleasure to follow my noble friend Lord Russell and to thank him for his good wishes. I assure the Committee that there is nowhere I would rather spend my birthday, in spite of some competitive offers. I remind noble Lords of my interests in the register, particularly as the chair of 5Rights Foundation.

As my noble friend has set out, these amendments fall in three places: the risk assessments, the safety duties and the codes of practice. However, together they work on the overarching theme of safety by design. I will restrict my detailed remarks to a number of amendments in the first two categories. This is perhaps a good moment to recall the initial work of Carnegie, which provided the conceptual approach of the Bill several years ago in arguing for a duty of care. The Bill has gone many rounds since then, but I think the principle remains that a regulated service should consider its impact on users before it causes them harm. Safety by design, to which all the amendments in this group refer, is an embodiment of a duty of care. In thinking about these amendments as a group, I remind the Committee that both the proportionality provisions and the fact that this is a systems and processes Bill mean that no company can, should or will be penalised for a single piece of content, a single piece of design or, indeed, low-level infringements.

Amendments 24, 31, 77 and 84 would delete “content” from the Government’s description of what is harmful to children, meaning that the duty is to consider harm in the round rather than just harmful content. The definition of “content” is drawn broadly in Clause 207 as

“anything communicated by means of an internet service”,

but the examples in the Bill, including

“written material … music and data of any description”,

once again fail to include design features that are so often the key drivers of harm to children.

On day three of Committee, the Minister said:

“The Bill will address cumulative risk where it is the result of a combination of high-risk functionality, such as live streaming, or rewards in service … This will initially be identified through Ofcom’s sector risk assessments, and Ofcom’s risk profiles and risk assessment guidance will reflect where a combination of risk in functionalities such as these can drive up the risk of harm to children. Service providers will have to take Ofcom’s risk profiles into account in their own risk assessments for content which is illegal or harmful to children”.—[Official Report, 27/4/23; col. 1385.]


However, in looking at the child safety duties, Clause 11(5) says:

“The duties … in subsections (2) and (3) apply across all areas of a service, including the way it is designed, operated and used”,


but subsection (14) says:

“The duties set out in subsections (3) and (6)”—


which are the duties to operate proportionate systems and processes to prevent and protect children from encountering harmful content and to include them in terms of service—

“are to be taken to extend only to content that is harmful to children where the risk of harm is presented by the nature of the content (rather than the fact of its dissemination)”.

I hesitate to say whether that is contradictory. I am not actually sure, but it is confusing. I am concerned that while we are reassured that “content” means content and activity and that the risk assessment considers functionality, “harm” is then repeatedly expressed only in the form of content.

Over the weekend, I had an email exchange with the renowned psychoanalyst and author, Norman Doidge, whose work on the plasticity of the brain profoundly changed how we think about addiction and compulsion. In the exchange, he said that

“children’s exposures to super doses, of supernormal images and scenes, leaves an imprint that can hijack development”.

Then, he said that

“the direction seems to be that AI would be working out the irresistible image or scenario, and target people with these images, as they target advertising”.

His argument is that it is not just the image but the dissemination and tailoring of that image that maximises the impact. The volume and frequency of those images create habits in children that take a lifetime to change—if they change at all. Amendments 32 and 85 would remove this language to ensure that content that is harmful by virtue of its dissemination is accounted for.

I turn now to Amendments 28 and 82, which cut the reference to the

“size and capacity of the provider of the service”

in deeming what measures are proportionate. We have already discussed that small is not safe. Platforms such as Yubo, Clapper and Discord have all been found to harm children and, as both the noble Baroness, Lady Harding, and the noble Lord, Lord Clement-Jones, told us, small can become big very quickly. It is far easier to build to a set of rules than it is to retrofit them after the event. Again, I point out that Ofcom already has duties of proportionality; adding size and capacity is unnecessary and may tip the scale towards creating loopholes for smaller services.

Amendment 138 seeks to reverse the exemption in Clause 54 of financial harms. More than half of the 100 top-grossing mobile phone apps contain loot boxes, which are well established as unfair and unhealthy, priming young children to gamble and leading to immediate hardship for parents landed with extraordinary bills.

By rights, Amendments 291 and 292 could fit in the future-proof set of amendments. The way that the Bill in Clause 204 separates out functionalities in terms of search and user-to-user is in direct opposition to the direction of travel in the tech sector. TikTok does shopping, Instagram does video, Amazon does search; autocomplete is an issue across the full gamut of services, and so on and so forth. These amendments simply combine the lists of functionalities that must be risk-assessed and make them apply to any regulated service. I cannot see a single argument against them: it cannot be the Government's intention that a child can be protected, on search services such as Google, from predictive search or autocomplete, but not on TikTok.

Finally, Amendment 295 will embed the understanding that most harm is cumulative. If the Bereaved Parents for Online Safety were in the Chamber, or any child caught up in self-harm, depression sites, gambling, gaming, bullying, fear of exposure, or the inexorable feeling of losing their childhood to an endless scroll, they would say at the top of their voices that it is not any individual piece of content, or any one moment or incident, but the way in which they are nudged, pushed, enticed and goaded into a toxic, harmful or dangerous place. Adding the simple words

“the volume of the content and the frequency with which the content is accessed”

to the interpretation of what can constitute harm in Clause 205 is one of the most important things that we can do in this Chamber. This Bill comes too late for a whole generation of parents and children but, if these safety by design amendments can protect the next generation of children, I will certainly be very glad.

Baroness Harding of Winscombe (Con)

My Lords, it is an honour, once again, to follow the noble Baroness, Lady Kidron, and the noble Lord, Lord Russell, in this Committee. I am going to speak in detail to the amendments that seek to change the way the codes of practice are implemented. Before I do, however, I will very briefly add my voice to the general comments that the noble Baroness, Lady Kidron, and the noble Lord, Lord Russell, have just taken us through. Every parent in the country knows that both the benefit and the harm that online platforms can bring our children are not just about the content. They are about the functionality: the way these platforms work; the way they suck us in. They do give us joy but they also drive addiction. It is hugely important that this Bill reflects the functionality that online platforms bring, and not just content in the normal sense of the word "content".

I will now speak in a bit more detail about the following amendments: Amendments 65, 65ZA, 65AA, 89, 90, 90B, 96A, 106A, 106B, 107A, 114A—I will finish soon, I promise—112, 122ZA, 122ZB and 122ZC.

Lord Vaizey of Didcot (Con)

My noble friend may have left one out.

Baroness Harding of Winscombe (Con)

I am afraid I may well have done.

That list shows your Lordships some of the challenges we all have with the Bill. All these amendments seek to ensure that the codes of practice relating to child safety are binding. Such codes should be principles-based and flexible to allow companies to take the most appropriate route of compliance, but implementing these codes should be mandatory, rather than, as the Bill currently sets out, platforms being allowed to use "alternative measures". That is what all these amendments do—they do exactly the same thing. That was a clear and firm recommendation from the joint scrutiny committee. The Government's response to that joint scrutiny committee report was really quite weak. Rather than rehearse the joint scrutiny committee's views, I will rehearse the Government's response and why it is not good enough to keep the Bill as it stands.

The first argument the Government make in their response to the joint scrutiny report is that there is no precedent for mandatory codes of conduct. But actually there are. There is clear precedent in child protection. In the physical world, the SEND code for how we protect some of our most vulnerable children is mandatory. Likewise, in the digital world, the age-appropriate design code, which we have mentioned many a time, is also mandatory. So there is plenty of precedent.

The second concern—this is quite funny—was that stakeholders were concerned about having multiple codes of conduct because it could be quite burdensome on them. Well, forgive me for not crying too much for these enormous tech companies relative to protecting our children. The burden I am worried about is the one on Ofcom. This is an enormous Bill, which places huge amounts of work on a regulator that already has a very wide scope. If you make codes of conduct non-mandatory, you are in fact making the work of the regulator even harder. The Government themselves in their response say that Ofcom has to determine what the minimum standards should be in these non-binding codes of practice. Surely it is much simpler and more straightforward to make these codes mandatory and, yes, to add potentially a small additional burden to these enormous tech companies to ensure that we protect our children.

The third challenge is that non-statutory guidance already looks as if it is causing problems in this space. On the video-sharing platform regime, which is non-mandatory, Ofcom has already said that in its first year of operation it has

“seen a large variation in platforms’ readiness to engage with Ofcom”.

All that will simply make it harder and harder, so the burden will lie on this regulator—which I think all of us in this House are already worried is being asked to do an awful lot—if we do not make it very clear what is mandatory and what is not. The Secretary of State said of the Bill that she is

“determined to put these vital protections for … children … into law as quickly as possible”.

A law that puts in place a non-mandatory code of conduct is not what parents across the country would expect from that statement from the Secretary of State. People out there—parents and grandparents across the land—would expect Ofcom to be setting some rules and companies to be required to follow them. That is exactly what we do in the physical world, and I do not understand why we would not want to do it in the digital world.

Finally—I apologise for having gone on for quite a long time—I will very briefly talk specifically to Amendment 32A, in the name of the noble Lord, Lord Knight, which is also in this group. It is a probing amendment which looks at how the Bill will address VPNs and require Ofcom and participants to have due regard to them: the ability of our savvy children—I am the mother of two teenage girls—to get round all this by using a VPN to access the content they want. This is an important amendment and I am keen to hear what my noble friend the Minister will say in response. Last week, I spoke about my attempts to find out how easy it would be for my 17 year-old daughter to access pornography on her iPhone. I spoke about how I searched in the App Store on her phone and found that immediately a whole series of 17-plus-rated apps came up that were pornography sites. What I did not mention then is that with that—in fact, at the top of the list—came a whole series of VPN apps. Just in case my daughter was naive enough to think that she could just click through and watch it, as though Apple were right that 17 year-olds are allowed to watch pornography, which obviously they are not, the App Store was also offering her an easy route to access it through a VPN. That is not about content but functionality, and we need to properly understand why this bundle of amendments is so important.

--- Later in debate ---
Lord Moylan (Con)

Well, I must regard myself as doubly rebuked, and unfairly, because my reflections are very relevant to the amendments, and I have developed them in that direction. In respect of the parents, they have suffered very cruelly and wrongly, but although it may sound harsh, as I have said in this House before on other matters, hard cases make bad law. We are in the business of trying to make good law that applies to the whole population, so I do not think that these are wholly—

Baroness Harding of Winscombe (Con)

If my noble friend could, would he roll back the health and safety regulations for selling toys, in the same way that he seems so happy to have no health and safety regulations for children’s access to digital toys?

Lord Moylan (Con)

My Lords, if the internet were a toy, aimed at children and used only by children, those remarks would of course be very relevant, but we are dealing with something of huge value and importance to adults as well. It is the lack of consideration of the role of adults, the access for adults and the effects on freedom of expression and freedom of speech, implicit in these amendments, that cause me so much concern.

I seem to have upset everybody. I will now take issue with and upset the noble Baroness, Lady Benjamin, with whom I have not engaged on this topic so far. At Second Reading and earlier in Committee, she used the phrase, “childhood lasts a lifetime”. There are many people for whom this is a very chilling phrase. We have an amendment in this group—a probing amendment, granted—tabled by the noble Lord, Lord Knight of Weymouth, which seeks to block access to VPNs as well. We are in danger of putting ourselves in the same position as China, with a hermetically sealed national internet, attempting to put borders around it so that nobody can breach it. I am assured that even in China this does not work and that clever and savvy people simply get around the barriers that the state has erected for them.

Before I sit down, I will redeem myself a little, if I can, by giving some encouragement to the noble Baroness, Lady Kidron, on Amendments 28 and 32—although I think the amendments are in the name of the noble Lord, Lord Russell of Liverpool. These amendments, if we are to assess the danger posed by the internet to children, seek to substitute an assessment of the riskiness of the provider for the Government's emphasis on the size of the provider. As I said earlier in Committee, I do not regard size as being a source of danger. When it comes to many other services—I mentioned that I buy my sandwich from Marks & Spencer as opposed to a corner shop—it is very often the bigger provider I feel is going to be safer, because I feel I can rely on its processes more. So I would certainly like to hear how my noble friend the Minister responds on that point in relation to Amendments 28 and 32, and why the Government continue to put such emphasis on size.

More broadly, in these understandable attempts to protect children, we are in danger of using language that is far too loose and of having an effect on adult access to the internet which is not being considered in the debate—or at least has not been until I have, however unwelcomely, raised it.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

This is the trouble with looking at legislation that is technologically neutral and future-proofed and has to envisage risks and solutions changing in years to come. We want to impose duties that can technically be met, of course, but this is primarily a point for companies in the sector. We are happy to engage and provide further information, but it is inherently part of the challenge of identifying evolving risks.

The provision in Clause 11(16) addresses the noble Lord’s concerns about the use of VPNs in circumventing age-assurance or age-verification measures. For it to apply, providers would need to ensure that the measures they put in place are effective and that children cannot normally access their services. They would need to consider things such as how the use of VPNs affects the efficacy of age-assurance and age-verification measures. If children were routinely using VPNs to access their service, they would not be able to conclude that Clause 11(16) applies. I hope that sets out how this is covered in the Bill.

Amendments 65, 65ZA, 65AA, 89, 90, 90B, 96A, 106A, 106B, 107A, 114A, 122, 122ZA, 122ZB and 122ZC from the noble Lord, Lord Russell of Liverpool, seek to make the measures Ofcom sets out in codes of practice mandatory for all services. I should make it clear at the outset that companies must comply with the duties in the Bill. They are not optional and it is not a non-statutory regime; the duties are robust and binding. It is important that the binding legal duties on companies are decided by Parliament and set out in legislation, rather than delegated to a regulator.

Codes of practice provide clarity on how to comply with statutory duties, but should not supersede or replace them. This is true of codes in other areas, including the age-appropriate design code, which is not directly enforceable. Following up on the point from my noble friend Lady Harding of Winscombe, neither the age-appropriate design code nor the SEND code is directly enforceable. The Information Commissioner’s Office or bodies listed in the Children and Families Act must take the respective codes into account when considering whether a service has complied with its obligations as set out in law.

As with these codes, what will be directly enforceable in this Bill are the statutory duties by which all sites in scope of the legislation will need to abide. We have made it clear in the Bill that compliance with the codes will be taken as compliance with the duties. This will help small companies in particular. We must also recognise the diversity and innovative nature of this sector. Requiring compliance with prescriptive steps rather than outcomes may mean that companies do not use the most effective or efficient methods to protect children.

I reassure noble Lords that, if companies decide to take a different route to compliance, they will be required to document what their own measures are and how they amount to compliance. This will ensure that Ofcom has oversight of how companies comply with their duties. If the alternative steps that providers have taken are insufficient, they could face enforcement action. We expect Ofcom to take a particularly robust approach to companies which fail to protect their child users.

My noble friend Lord Vaizey touched on the age-appropriate design code in his remarks—

Baroness Harding of Winscombe (Con)

My noble friend the Minister did not address the concern I set out that the Bill’s approach will overburden Ofcom. If Ofcom has to review the suitability of each set of alternative measures, we will create an even bigger monster than we first thought.

Lord Parkinson of Whitley Bay (Con)

I do not think that it will. We have provided further resource for Ofcom to take on the work that this Bill will give it; it has been very happy to engage with noble Lords to talk through how it intends to go about that work and, I am sure, would be happy to follow up on that point with my noble friend to offer her some reassurance.

Responding to the point from my noble friend Lord Vaizey, the Bill is part of the UK’s overall digital regulatory landscape, which will deliver protections for children alongside the data protection requirements for children set out in the Information Commissioner’s age-appropriate design code. Ofcom has strong existing relationships with other bodies in the regulatory sphere, including through the Digital Regulation Co-operation Forum. The Information Commissioner has been added to this Bill as a statutory consultee for Ofcom’s draft codes of practice and relevant pieces of guidance formally to provide for the ICO’s input into its areas of expertise, especially relating to privacy.

Amendment 138 from the noble Lord, Lord Russell of Liverpool, would amend the criteria for non-designated content which is harmful to children to bring into scope content whose risk of harm derives from its potential financial impact. The Bill already requires platforms to take measures to protect all users, including children, from financial crime online. All companies in scope of the Bill will need to design and operate their services to reduce the risk of users encountering content amounting to a fraud offence, as set out in the list of priority offences in Schedule 7. This amendment would expand the scope of the Bill to include broader commercial harms. These are dealt with by a separate legal framework, including the Consumer Protection from Unfair Trading Regulations. This amendment therefore risks creating regulatory overlap, which would cause confusion for business while not providing additional protections to consumers and internet users.

Amendment 261 in the name of the right reverend Prelate the Bishop of Oxford seeks to modify the existing requirements for the Secretary of State’s review into the effectiveness of the regulatory framework. The purpose of the amendment is to ensure that all aspects of a regulated service are taken into account when considering the risk of harm to users and not just content.

As we have discussed already, the Bill defines “content” very broadly and companies must look at every aspect of how their service facilitates harm associated with the spread of content. Furthermore, the review clause makes explicit reference to the systems and processes which regulated services use, so the review can already cover harm associated with, for example, the design of services.

--- Later in debate ---
Lord Weir of Ballyholme (DUP)

My Lords, I rise on this group of amendments, particularly with reference to Amendments 25, 78, 187 and 196, to inject a slight note of caution—I hope in a constructive manner—and to suggest that it would be the wrong step to try to incorporate them into this legislation. I say at the outset that I think the intention behind these amendments is perfectly correct; I do not query the intention of the noble Lord, Lord Russell, and others. Indeed, one thing that has struck me as we have discussed the Bill is the commonality of approach across the Chamber. There is a strong common desire to provide a level of protection for children’s rights, but I question whether these amendments are the right vehicle by which to do that.

It is undoubtedly the case that the spirit of the UNCRC is very strongly reflected within the Bill, and I think it moves in a complementary fashion to the Bill. Therefore, again, I do not query the UNCRC in particular. It can act as a very strong guide to government as to the route it needs to take, and I think it has had a level of influence on the Bill. I speak not simply as someone observing the Bill but as someone who, in a previous existence, served as an Education Minister in Northern Ireland and had direct responsibility for children’s rights. The guidance we received from the UNCRC was, at times, very useful to Ministers, so I do not question any of that.

For three reasons, I express a level of concern about these amendments. I mentioned that the purpose of the UNCRC is to act as a guide—a yardstick—for government as to what should be there in terms of domestic protections. That is its intention. The UNCRC itself was never written as a piece of legislation, and I do not think it was the original intention to have it directly incorporated and implemented as part of law. The UNCRC is aspirational in nature, which is very worth while. However, it is not written in a legislative form. At times, it can be a little vague, particularly if we are looking at the roles that companies will play. At times, it sets out very important principles, but ones which, if left for interpretation by the companies themselves, could create a level of tension.

To give an example, there is within the UNCRC a right to information and a right to privacy. That can sometimes create a tension for companies. The purpose of the UNCRC is to provide that level of guidance to government, to ensure that government gets it right, rather than to be grafted directly on to domestic law.

Secondly, the effect of these amendments would be to shift the interpretation and implementation of what is required of companies from government to the companies themselves. They would be left to try to determine this, whereas I think that the UNCRC is principally a device that tries to make government accountable for children’s rights. As such, it is appropriate that government has the level of responsibility to draft the regulations, in conjunction with key experts within the field, and to try to ensure that what we have in these regulations is fit for purpose and bespoke to the kind of regulations that we want to see.

To give a very good example, there are different commissioners across the United Kingdom. One of the key groups that the Government should clearly be consulting with to make sure they get it right is the Children’s Commissioners of the different jurisdictions in the United Kingdom. Through that process, but with that level of ownership still lying with government and Ofcom, we can create regulations that provide the level of protection for our children that we all desire to see; whereas, if the onus is effectively shifted on to companies simply to comply with what is a slightly vague, aspirational purpose in these regulations, that is going to lead to difficulties as regards interpretation and application.

Thirdly, there is a reference to having due regard to what is in the UNCRC. From my experience within government, and from seeing the way in which government departments apply such duties—and I appreciate that "due regard" has case law behind it—different departments have tended to interpret it differently in different pieces of legislation. At one extreme, on some occasions it has effectively meant that lip service is paid to the duty by government departments and, in effect, it is largely ignored. Others have seen it as a very rigorous duty. If we see that level of disparity between government departments within the same Government, and if this is to be interpreted as a direct instruction to and requirement of companies of varying sizes—and perhaps with various attitudes and feelings of responsibility on this subject—that creates a level of difficulty in and of itself.

My final concern in relation to this has been mentioned in a number of debates on various groups of amendments. Where a lot of Peers would see either a weakness in the legislation or something else that needs to be improved, we need to have as much consistency and clarity as possible in both interpretation and implementation. As such, the more we move away from direct regulations, which could then be put in place, to relying on the companies themselves interpreting and implementing, perhaps in different fashions, with many being challenged by the courts at times, the more we create a level of uncertainty and confusion, both for the companies themselves and for users, particularly the children we are looking to protect.

While I have a lot of sympathy for the intention of the noble Lord, Lord Russell, and while we need to find a way to incorporate into the Bill in some form how we can drive children’s rights more centrally within this, the formulation of the direct grafting of the UNCRC on to this legislation, even through due regard, is the wrong vehicle for doing it. It is inappropriate. As such, it is important that we take time to try to find a better vehicle for the sort of intention that the noble Lord, Lord Russell, and others are putting forward. Therefore, I urge the noble Lord not to press his amendments. If he does, I believe that the Committee should oppose the amendments as drafted. Let us see if, collectively, we can find a better and more appropriate way to achieve what we all desire: to try to provide the maximum protection in a very changing world for our children as regards online safety.

Baroness Harding of Winscombe (Con)

My Lords, I support these amendments. We are in the process of having a very important debate, both in the previous group and in this one. I came to this really important subject of online safety 13 years ago, because I was the chief executive of a telecoms company. Just to remind noble Lords, 13 years ago neither Snap, TikTok nor Instagram—the three biggest platforms that children use today—existed, and telecoms companies were viewed as the bad guys in this space. I arrived, new to the telecoms sector, facing huge pressure—along with all of us running telecoms companies—from Governments to block content.

I often felt that the debate 13 years ago too quickly turned into what was bad about the internet. I was spending the vast majority of my working day trying to encourage families to buy broadband and to access this thing that you could see was creating huge value in people’s lives, both personal and professional. Sitting on these Benches, I fundamentally want to see a society with the minimum amount of regulation, so I was concerned that regulating internet safety would constrain innovation; I wanted to believe that self-regulation would work. In fact, I spent many hours in workshops with the noble Baroness, Lady Kidron, and many others in this Chamber, as we tried to persuade and encourage the tech giants—as everyone started to see that it was not the telecoms companies that were the issue; it was the emerging platforms—to self-regulate. It is absolutely clear that that has failed. I say that with quite a heavy heart; it has genuinely failed, and that is why the Bill is so important: to enshrine in law some hard regulatory requirements to protect children.

That does not change the underlying concern that I and many others—and everyone in this Chamber—have, that the internet is also potentially a force for good. All technology is morally neutral: it is the human beings who make it good or bad. We want our children to genuinely have access to the digital world, so in a Bill that is enshrining hard gates for children, it is really important that it is also really clear about the rights that children have to access that technology. When you are put under enormous pressure, it is too easy—I say this as someone who faced it 13 years ago, and I was not even facing legislation—to try to do what you think your Government want to do, and then end up causing harm to the individuals you are actually trying to protect. We need this counterbalance in this Bill. It is a shame that my noble friend Lord Moylan is not in his place, because, for the first time in this Committee, I find myself agreeing with him. It is hugely important that we remember that this is also about freedom and giving children the freedom to access this amazing technology.

Some parts of the Bill are genuinely ground-breaking, where we in this country are trying to work out how to put the legal scaffolding in place to regulate the internet. Documenting children's rights is not something where we need to start from scratch. That is why I put my name to this amendment: I think we should take a leaf from the UN Convention on the Rights of the Child. I recognise that the noble Lord, Lord Weir of Ballyholme, made some very thought-provoking comments about how we have to be careful about the ambiguity that we might be creating for companies, but I am afraid that ambiguity is there whether we like it or not. These are not just decisions for government: the tension between offering services that will brighten the lives of children and risking harm to them is exactly what lies behind the decisions that technology companies take every day. As the Bill enshrines some obligations on them to protect children from the harms, I firmly believe it should also enshrine obligations on them to offer the beauty and the wonder of the internet, and in doing that enshrine children's right to this technology.

--- Later in debate ---
Baroness Harding of Winscombe (Con)

I want to challenge the noble Baroness’s assertion that the Bill is not about children’s rights. Anyone who has a teenage child knows that their right to access the internet is keenly held and fought out in every household in the country.

Baroness Fox of Buckley (Non-Afl)

The quip works, but political rights are not quips. Political rights have responsibilities, and so on. If we gave children rights, they would not be dependent on adults and adult society. Therefore, it is a debate; it is a row about what our rights are. Guess what. It is a philosophical row that has been going on all around the world. I am just suggesting that this is not the place—

Online Safety Bill

Lord Knight of Weymouth (Lab)

That is right. What is interesting about that useful intervention from the noble Lord, Lord Bethell, is that that kind of gets search off the hook in respect of gambling. You are okay to follow the link from the search engine, but then you are age-gated at the point of the content. Clearly, with thumbnail images and so on in search, we need something better than that. The Bill requires something better than that already; should we go further? My question to the Minister is whether this could be similar to the discussion we had with the noble Baroness, Lady Harding, around non-mandatory codes and alternative methods. I thought that the Minister’s response in that case was quite helpful.

Could it be that if Part 3 and category 2A services chose to use age verification, they could be certain that they are compliant with their duties to protect children from pornographic and equivalent harmful content, but if they chose age-assurance techniques, it would then be on them to show Ofcom evidence of how that alternative method would still provide the equivalent protection? That would leave the flexibility of age assurance; it would not require age verification but would still set the same bar. I merely offer that in an attempt to be helpful to the Minister, in the spirit of where the Joint Committee and the noble Lord, Lord Clement-Jones, were coming from. I look forward to the Minister’s reply.

Baroness Harding of Winscombe (Con)

Before the noble Lord sits down, can I ask him whether his comments make it even more important that we have a clear and unambiguous definition of age assurance and age verification in the Bill?

Lord Knight of Weymouth (Lab)

I would not want to disagree with the noble Baroness for a moment.

--- Later in debate ---
Baroness Fox of Buckley (Non-Afl)

My Lords, I understand that, for legislation to have any meaning, it has to have some teeth and you have to be able to enforce it; otherwise, it is a waste of time, especially with something as important as the legislation that we are discussing here.

I am a bit troubled by a number of the themes in these amendments and I therefore want to ask some questions. I saw that the Government had tabled these amendments on senior manager liability, then I read amendments from both the noble Lord, Lord Bethell, and the Labour Party, the Opposition. It seemed to me that even more people would be held liable and responsible as a result. I suppose I have a dread that—even with the supply chain amendment—this means that lots of people are going to be sacked. It seems to me that this might spiral dangerously out of control and everybody could get caught up in a kind of blame game.

I appreciate that I might not have understood, so this is a genuine attempt to do so. I am concerned that these new amendments will force senior managers and, indeed, officers and staff to take an extremely risk-averse approach to content moderation. They now have not only to cover their own backs but to avoid jail. One of my concerns has always been that this will lead to the over-removal of legal speech, and more censorship, so that is a question I would like to ask.

I also want to know how noble Lords think this sits with the UK being a science and technology superpower. Understandably, some people have argued that these amendments are making the UK a hostile environment for digital investment, and there is something to be balanced up there. Is there a risk that this will lead to the withdrawal of services from the UK? Will it make working for these companies unattractive to British staff? We have already heard that Jimmy Wales has vowed that the Wikimedia Foundation will not scrutinise posts in the way demanded by the Bill. Is he going to be thrown in prison, or will Wikipedia pull out? How do we get the balance right?

What is the criminal offence that has a threat of a prison sentence? I might have misunderstood, but a technology company manager could fail to prevent a child or young person encountering legal but none the less allegedly harmful speech, be considered in breach of these amendments and get sent to prison. We have to be very careful that we understand what this harmful speech is, as we discussed previously. The threshold for harm, which encompasses physical and psychological harm, is vast and could mean people going to prison without the precise criminal offence being clear. We talked previously about VPNs. If a tech-savvy 17-year-old uses a VPN and accesses some of this harmful material, will someone potentially be criminally liable for that young person getting around the law, find themselves accused of dereliction of duty and become a criminal?

My final question is on penalties. When I was looking at this Bill originally and heard about the eye-watering fines that some Silicon Valley companies might face, I thought, “That will destroy them”. Of course, to them it is the mere blink of an eye, and I do get that. This indicates to me, given the endless conversations we have had on whether size matters, that in this instance size does matter. The same kind of liabilities will be imposed not just on the big Silicon Valley monsters that can bear these fines, but on Mumsnet—or am I missing something? Mumsnet might not be the correct example, but could not smaller platforms face similar liabilities if a young person inadvertently encounters harmful material? It is not all malign people trying to do this; my unintended consequence argument is that I do not want to create criminals when a crime is not really being committed. It is a moral dilemma, and I do understand the issue of enforcement.

Baroness Harding of Winscombe (Con)

I rise very much to support the comments of my noble friend Lord Bethell and, like him, to thank the Minister for bringing forward the government amendments. I will try to address some of the comments the noble Baroness, Lady Fox, has just made.

One must view this as an exercise in working out how one drives culture change in some of the biggest and most powerful organisations in the world. Culture change is really hard. It is hard enough in a company of 10 people, let alone in a company with hundreds of thousands of employees across the world that has more money than a single country. That is what this Bill requires these enormous companies to do: to change the way they operate when they are looking at an inevitably congested, contested technology pipeline, by which I mean—to translate that out of tech speak—they have more work to do than even they can cope with. Every technology company, big or small, always has this problem: more good ideas than their technologists can cope with. They have to prioritise what to fix and what to implement. For the last 15 years, digital companies have prioritised things that drive income, but not the safety of our children. That requires a culture change from the top of the company.

Online Safety Bill

Baroness Harding of Winscombe Excerpts
Baroness Burt of Solihull (LD)

My Lords, this is my first contribution to the Bill, and I feel I need to apologise in advance for my lack of knowledge and expertise in this whole field. In her initial remarks, the noble Baroness, Lady Morgan of Cotes, said, "Don't worry, because you don't need to be a lawyer". Unfortunately, I do not have any expertise in the field of the internet and social media either, so I will be very brief in all of my remarks on the Bill. But I feel that I cannot allow the Bill to go past without at least making a few remarks, as equalities spokesperson for the Lib Dems. The issues are of passionate importance to me, and of course to victims of online abuse, and it is those victims for whom I speak today.

In this group, I will address my remarks to Amendments 34 and 35, which deal with content deemed to be harmful—suicide, self-harm, eating disorders, and abuse and hate content—under the triple shield approach, although the discussion of this content has strayed somewhat during the course of the debate.

Much harmful material, as we have heard, initially comes to the user uninvited. I do not pretend to understand how these algorithms work, but my understanding is that once you open such content, they literally click into action, feeding more and more of this kind of content into your feed. The suicide of young Molly Russell is a typical example of the devastating damage to which these algorithms can contribute. I am glad that the Bill will go further to protect children, but it still leaves adults—some young and vulnerable—without such protection and with the same automatic exposure to harmful content, which algorithms can increase with engagement and which could have an overwhelming impact on their mental health, as my noble friend Lady Parminter so movingly and eloquently described.

So this amendment means a user would have to make an active, conscious choice to be exposed to such content: an opt-out rather than an opt-in. This has been discussed at length by noble Lords a great deal more versed in the subject than me. But surely the only persons or organisations who would not support this would be the ones who do not have at heart the best interests of the vulnerable users we have been talking about this afternoon. I hope the Minister will confirm in his remarks that the Government do.

Baroness Harding of Winscombe (Con)

My Lords, I had not intended to speak in this debate because I now need to declare an unusual interest, in that Amendment 38A has been widely supported outside this Chamber by my husband, the Member of Parliament for Weston-super-Mare. I am not intending to speak on that amendment but, none the less, I mention it just in case.

I rise to speak because I have been so moved by the speeches, not least the right reverend Prelate’s speech. I would like just to briefly address the “default on” amendments and add my support. Like others, on balance I favour the amendments in the name of the noble Lord, Lord Clement-Jones, but would willingly throw my support behind my noble friend Lady Morgan were that the preferred choice in the Chamber.

I would like to simply add two additional reasons why I ask my noble friend the Minister to really reflect hard on this debate. The first is that children become teenagers, who become young adults, and it is a gradual transition—goodness, do I feel it as the mother of a 16 year-old and a 17 year-old. The idea that on one day all the protections just disappear completely and we require our 18 year-olds to immediately reconfigure their use of all digital tools just does not seem a sensible transition to adulthood to me, whereas the ability to switch off user empowerment tools as you mature as an adult seems a very sensible transition.

Secondly, I respect very much the free speech arguments that the noble Baroness, Lady Fox, made but I do not think this is a debate about the importance of free speech. It is actually about how effective the user empowerment tools are. If they are so hard for non-vulnerable adults to turn off, what hope have vulnerable adults to be able to turn them on? For the triple shield to work and the three-legged stool to be effective, the onus needs to be on the tech companies to make these user empowerment tools really easy to turn on and turn off. Then “default on” is not a restriction on freedom of speech at all; it is simply a means of protecting our most vulnerable.

Lord Clement-Jones (LD)

My Lords, this has been a very thoughtful and thought-provoking debate. I start very much from the point of view expressed by the noble Baroness, Lady Kidron, and this brings the noble Baroness, Lady Buscombe, into agreement—it is not about the content; this is about features. The noble Baroness, Lady Harding, made exactly the same point, as did the noble Baroness, Lady Healy—this is not about restriction on freedom of speech but about a design feature in the Bill which is of crucial importance.

When I was putting together the two amendments that I have tabled, I was very much taken by what Parent Zone said in a recent paper. It described user empowerment tools as “a false hope”, and rightly had a number of concerns about undue reliance on tools. It said:

“There is a real danger of users being overwhelmed and bewildered”.


It goes on to say that

“tools cannot do all the work, because so many other factors are in play—parental styles, media literacy and technological confidence, different levels of vulnerability and, crucially, trust”.

The real question—this is why I thought we should look at it from the other side of things in terms of default—is about how we mandate the use of these user empowerment tools in the Bill for both children and adults. In a sense, my concerns are exactly the opposite of those of the noble Baroness, Lady Fox—for some strange, unaccountable reason.

The noble Baroness, Lady Morgan, the noble Lord, Lord Griffiths, the right reverend Prelate and, notably, my noble friend Lady Parminter have made a brilliant case for their amendment, and it is notable that these amendments are supported by a massive range of organisations. They are all in this area of vulnerable adults: the Mental Health Foundation, Mind, the eating disorder charity Beat, the Royal College of Psychiatrists, the British Psychological Society, Rethink Mental Illness, Mental Health UK, and so on. It is not a coincidence that all these organisations are discussing this “feature”. This is a crucial aspect of the Bill.

Again, I was very much taken by some of the descriptions used by noble Lords during the debate. The right reverend Prelate the Bishop of Oxford said that young people do not suddenly become impervious to content when they reach 18, and he particularly described the pressures as the use of AI only increases. I thought the way the noble Baroness, Lady Harding, described the progression from teenagehood to adulthood was extremely important. There is no point at which somebody suddenly reaches the age of 18 and has a full adulthood which enables them to deal with all this content.

Under the Bill as it stands, adult users could still see and be served some of the most dangerous content online. As we have heard, this includes pro-suicide, pro-anorexia and pro-bulimia content. One has only to listen to what my noble friend Lady Parminter had to say to really be affected by the operation, if you like, of social media in those circumstances. This is all about the vulnerable. Of course, we know that anorexia has the highest mortality rate of any mental health problem; the NHS is struggling to provide specialist treatment to those who need it. Meanwhile, suicide and self-harm-related content remains common and is repeatedly implicated in deaths. All Members here who were members of the Joint Committee remember the evidence of Ian Russell about his daughter Molly. I think that affected us all hugely.

We believe you can now pay your money and take your choice of whichever amendment seems appropriate. Changing the user empowerment provisions to require category 1 providers to have either the safest options as default for users or the terms of my two amendments is surely a straightforward way of protecting the vast majority of internet users who do not want this material served to them.

You could argue that the new offence of encouragement to serious self-harm, which the Government have committed to introducing, might form part of the solution here, but you cannot criminalise all the legal content that treads the line between glorification and outright encouragement. Of course, we know the way the Bill has been changed. No similar power is proposed, for instance, to address eating disorder content.

The noble Baroness, Lady Healy, quoted our own Communications and Digital Committee and its recommendations about a comprehensive toolkit of settings overseen by Ofcom, allowing users to decide what types of content they see and from whom. I am very supportive of Amendment 38A from the noble Lord, Lord Knight, which gives a greater degree of granularity about the kind of user, in a sense, that can communicate to users.

Modesty means that of course I prefer my own amendments and I agree with the noble Baronesses, Lady Fraser, Lady Bull and Lady Harding, and I am very grateful for their support. But we are all heading in the same direction. We are all arguing for a broader "by default" approach. The onus should not be on these vulnerable adults in particular to switch them on, as the noble Baroness, Lady Bull, said. It is all about those vulnerable adults and we must, as my noble friend Lady Burt said, have their best interests at heart, and that is why we have tabled these amendments.

Online Safety Bill

Baroness Harding of Winscombe Excerpts
Baroness Fraser of Craigmaddie (Con)

I will not detain noble Lords very long either. Two things have motivated me to be involved in this Bill. One is protection for vulnerable adults and the second is looking at this legislation with my Scottish head on, because nobody else seems to be looking at it from the perspective of the devolved Administrations.

First, on protection for vulnerable adults, we have already debated the fact that in an earlier iteration of this Bill, there were protections. These have been watered down and we now have the triple shield. Whether they fit here, with the amendment from my noble friend Lady Stowell, or fit earlier, what we are all asking for is the reinstatement of risk assessments. I come at this from a protection of vulnerable groups perspective, but I recognise that others come at it from a freedom of expression perspective. I do not think the Minister has answered my earlier questions. Why have risk assessments been taken out and why are they any threat? It seems to be the will of the debate today that they do nothing but strengthen the transparency and safety aspects of the Bill, wherever they might be put.

I speak with trepidation to Amendment 63 in the name of the noble and learned Lord, Lord Hope of Craighead. I flatter myself that his amendment and mine are trying to do a similar thing. I will speak to my amendment when we come to the group on devolved issues, but I think what both of us are trying to establish is, given that the Bill is relatively quiet on how freedom of expression is defined, how do platforms balance competing rights, particularly in the light of the differences between the devolved Administrations?

The Minister will know that the Hate Crime and Public Order (Scotland) Act 2021 made my brain hurt when trying to work out how this Bill affects it, or how it affects the Bill. What is definitely clear is that there are differences between the devolved Administrations in how freedom of expression is interpreted. I will study the noble and learned Lord’s remarks very carefully in Hansard; I need a little time to think about them. I will listen very carefully to the Minister’s response and I look forward to the later group.

Baroness Harding of Winscombe (Con)

My Lords, I too will be very brief. As a member of the Communications and Digital Committee, I just wanted to speak in support of my noble friend Lady Stowell of Beeston and her extremely powerful speech, which seems like it was quite a long time ago now, but it was not that long. I want to highlight two things. I do not understand how, as a number of noble Lords have said, having risk assessments is a threat to freedom of expression. I think the absolute opposite is the case. They would enhance all the things the noble Baroness, Lady Fox, is looking to see in the Bill, just as much as they would enhance the protections that my noble friend, who I always seem to follow in this debate, is looking for.

Like my noble friend, I ask the Minister: why not? When the Government announced the removal of legal but harmful and the creation of user empowerment tools, I remember thinking—in the midst of being quite busy with Covid—“What are user empowerment tools and what are they going to empower me to do?” Without a risk assessment, I do not know how we answer that question. The risk is that we are throwing that question straight to the tech companies to decide for themselves. A risk assessment provides the framework that would enable user empowerment tools to do what I think the Government intend.

Finally, I too will speak against my noble friend Lord Moylan’s Amendment 294 on psychological harm. It is well documented that tech platforms are designed to drive addiction. Addiction can be physiological and psychological. We ignore that at our peril.

Lord Clement-Jones (LD)

My Lords, it is a pleasure to have been part of this debate and to have heard how much we are on common ground. I very much hope that, in particular, the Minister will have listened to the voices on the Conservative Benches that have very powerfully put forward a number of amendments that I think have gained general acceptance across the Committee.

I fully understand the points that the noble Lord, Lord Black, made and why he defends Clause 14. I hope we can have a more granular discussion about the contents of that clause rather than wrap it up on this group of amendments. I do not know whether we will be able to have that on the next group.

I thank the noble Baroness, Lady Stowell, for putting forward her amendment. It is very interesting, as the noble Baronesses, Lady Bull and Lady Fraser, said, that we are trying to get to the same sort of mechanisms of risk assessment, perhaps out of different motives, but we are broadly along the same lines and want to see them for adult services. We want to know from the Minister why we cannot achieve that, basically. I am sure we could come to some agreement between us as to whether user empowerment tools or terms of service are the most appropriate way of doing it.

We need to thank the committee that the noble Baroness chairs for having followed up on the letter to the Secretary of State for DCMS, as was, on 30 January. It is good to see a Select Committee using its influence to go forward in this way.

The amendments tabled by the noble Lord, Lord Kamall, and supported by my noble friend Lady Featherstone—I am sorry she is unable to be here today, as he said—are important. They would broaden out consideration in exactly the right kind of way.

However, dare I say it, probably the most important amendment in this group is Amendment 48 in the name of the noble Lord, Lord Stevenson. Apart from the Clause 14 stand part notice, it is pretty much bang on where the Joint Committee got to. He was remarkably tactful in not going into any detail on the Government’s response to that committee. I will not read it out because of the lateness of the hour, but the noble Viscount, Lord Colville, got pretty close to puncturing the Government’s case that there is no proper definition of public interest. It is quite clear that there is a perfectly respectable definition in the Human Rights Act 1998 and, as the noble Viscount said, in the Defamation Act 2013, which would be quite fit for purpose. I do not quite know why the Government responded as they did at paragraph 251. I very much hope that the Minister will have another look at that.

The amendment from the noble and learned Lord, Lord Hope, which has the very respectable support of Justice, is also entirely apposite. I very much hope that the Government will take a good look at that.

Finally, and extraordinarily, I have quite a lot of sympathy with the amendments from the noble Lord, Lord Moylan. It was all going so well until we got to Amendment 294; up to that point I think he had support from across the House, because placing that kind of duty on Ofcom would be a positive way forward.

As I say, getting a clause of the kind that the noble Lord, Lord Stevenson, has put forward, with that public interest content point and with an umbrella duty on freedom of expression, allied to the definition from the noble and learned Lord, Lord Hope, would really get us somewhere.

Online Safety Bill

Baroness Harding of Winscombe Excerpts
Lord Allan of Hallam (LD)

It has let me know as well. In a way, the amendment seeks to formalise what is already an informal mechanism. I was minded initially to support Amendment 56 in the name of my noble friend Lord Clement-Jones and the noble Lord, Lord Stevenson.

This landscape is quite varied. We have to create some kind of outlet, as the noble Baroness, Lady Kidron, rightly said. That parent or individual will want to go somewhere, so we have to send them somewhere. We want that somewhere to be effective, not to get bogged down in spurious and vexatious complaints. We want it to have a high signal-to-noise ratio—to pull out the important complaints and get them to the platforms. That will vary from platform to platform. In some ways, we want to empower Ofcom to look at what is and is not working and to be able to say, “Platform A has built up an incredible set of mechanisms. It’s doing a good job. We’re not seeing things falling through the cracks in the same way as we are seeing with platform B. We are going to have to be more directive with platform B”. That very much depends on the information coming in and on how well the platforms are doing their job already.

I hope that the Government are thinking about how these individual complaints will be dealt with and about the demand that will be created by the Bill. How can we have effective mechanisms for people in the United Kingdom who genuinely have hard cases and have tried, but where there is no intermediary for the platform they are worried about? In many cases, I suspect that these will be newer or smaller platforms that have arrived on the scene and do not have established relationships. Where are these people to go? Who will help them, particularly in cases where the platform may not systemically be doing anything wrong? Its policies are correct and it is enforcing them correctly, but any jury of peers would say that an injustice is being done. Either an exception needs to be made or there needs to be a second look at that specific case. We are not asking Ofcom to do this in the rest of the legislation.

Baroness Harding of Winscombe (Con)

My Lords, it is always somewhat intimidating to follow the noble Lord, Lord Allan, though it is wonderful to have him back from his travels. I too will speak in favour of Amendments 250A and 250B in the name of my noble friend, not from direct experience in the social media world but, tangentially, from telecoms regulation.

I have lived, as the chief executive of a business, in a world where my customers could complain to me but also to an ombudsman and to Ofcom. I say this with some hesitation, as my dear old friends at TalkTalk will be horrified to hear me quoting this example, but 13 years ago, when I took over as chief executive, TalkTalk accounted for more complaints to Ofcom than pretty much all the other telcos put together. We were not trying to be bad—quite the opposite, actually. We were a business born out of very rapid growth, both organic and acquisitive, and we did not have control of our business at the time. We had an internal complaints process and were trying our hardest to listen to it and to individual customers who were telling us that we were letting them down, but we were not doing that very well.

While my noble friend has spoken so eloquently about the importance of complaints mechanisms for individual citizens, I am actually in favour of them for companies. I felt the consequences of having an independent complaints system that made my business listen. It was a genuine failsafe system. For someone to have got as far as complaining to the telecoms ombudsman and to Ofcom, they had really lost the will to live with my own business. That forced my company to change. It has forced telecoms companies to change so much that they now advertise where they stand in the rankings of complaints per thousand customers. Even in the course of the last week, Sky was proclaiming in its print advertising that it was the provider least complained about to the independent complaints mechanism.

So this is not about thinking that companies are bad and are trying to let their customers down. As the noble Lord, Lord Allan, has described, managing these processes is really hard and you really need the third line of defence of an independent complaints mechanism to help you deliver on your best intentions. I think most companies with very large customer bases are trying to meet those customers’ needs.

For very practical reasons, I have experienced the power of these sorts of systems. There is one difference with the example I have given of telecoms: it was Ofcom itself that received most of those complaints about TalkTalk 13 years ago, and I have tremendous sympathy with the idea that we might unleash on poor Ofcom all the social media complaints that are not currently being resolved by the companies. That is exactly why, as Dame Maria Miller said, we need to set up an independent ombudsman to deal with this issue.

From a very different perspective from that of my noble friend, I struggle to understand why the Government do not want to do what they have just announced they want to do in other sectors such as gambling.

Online Safety Bill

Baroness Harding of Winscombe Excerpts
Lord Allan of Hallam (LD)

My Lords, I will make a short contribution on this substantive question of whether concerns about ministerial overreach are legitimate. Based on a decade of being on the receiving end of representations from Ministers, the short answer is yes. I want to expand on that with some examples.

My experience of working on the other side, inside a company, was that you often got what I call the cycle of outrage: something is shared on social media that upsets people; the media write a front-page story about it; government Ministers and other politicians get involved; that then feeds back into the media and the cycle spins up to a point where something must be done. The “something” is typically that the Minister summons people, such as me in my old job, and brings them into an office. That itself often becomes a major TV moment, where you are brought in, browbeaten and sent out again with your tail between your legs, and the Minister has instructed you to do something. That entire process takes place in the political rather than the regulatory domain.

I readily concede that, in many cases, something of substance needed to be addressed and there was a genuine problem. It is not that this was illegitimate, but these amendments are talking about the process for what we should do when that outrage is happening. I agree entirely with the tablers of the amendments that, to the extent that that process can be encapsulated within the regulator rather than a Minister acting on an ad hoc basis, it would be a significant improvement.

I also note that this is certainly not UK-specific, and it would happen in many countries with varying degrees of threat. I remember being summoned to the Ministry of the Interior in Italy to meet a gentleman who has now sadly passed. He brought me into his office, sat me down, pointed to his desk and said “You see that desk? That was Mussolini’s desk”. He was a nice guy and I left with a CD of his rhythm and blues band, but it was clear that I was not supposed to say no to him. He made a very clear and explicit political direction about content that was on the platform.

One big advantage of this Bill is that it has the potential to move beyond that world. It could move from individual people in companies—the noble Baroness, Lady Stowell of Beeston, made this point very powerfully—to changing the accountability model away from either platforms being entirely accountable themselves or platforms and others, including Ministers, somehow doing deals that will have an impact, as the noble Baroness, Lady Fox, and the noble Viscount, Lord Colville, said, on the freedom of expression of people across the country. We do not want that.

We want to move on in the Bill and I think we have a model which could work. The regulator will take on the outrage and go as far as it can under the powers granted in the Bill. If the regulator believes that it has insufficient powers, it will come back to Parliament and ask for more. That is the way in which the system can and should work. I think I referred to this at Second Reading; we have an opportunity to create clear accountability. Parliament instructs Ofcom, which instructs the platforms. The platforms do what Ofcom says, or Ofcom can sanction them. If Ofcom feels that its powers are deficient, it comes back to Parliament. The noble Lord, Lord Stevenson, and others made the point about scrutiny and us continually testing whether Ofcom has the powers and is exercising them correctly. Again, that is entirely beneficial and the Government should certainly be minded to accept those amendments.

With the Secretary of State powers, as drafted in the Bill and without the amendments we are considering today, we are effectively taking two steps forward and one step back on transparency and accountability. We have to ask: why take that step back when we are able to rely on Ofcom to do the job without these directions?

The noble Baroness, Lady Stowell of Beeston, made the point very clearly that there are other ways of doing this. The Secretary of State can express their view. I am sure that the Minister will be arguing that the Secretary of State’s powers in the Bill are better than the status quo because at least what the Secretary of State says will be visible; it will not be a back-room deal. The noble Baroness, Lady Stowell of Beeston, has proposed a very good alternative, where the Secretary of State makes visible their intentions, but not in the form of an order—rather in the form of advice. The public—it is their speech we are talking about—then have the ability to see whether they agree with Ofcom, the companies or the Secretary of State if there is any dispute about what should happen.

It is certainly the case that visible instructions from the Secretary of State would be better, but the powers as they are still leave room for arm-twisting. I can imagine a scenario in which future employees of these platforms are summoned to see the Secretary of State. But now the Secretary of State would have a draft order sitting there. The draft order is Mussolini's desk. They say to the people from the platforms, "Look, you can do what I say, or I am going to send an order to Ofcom". That takes us back to this world in which the public are not seeing the kind of instructions being given.

I hope that the Government will accept that some amendment is needed here. All the ones that have been proposed suggest different ways of achieving the same objective. We are trying to protect future Secretaries of State from an unhealthy temptation to intervene in ways that they should not.

Baroness Harding of Winscombe (Con)

My Lords, on day eight of Committee, I feel that we have all found our role. Each of us has spoken in a similar vein on a number of amendments, so I will try to be brief. As the noble Lord, Lord Allan, has spoken from his experience, I will once again reference my experience as the chief executive, for seven years, of a business regulated by Ofcom; as the chair of a regulator; and as someone who sat on the court of, arguably, the most independent of independent regulators, the Bank of England, for eight years.

I speak in support of the amendments in the name of my noble friend Lady Stowell, because, as a member of the Communications and Digital Committee, my experience, both of being regulated and as a regulator, is that independent regulators might be independent in name—they might even be independent in statute—but they exist in the political soup. It is tempting to think that they are a sort of granite island, completely immovable in the political soup, but they are more like a boat bobbing along in the turbulence of politics.

As the noble Lord, Lord Allan, has just described, they are influenced both overtly and subtly by the regulated companies themselves—I am sure we have both played that game—by politicians on all sides, and by the Government. We have played these roles a number of times in the last eight days; however, this is one of the most important groups of amendments, if we are to send the Bill back in a shape that will really make the difference that we want it to. This group of amendments challenges whether we have the right assignment of responsibility between Parliament, the regulator, government, the regulated and citizens.

It is interesting that we—every speaker so far—are all united that the Bill, as it currently stands, does not get that right. To explain why I think that, I will dwell on Amendment 114 in the name of my noble friend Lady Stowell. The amendment would remove the Secretary of State’s ability to direct Ofcom to modify a draft of the code of practice “for reasons of public policy”. It leaves open the ability to direct in the cases of terrorism, child sexual abuse, national security or public safety, but it stops the Secretary of State directing with regard to public policy. The reason I think that is so important is that, while tech companies are not wicked and evil, they have singularly failed to put internet safety, particularly child internet safety, high enough up their pecking order compared with delivering for their customers and shareholders. I do not see how a Secretary of State will be any better at that.

Arguably, the pressures on a Secretary of State are much greater than the pressures on the chief executives of tech companies. Secretaries of State will feel those pressures from the tech companies and their constituents lobbying them, and they will want to intervene and feel that they should. They will then push that bobbing boat of the independent regulator towards whichever shore they feel they need to in the moment—but that is not the way you protect people. That is not the way that we treat health and safety in the physical world. We do not say, “Well, maybe economics is more important than building a building that’s not going to fall down if we have a hurricane”. We say that we need to build safe buildings. Some 200 years ago, we were having the same debates about the physical world in this place; we were debating whether you needed to protect children working in factories, and the consequences for the economics. Well, how awful it is to say that today. That is the reality of what we are saying in the Bill now: that we are giving the Secretary of State the power to claim that the economic priority is greater than protecting children online.

I am starting to sound very emotional because at the heart of this is the suggestion that we are not taking the harms seriously enough. If we really think that we should be giving the Secretary of State the freedom to direct the regulator in such a broad way, we are diminishing the seriousness of the Bill. That is why I wholeheartedly welcome the remark from the noble Lord, Lord Stevenson, that he intends to bring this back with the full force of all of us across all sides of the Committee, if we do not hear some encouraging words from my noble friend the Minister.

The Lord Bishop of Oxford

My Lords, it is a pleasure to follow the noble Baroness, Lady Harding, whose very powerful speech took us to the heart of the principles behind these amendments. I will add my voice, very briefly, to support the amendments for all the key reasons given. The regulator needs to be independent of the Secretary of State and seen to be so. That is the understandable view of the regulator itself, Ofcom; it was the view of the scrutiny committee; and it appears to be the view of all sides and all speakers in this debate. I am also very supportive of the various points made in favour of the principle of proper parliamentary scrutiny of the regulator going forward.

One of the key hopes for the Bill, which I think we all share, is that it will help set the tone for the future global conversation about the regulation of social media and other channels. The Government’s own impact assessment on the Bill details parallel laws under consideration in the EU, France, Australia, Germany and Ireland, and the noble Viscount, Lord Colville, referred to standards set by UNESCO. The standards set in the OSB at this point will therefore be a benchmark across the world. I urge the Government to set that benchmark at the highest possible level for the independence and parliamentary oversight of the regulator.

--- Later in debate ---
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, the amendments concern the independence of Ofcom and the role of parliamentary scrutiny. They are therefore indeed an important group, as those things will be vital to the success of the regime that the Bill sets up. Introducing a new, ground-breaking regime means balancing the need for regulatory independence with a transparent system of checks and balances. The Bill therefore gives powers to the Secretary of State comprising a power to direct Ofcom to modify a code of practice, a power to issue a statement of strategic priorities and a power to issue non-binding guidance to the regulator.

These powers are important but not novel; they have precedent in the Communications Act 2003, which allows the Secretary of State to direct Ofcom in respect of its network and spectrum functions, and the Housing and Regeneration Act 2008, which allows the Secretary of State to make directions to the Regulator of Social Housing to amend its standards. At the same time, I agree that it is important that we have proportionate safeguards in place for the use of these powers, and I am very happy to continue to have discussions with noble Lords to make sure that we do.

Amendment 110, from the noble Lord, Lord Stevenson, seeks to introduce a lengthier process regarding parliamentary approval of codes of practice, requiring a number of additional steps before they are laid in Parliament. It proposes that each code may not come into force unless accompanied by an impact assessment covering a range of factors. Let me reassure noble Lords that Ofcom is already required to consider these factors; it is bound by the public sector equality duty under the Equality Act 2010 and the Human Rights Act 1998 and must ensure that the regime and the codes of practice are compliant with rights under the European Convention on Human Rights. It must also consult experts on matters of equality and human rights when producing its codes.

Amendment 110 also proposes that any designated Select Committee in either House has to report on each code and impact assessment before they can be made. Under the existing process, all codes must already undergo scrutiny by both Houses before coming into effect. The amendment would also introduce a new role for the devolved Administrations. Let me reassure noble Lords that the Government are working closely with them already and will continue to do so over the coming months. As set out in Schedule 5 to the Scotland Act 1998, however, telecommunications and thereby internet law and regulation is a reserved policy area, so input from the devolved Administrations may be more appropriately sought through other means.

Amendments 111, 113, 114, 115, and 117 to 120 seek to restrict or remove the ability of the Secretary of State to issue directions to Ofcom to modify draft codes of practice. Ofcom has great expertise as a regulator, as noble Lords noted in this debate, but there may be situations where a topic outside its remit needs to be reflected in a code of practice. In those situations, it is right for the Government to be able to direct Ofcom to modify a draft code. This could, for example, be to ensure that a code reflects advice from the security services, to which Ofcom does not have access. Indeed, it is particularly important that the Secretary of State be able to direct Ofcom on matters of national security and public safety, where the Government will have access to information which Ofcom will not.

I have, however, heard the concerns raised by many in your Lordships’ House, both today and on previous occasions, that these powers could allow for too much executive control. I can assure your Lordships that His Majesty’s Government are committed to protecting the regulatory independence of Ofcom, which is vital to the success of the framework. With this in mind, we have built a number of safeguards into the use of the powers, to ensure that they do not impinge on regulatory independence and are used only in limited circumstances and for the appropriate reasons.

I have heard the strong feelings expressed that this power must not unduly restrict regulatory independence, and indeed share that feeling. In July, as noble Lords noted, the Government announced our intention to make substantive changes to the power; these changes will make it clear that the power is for use only in exceptional circumstances and will replace the “public policy” wording in Clause 39 with a defined list of reasons for which a direction can be made. I am happy to reiterate that commitment today, and to say that we will be making these changes on Report when, as the noble Lord, Lord Clement-Jones, rightly said, noble Lords will be able to see the wording and interrogate it properly.

Additionally, in light of the debate we have just had today—

Baroness Harding of Winscombe (Con)

Can my noble friend the Minister clarify what he has just said? When he appeared in front of the Communications and Digital Committee, I think he might have been road-testing some of that language. Under the specific words he used, the Secretary of State would still have been allowed to direct Ofcom for economic reasons. Is that likely to remain the case? If it is, I feel it will not actually meet what I have heard is the will of the Committee.

--- Later in debate ---
Baroness Foster of Aghadrumsee (Non-Afl)

My Lords, I too should have spoken before the noble Lord, Lord Allan; I should have known, given his position on the Front Bench, that he was speaking on behalf of the Liberal Democrats. I was a little reticent to follow him, knowing his expertise in the technical area, but I am very pleased to do so now. I support this very important group of amendments and thank noble Lords for placing them before us. I echo the thanks to all the children’s NGOs that have been working in this area for so long.

For legislators, ambiguity is rarely a friend, and this is particularly true in legislation dealing with digital communications, where, as we all acknowledge, the law struggles to keep pace with technical innovation. Where there is ambiguity, sites will be creative and will evade what they see as barriers—of that I have no doubt. Therefore, I strongly believe that there is a need to have clarity where it can be achieved. That is why it is important to have in the Bill a clear definition of age verification for pornography.

As we have heard this evening, we know that pornography is having a devastating impact on our young people and children: it is impacting their mental health and distorting their views of healthy sexual relationships. It is very upsetting for me that evidence shows that children are replicating the acts they see in pornographic content, thinking that it is normal. It is very upsetting that, in particular, young boys who watch porn think that violence during intimacy is a normal thing to do. The NSPCC has told us that four in 10 boys aged 11 to 16 who regularly view porn say they want to do that because they want to get ideas as to the type of sex they want to try. That is chilling. Even more chilling is the fact that content is often marketed towards children, featuring characters from cartoons, such as “Frozen”, “Scooby Doo” and “The Incredibles”, to try to draw young people on to those sites. Frankly, that is unforgivable; it is why we need robust age verification to protect our children from this content. It must apply to all content, regardless of where it is found; we know, for instance, that Twitter is often a gateway to pornographic sites for young people.

The noble Lord, Lord Bethell, referred to ensuring, beyond all reasonable doubt, that the user is over 18. I know that that is a very high standard—it is the criminal law level—but I believe it is what is needed. I am interested to hear what the Minister has to say about that, because, if we are to protect children and if we take on the role of the fireguard, which the right reverend Prelate referred to, we need to make sure that it is as strong as possible.

Also, this is not just about making sure that users are over 18; we need to make sure that adults, not children, are involved in the content. The noble Baroness, Lady Benjamin, talked about adults being made to look like children, but there is also the whole area of young people being trafficked and abused into pornography production; therefore, Amendment 184 on performer age checks is very important.

I finish by indicating my strong support for Amendment 185 in the name of the noble Baroness, Lady Benjamin. Much of the content on some, if not most, mainstream pornography sites is degrading, extremely abusive and violent. Such content would be prohibited in the offline world and is illegal to own; it includes sexual violence such as strangulation, incest and the sexualisation of children. We know that this is happening online because, as we have heard, some of the most frequently searched terms on porn sites are "teens", "schoolgirls" and "girls", and the lack of regulation online has allowed content to become more and more extreme and abusive. That is why I support Amendment 185 in the name of the noble Baroness, Lady Benjamin, which seeks to bring parity between the online and offline regulation of pornographic content.

This Bill has been eagerly awaited. There is no doubt about that. It has been long in gestation—some people would say too long. We have had much discussion in this Committee, but let us get it right. I urge the Minister to take on board the many points made this afternoon. The fireguard not only needs to be put in place; it needs to be put in place so that it does not move, is not knocked aside and is at its most effective. I support the amendments.

Baroness Harding of Winscombe (Con)

My Lords, I also failed to stand up before the noble Lord, Lord Allan, did. I too am always slightly nervous to speak before or after him for fear of not having the detailed knowledge that he does. There have been so many powerful speeches in this group. I will try to speak swiftly.

My role in this amendment was predefined for me by the noble Baroness, Lady Kidron, as the midwife. I have spent many hours debating these amendments with my noble friend Lord Bethell, the noble Baroness, Lady Kidron, and with many noble Lords who have already spoken in this debate. I think it is very clear from the debate why it is so important to put a definition of age assurance and age verification on the face of the Bill. People feel so passionately about this subject. We are creating the digital legal scaffolding, so being really clear what we mean by the words matters. It really matters and we have seen it mattering even in the course of this debate.

My two friends—they are my friends—the noble Baroness, Lady Kidron, and my noble friend Lord Bethell both used the word “proportionate”, with one not wanting us to be proportionate and the other wanting us to be proportionate. Yet, both have their names to the same amendment. I thought it might be helpful to explain what I think they both mean—I am sure they will interrupt me if I get this wrong—and explain why the words of the amendment matter so much.

Age assurance should not be proportionate for pornography. It should be the highest possible bar. We should do everything in our power to stop children seeing it, whether it is on a specific porn site or on any other site. We do not want our children to see pornography; we are all agreed on that. There should not be anything proportionate about that. It should be the highest bar. Whether “beyond reasonable doubt” is the right wording or it should instead be “the highest possible bar practically achievable”, I do not know. I would be very keen to hear my noble friend the Minister’s thoughts on what the right wording is because, surely, we are all clear it should be disproportionate; it should absolutely be the hardest we can take.

Equally, age assurance is not just about pornography, as the noble Lord, Lord Allan, has said. We need to have a proportionate approach. We need a ladder where age assurance for pornography sits at the top, and where we are making sure that nine year-olds cannot access social media sites if they are age-rated for 13. We all know that we can go into any primary school classroom in the land and find that the majority of nine year-olds are on social media. We do not have good age assurance further down.

As both the noble Lord, Lord Allan, and the noble Baroness, Lady Kidron, have said, we need age assurance to enable providers to adapt the experience to make it age-appropriate for children on services we want children to use. It needs to be both proportionate and disproportionate, and that needs to be defined on the face of the Bill. If we do not, I fear that we will fall into the trap that the noble Lord, Lord Allan, mentioned: the cookie trap. We will have very well-intentioned work that will not protect children and will go against the very thing that we are all looking for.

In my role as the pragmatic midwife, I implore my noble friend the Minister to hear what we are all saying and to help us between Committee and Report, so that we can come back together with a clear definition of age assurance and age verification on the face of the Bill that we can all support.

Baroness Stowell of Beeston (Con)

My Lords, about half an hour ago I decided I would not speak, but as we have now got to this point, I thought I might as well say what I was going to say after all. I reassure noble Lords that in Committee it is perfectly permissible to speak after the winder, so no one is breaking any procedural convention. That said, I will be very brief.

My first purpose in rising is to honour a commitment I made last week when I spoke against the violence against women and girls code. I said that I would none the less be more sympathetic to and supportive of stronger restrictions preventing child access to pornography, so I want to get my support on the record and honour that commitment in this context.

My noble friend Lady Harding spoke on the last group about bringing our previous experiences to bear when contributing to some of these issues. As I may have said in the context of other amendments earlier in Committee, as a former regulator, I know that one of the important guiding principles is to ensure that you regulate for a reason. It is very easy for regulators to have a set of rules. The noble Baroness, Lady Kidron, referred to rules of the road for the tech companies to follow. It is very easy for regulators to examine whether those rules are being followed and, having decided that they have, to say that they have discharged their responsibility. That is not good enough. There must be a result, an outcome from that. As the noble Lord, Lord Allan, emphasised, this must be about outcomes and intended benefits.

I support making it clear in the Bill that, as my noble friend Lady Harding said, we are trying to prevent, disproportionately, children accessing pornography. We will do all we can to ensure that it happens, and that should be because of the rules being in place. Ofcom should be clear on that. However, I also support a proportionate approach to age assurance in all other contexts, as has been described. Therefore, I support the amendments tabled by the noble Baroness, Lady Kidron, and my noble friend Lord Bethell, and the role my noble friend Lady Harding has played in arriving at a pragmatic solution.

--- Later in debate ---
Baroness Kidron (CB)

My Lords, I thank everyone for their contributions this evening. As the noble Lord, Lord Stevenson, said, it is very compelling when your Lordships’ House gets itself together on a particular subject and really agrees, so I thank noble Lords very much for that.

I am going to do two things. One is to pick up on a couple of questions and, as has been said by a number of noble Lords, concentrate on outcomes rather than contributions. On a couple of issues that came up, I feel that the principle of pornography being treated in the same way in Parts 3 and 5 is absolute. We believe we have done it. After Committee we will discuss that with noble Lords who feel that is not clear in the amendment, to make sure they are comfortable that it is so. I did not quite understand from the Minister's reply whether pornography was being treated in exactly the same way in Parts 3 and 5. When I say "exactly the same way", like the noble Lord, Lord Allan, I mean not necessarily by the same technology but to the same level of outcome. That is one thing I want to emphasise because a number of noble Lords, including the noble Baroness, Lady Ritchie, the noble Lord, Lord Farmer, and others, are rightly concerned that we should have an outcome on pornography, not concentrate on how to get there.

The second thing I want to pick up very briefly, because it was received so warmly, is the question of devices and on-device age assurance. I believe that is one method, and I know that at least one manufacturer is thinking about it as we speak. However, it is an old battle in which companies that do not want to take responsibility for their services say that people over here should do something different. It is very important that devices, app stores or any of the supposed gatekeepers are not given an overly large responsibility. It is the responsibility of everyone to make sure that age assurance is adequate.

Baroness Harding of Winscombe (Con)

I hope that what the noble Baroness is alluding to is that we need to include gatekeepers, app stores, device level and sideloading in another part of the Bill.

Baroness Kidron (CB)

But of course—would I dare otherwise? What I am saying is that these are not silver bullets and we must have a mixed economy, not only for what we know already but for what we do not know. We must have a mixed economy, and we must not make any one platform of age assurance overly powerful. That is incredibly important, so I wanted to pick up on that.

I also want to pick up on user behaviour and unintended consequences. I think there was a slight reference to an American law, which is called COPPA and is the reason that every website says 13. That is a very unhelpful entry point. It would be much better if children had an age-appropriate experience from five all the way to 18, rather than on and off at 13. I understand that issue, but that is why age assurance has to be more than one thing. It is not only a preventive thing but an enabling thing. I tried to make that very clear so I will not detain the Committee on that.

On the outcome, I say to the Minister, who has indeed given a great deal of time to this, that more time is needed because we want a bar of assurance. I speak not only for all noble Lords who have made clear their rightful anxiety about pornography but also on behalf of the bereaved parents and other noble Lords who raised issues about self-harming of different varieties. We must have a measurable bar for the things that the Bill says that children will not encounter—the primary priority harms. In the negotiation, that is non-negotiable.

On the time factor, I am sorry to say that we are all witness to what happened to Part 3. It was pushed and pushed for years, and then it did not happen—and then it was whipped out of the Bill last week. This is not acceptable. I am happy, as I believe other noble Lords are, to negotiate a suitable time that gives Ofcom comfort, but it must be possible, with this Bill, for a regulator to bring something in within a given period of time. I am afraid that history is our enemy on this one.

The third thing is that I accept the idea that there has to be more than principles, which is what I believe Ofcom will provide. But the principles have to be 360 degrees, and the questions that I raised about security, privacy and accessibility should be in the Bill so that Ofcom can go away and make some difficult judgments. That is its job; ours is to say what the principle is.

I will tell one last tiny story. About 10 years ago, I met in secret with one of the highest-ranking safety officers in one of the companies that we always talk about. They said to me, “We call it the ‘lost generation’. We know that regulation is coming, but we know that it is not soon enough for this generation”. On behalf of all noble Lords who spoke, I ask the Government to save the next generation. With that, I withdraw the amendment.

Online Safety Bill

Baroness Harding of Winscombe Excerpts
Moved by
125: Clause 49, page 47, line 22, at end insert—
“(c) machine-generated content is to be regarded as user-generated content of a service if—
(i) the creation or use of the machine-generated content involves interacting with user-generated content,
(ii) it takes the form or identity of a user,
(iii) it provides content that constitutes illegal, primary priority content or priority content, or would constitute it if created in another format, or
(iv) a user has in any way facilitated any element of the generation by way of a command, prompt, or any other instruction, however minimal.”
Member’s explanatory statement
This amendment would add machine-generated content to the content regulated by the Bill, give meaning to how it could be regarded as ‘user-generated content’ of the service, and allow virtual and augmented reality material to be treated on an equal basis with material in other formats.
Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - -

My Lords, I rise to introduce this group. On Tuesday in Committee, I said that having reached day 8 of the Committee we had all found our roles; now, I find myself in a different role. The noble Baroness, Lady Kidron, is taking an extremely well-earned holiday and so is unable to be in the House today. She has asked me to introduce this group and specifically to speak to Amendment 125 in her name.

I strongly support all the amendments in the group, particularly those that would result in a review, but will limit my words to Amendment 125. I also thank the other co-signatories, the noble Baroness, Lady Finlay, who is in her place, and my noble friend Lord Sarfraz, who made such a compelling speech at Second Reading on the need for the Bill to consider emerging technologies but who is also, sadly, abroad on government business.

I start with something said by Lord Puttnam, and I paraphrase: that we were forbidden from incorporating the word “digital” throughout the whole process of scrutiny of the Communications Act in 2002. As a number of us observed at the time, he said, it was a terrible mistake not to address or anticipate these issues when it was obvious that we would have to return to it all at some later date. The Online Safety Bill is just such a moment: “Don’t close your eyes and hope”, he said, “but look to the future and make sure that it is represented in the Bill”.

With that in mind, this amendment is very modest. I will be listening carefully, as I am sure the noble Baroness, Lady Kidron, will from a distance, to my noble friend the Minister because if each aspect of this amendment is already covered in the Bill, as I suspect he will want to say, then I would be grateful if he could categorically explain how that is the case at the Dispatch Box, in sufficient detail that a future court of law can clearly understand it. If he cannot state that then I will be asking the House, as I am sure the noble Baroness, Lady Kidron, would, to support the amendment’s inclusion in the Bill.

There are two important supporters of this amendment. If the Committee will forgive me, I want to talk briefly about each of them because of the depth of their understanding of these issues. The first is an enforcement officer whom I shall not name, but I and the noble Baroness, Lady Kidron, want to thank him and his team for the extraordinary work that they do, searching out child sexual abuse in the metaverse. The second, whom I will come to in a little while, is Dr Geoff Hinton, a pioneer of neural networks most often referred to as “the godfather of AI”, whom the noble Baroness, Lady Kidron, met last week. Both are firm supporters of this amendment.

The amendment is part of a grouping labelled future-proofing but, sadly, this is not in the future. It is with us now. The rise of child sexual abuse in the metaverse is growing phenomenally. Two months ago, at the behest of the Institution of Engineering and Technology, the noble Baroness, Lady Kidron, hosted a small event at which members of a specialist police unit explained to colleagues from both Houses that what they were finding online was amongst the worst imaginable, but was not adequately caught by existing laws. I should just warn those listening to or reading this—I am looking up at the Public Gallery, where I see a number of young people listening to us—that I am about to briefly recount some really horrific stuff from what we saw and heard.

The quality of AI imagery is now at the point where a realistic AI image of a child can be produced. Users are able to produce or order indecent AI images, based on a child known to them. Simply by uploading a picture of a next door neighbour’s child or a family member, or taking a child’s image from social media and putting that face on existing abuse images, they can create a body for that picture or, increasingly, make it 3D and take it into an abuse room. The type of imagery produced can vary from suggestive or naked to penetrative sex; for the most part, I do not think I should be repeating in this Chamber the scenarios that play out.

VR child avatars can be provided with a variety of bespoke abuse scenarios, which the user can then interact with. Tailor-made VR experiences are being advertised for production on demand. They can be made to meet specific fetishes or to feature a specific profile of a child. The production of these VR abuse images is a commercial venture. Among the many chilling facts we learned was that the Oculus Meta Quest 2, which is the best-selling VR headset in the UK, links up to an app that is downloaded on to the user’s mobile phone. Within that app, the user can search for other users to follow and engage with—either through the VR headset or via instant messaging in their mobile app. A brief search through the publicly viewable user profiles on this app shows a huge number of profiles with usernames indicative of a sexual interest in children.

Six weeks after the event, the noble Baroness, Lady Kidron, spoke to the same officer. He said that already the technology was a generation on—in just six weeks. The officer made a terrible and terrifying prediction: he said that in a matter of months this violent imagery, based on and indistinguishable from an actual known child, will evolve to include moving 3D imagery and that at that point, the worlds of VR and AI will meet and herald a whole new phase in offending. I will quote this enforcement officer. He said:

“I hate to think where we will be in six months from now”.


While this group is labelled as future-proofing the Bill, I remind noble Lords that in six months’ time, the provisions of the Bill will not have been implemented. So this is not about the future; it is actually about the now.

--- Later in debate ---
Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- View Speech - Hansard - - - Excerpts

My Lords, like others, I thank the Whips for intervening to protect children from hearing details that are not appropriate for the young. I have to say that I was quite relieved because I was rather squirming myself. Over the last two days of Committee, I have been exposed to more violent pornographic imagery than any adult, never mind a child, should be exposed to. I think we can recognise that this is certainly a challenging time for us.

I do not want any of the comments I will now make to be seen as minimising understanding of augmented reality, AI, the metaverse and so on, as detailed so vividly by the noble Baronesses, Lady Harding and Lady Finlay, in relation to child safety. However, I have some concerns about this group, in terms of proportionality and unintended outcomes.

Amendment 239, in the names of the right reverend Prelate the Bishop of Oxford, the noble Lord, Lord Clement-Jones, and the noble Viscount, Lord Colville of Culross, sums up some of my concerns about a focus on future-proofing. This amendment would require Ofcom to produce reports about future risks, which sounds like a common-sense demand. But my question is about us overly focusing on risk and never on opportunities. There is a danger that the Bill will end up encouraging us to see these new technologies only in a negative way, and that we will in fact grant more powers that expand the scope of what is treated as harmful content, in a way that stifles speech.

Beyond the Bill, I am more generally worried about what seems to be becoming a moral panic about AI. The precautionary principle is being adopted, which could mean stifling innovation at source and preventing the development of great technologies that could be of huge benefit to humanity. The over-focus on the dangers of AI and augmented reality could mean that we ignore the potential large benefits. For example, if we have AI, everyone could have an immediately responsive GP in their pocket—goodness knows that, for those trying to get an appointment, that could be of great use and benefit. It could mean that students have an expert tutor in every subject, just one message away. The noble Baroness, Lady Finlay, spoke about the fantastic medical breakthroughs that augmented reality can bring to handling neurological damage. Last night, I cheered when I saw how someone who has never been able to walk now can, through those kinds of technologies. I thought, “Isn’t this a brilliant thing?” So all I am suggesting is that we have to be careful that we do not see these new technologies only as tools for the most perverted form of activity among a small minority of individuals.

I note, with some irony, that fewer qualms were expressed by noble Lords about the use of AI when it was proposed to scan and detect speech or images in encrypted messages. As I argued at the time, this would be a threat to WhatsApp, Signal and so on. Clauses 110 and 124 have us using AI as a blunt proactive technology of surveillance, despite the high risks of inaccuracy, error and false flags. But there was great enthusiasm for AI then, when it was having an impact on individuals’ freedom of expression—yet, here, all we hear are the negatives. So we need to be balanced.

I am also concerned about Amendment 125, which illustrates the problem of seeing innovation only as a threat to safety and a potential problem. For example, if the Bill considers AI-generated content to be user-generated content, only large technology companies will have the resources—lawyers and engineers—necessary to proceed while avoiding crippling liability.

In practice, UK users risk being blocked out from new technologies if we are not careful about how we regulate here. For example, users in the European Union currently cannot access Google Bard AI assistant because of GDPR regulations. That would be a great loss because Google Bard AI is potentially a great gain. Despite the challenges of the likes of ChatGPT and Bard AI that we keep reading about, with people panicking that this will lead to wide-scale cheating in education and so on, this has huge potential as a beneficial technology, as I said.

I have mentioned that one of the unintended consequences—it would be unintended—of the whole Bill could be that the UK becomes a hostile environment for digital investment and innovation. So start-ups that have been invested in—like DeepMind, a Google-owned and UK-based AI company—could be forced to leave the UK, doing huge damage to the UK’s digital sector. How can the UK be a science and technology superpower if we end up endorsing anti-innovation, anti-progress and anti-business measures by being overly risk averse?

I have the same concerns about Amendment 286, which requires periodic reviews of new technology content environments such as the metaverse and other virtual augmented reality settings. I worry that it will not be attractive for technology companies to confidently invest in new technologies if there is this constant threat of new regulations and new problems on the horizon.

I have a query that mainly relates to Amendment 125 but that is also more general. If virtual augmented reality actually involves user-to-user interaction, like in the metaverse, is it not already covered in the Bill? Why do we need to add it in? The noble Baroness, Lady Harding, said that it has got to the point where we are not able to distinguish fake from real, and augmented reality from reality. But she concludes that that means that we should treat fake as real, which seems to me to rather muddy the waters and make it a fait accompli. I personally—

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - -

I am sorry to interrupt, but I will make a clarification; the noble Baroness is misinterpreting what I said. I was actually quoting the godfather of AI and his concerns that we are fast approaching a space where it will be impossible—I did not say that it currently is—to distinguish between a real child being abused and a machine learning-generated image of a child being abused. So, first, I was quoting the words of the godfather of AI, rather than my own, and, secondly, he was looking forward—only months, not decades—to a very real and perceived threat.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Hansard - - - Excerpts

I personally think that it is a pessimistic view of the future to suggest that humanity cannot rise to the task of being able to distinguish between deep fakes and real images. Organising all our lives, laws and liberties around the deviant predilections of a minority of sexual offenders on the basis that none of us will be able to tell the difference in the future, when it comes to that kind of activity, is rather dangerous for freedom and innovation.

--- Later in debate ---
Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

I am grateful to the noble Baroness. That is very helpful.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - -

That is exactly the same issue with child sexual abuse images—it is about the way in which criminal law is written. Not surprisingly, it is not up to date with the evolution of technology.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

I am grateful for that intervention as well. That summarises the core questions that we have for the Minister. Of the three areas that we have for him, the first is the question of scope and the extent to which he can assure us that the Bill as drafted will be robust in covering the metaverse and bots, which are the issues that have been raised today. The second is on behaviours and relates to the two interventions that we have just had. We have been asking whether the criminality attached to behaviours that are criminal today will stretch to new, similar forms of behaviour taking place in new environments—let us put it that way. The behaviour, the intent and the harm are the same, but the environment is different. We want to understand the extent to which the Government are thinking about that, where that thinking is happening and how confident they are that they can deal with that.

Finally, on the question of agency, how do the Government expect to deal with the fact that we will have machines operating in a user-to-user environment when the connection between the machine and another individual user is qualitatively different from anything that we have seen before? Those are just some small questions for the Minister on this Thursday afternoon.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I certainly concur that we should discuss the issue in greater detail. I am very happy to do so with the noble Lord, the noble Baroness and others who want to do so, along with officials. If we can bring some worked examples of what “in control” and “out of control” bots may be, that would be helpful.

I hope the points I have set out in relation to the other issues raised in this group and the amendments before us are satisfactory to noble Lords and that they will at this point be content not to press their amendments.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- View Speech - Hansard - -

My Lords, I thank all noble Lords who have contributed to a thought-provoking and, I suspect, longer debate than we had anticipated. At Second Reading, I think we were all taken aback when this issue was opened up by my noble friend Lord Sarfraz; once again, we are realising that this requires really careful thought. I thank my noble friend the Minister for his also quite long and thoughtful response to this debate.

I feel that I owe the Committee a small apology. I am very conscious that I talked in quite graphic detail at the beginning when there were still children in the Gallery. I hope that I did not cause any harm, but it shows how serious this is that we have all had to think so carefully about what we have been saying—only in words, without any images. We should not underestimate how much this has demonstrated the importance of our debates.

On the comments of the noble Baroness, Lady Fox, I am a huge enthusiast, like the noble Lord, Lord Knight, for the wonders of the tech world and what it can bring. We are managing the balance in this Bill to make sure that this country can continue to benefit from and lead the opportunities of tech while recognising its real and genuine harms. I suggest that today’s debate has demonstrated the potential harm that the digital world can bring.

I listened carefully—as I am certain the noble Baroness, Lady Kidron, has been doing in the digital world—to my noble friend’s words. I am encouraged by what he has put on the record on Amendment 125, but there are some specific issues that it would be helpful for us to talk about, as he alluded to, after this debate and before Report. Let me highlight a couple of those.

First, I do not really understand the technical difference between a customer service bot and other bots. I am slightly worried that we are specifically defining one type of bot that would not be captured by this Bill; I suspect that there might be others in future. We must think carefully about whether we are getting too much into the specifics of the technology and not being general enough in making sure we capture where it could go. That is one example.

Secondly, as my noble friend Lady Berridge would say, I am not sure that we have got to the bottom of whether this Bill, coupled with the existing body of criminal law, will really enable law enforcement officers to progress the cases as they see fit and protect vulnerable women—and men—in the digital world. I very much hope we can extend the conversation there. We perhaps risk getting too close to the technical specifics if we are thinking about whether a haptic suit is in or out of scope of the Bill; I am certain that there will be other technologies that we have not even thought about yet that we will want to make sure that the Bill can capture.

I very much welcome the spirit in which this debate has been held. When I said that I would do this for the noble Baroness, Lady Kidron, I did not realise quite what a huge debate we were opening up, but I thank everyone who has contributed and beg leave to withdraw the amendment.

Amendment 125 withdrawn.

Online Safety Bill

Baroness Harding of Winscombe Excerpts
Lord Russell of Liverpool Portrait Lord Russell of Liverpool (CB)
- View Speech - Hansard - - - Excerpts

My Lords, following on from the excellent points that the noble Baroness has made, I want to pursue the same direction. In this group of amendments we are essentially trying to reduce the incidence of tragedies such as those that the families there in the Gallery have experienced and trying to ensure that no one—that is probably unrealistic, but at least far fewer people—will have the same experience.

I particularly want to focus the Minister and the Bill team on trying to think through how to ensure that, as and when something tragic happens, what happens to the families faced with that—the experience that they have and the help that I hope in future they will be able to receive—will make it a less traumatic, lonely and baffling experience than it clearly has been to date.

At the heart of this, we are talking about communication; about the relationship between Ofcom and the platforms; probably about the relationships between platforms and other platforms, in sharing knowledge; about the relationship between Ofcom and government; about the relationship between Ofcom and regulators in other jurisdictions; and about the relationship between our Government and other Governments, including, most importantly, the Government in the US, where so many of these platforms are based. There is a network of communication that has to work. By its very nature, trying to capture something as all-encompassing as that in primary legislation will in some ways be out of date before it even hits the statute book. It is therefore incredibly important that there is a dynamic information-sharing and analytics process to understand what is going on in the online world, and what the experience is of individuals who are interacting with that world.

That brings me neatly back to an amendment that we have previously discussed, which I suspect the noble Viscount sitting on the Front Bench will remember in painful detail. When we were talking about the possibility of having an independent ombudsman to go to, what we heard from all around the House was, “Where do we go? If we have gone to the platforms and through the normal channels but are getting nowhere, where do we go? Are we on our own?”. The answer that we felt we were getting a few weeks ago was, “That’s it, you’ve got to lump it”. That is simply not acceptable.

I ask the Minister and the Bill team to ensure that there is recognition of the dynamic nature of what we are dealing with. We cannot capture it in primary legislation. I hope we cannot capture it in secondary instruments either; speaking as a member of the Secondary Legislation Scrutiny Committee, we have quite enough of them as it is so we do not want any more, thank you very much. However, it is incredibly important that the Government think about a dynamic form of having up-to-date information so that they and all the other parties in this area know what is going on.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- View Speech - Hansard - -

My Lords, I support this group of amendments. I pay tribute to the families who I see are watching us as we debate this important group. I also pay tribute to my noble friend Lady Newlove, who has just given one of the most powerful speeches in the full 10 days of Committee.

The real sadness is that we are debating what happens when things go horribly wrong. I thank my noble friend the Minister and the Secretary of State, who is currently on leave, for the very collaborative way in which I know they have approached trying to find the right package—we are all waiting for him to stand up and speak to show us this. Very often, Governments do not want to give concessions early in the process of a Bill going through because they worry that those of us campaigning for concessions will then ask for more. In this case, as the noble Lord, Lord Russell, has just pointed out, all we ask is that we remember that a concession granted here helps only once things have already gone horribly wrong.

As the noble Baroness, Lady Kidron, said, what we really want is a safer internet, where fewer children die. I reiterate the comments that she made at the end of her speech: as we have gone through Committee, we have all learned how interconnected the Bill is. It is fantastic that we will be able to put changes into it that will enable bereaved families not to have to follow the path that the Russells and all the other bereaved families campaigning for this had to follow—but that will not be enough. We also need to ensure that we put in place the safety-by-design amendments that we have been discussing. I argue that one of the most important is the one that the noble Lord, Lord Russell, has just referenced: when you already know that your child is in trouble but you cannot get help, it is no comfort to be told, “It’s okay—bereaved families have what they need”. We need to do more than that.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

My Lords, this has been a very moving debate for a very important cause. I thank the noble Baroness, Lady Kidron, for introducing it in the way that she did, along with those who have spoken in the debate.

The good news is that this is very much a cross-party and cross-Bench debate. It clearly appears to be a concern that the Government share, and I appreciate that. I agree with the noble Baroness, Lady Harding, that it is not a weakness for the Government to concede here but very much the logic of where we have now got to. Compared with what is in the Joint Committee report on the draft Bill, what seems to be proposed—and I very much look forward to hearing what the Minister has to say—goes further than what we were proposing, so it may be that we have reached another milestone. However, we wait to hear the detail.

Like other noble Lords, I pay tribute to the bereaved parents. We heard from parents during our consideration of the draft Online Safety Bill and we have heard further since then, particularly as a result of the two notable inquests into the deaths of Frankie Thomas and Molly Russell, which highlighted the difficulties that families and coroners face. Both families talked about the additional toll on their mental health as they battle for information, and the impossibility of finding closure in the absence of answers.

The noble Baroness, Lady Newlove, said in her very moving speech that a humane process must be established for bereaved families and coroners to access data pertinent to the death of a child. That is what we have been seeking, and I pay tribute to the relentless way in which the noble Baroness, Lady Kidron, has pursued this issue on behalf of us all, supported by 5Rights and the NSPCC. We must have a transparent process in which bereaved families and coroners can access information from regulated services in cases where social media may have played a part in the death of a child.

My noble friend Lord Allan—who I am delighted is so plugged in to what could be the practical way of solving some of these issues—expertly described how Ofcom’s powers could and should be used and harnessed for this purpose. That very much goes with the grain of the Bill.

I shall repeat a phrase that the noble Baroness, Lady Kidron, used: the current situation is immoral and a failure of justice. We absolutely need to keep that in mind as we keep ourselves motivated to find the solution as soon as we possibly can. I look forward to good news from the Minister about the use of information notices for the purpose that has been heralded by the noble Baroness, Lady Kidron, but of course the devil is in the detail. We will obviously want to see the detail of the amendment well before Report.

Online Safety Bill

Baroness Harding of Winscombe Excerpts
Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I too want to support this group of amendments, particularly Amendment 234, and will make just a couple of brief points.

First, one of the important qualities of the online safety regime is transparency, and this really speaks to that point. It is beyond clear that we are going to need all hands on deck, and again, this speaks to that need. I passionately agree with the noble Baroness, Lady Fox, on this issue and ask, when does an independent researcher stop being independent? I have met quite a lot of independent researchers on my journey who suddenly find ways of contributing to the digital world other than through their independent research. However, the route described here offers every opportunity to put those balancing pieces in place.

Secondly, I am very much aware of the fear of the academics in our universities. I know that a number of them wrote to the Secretary of State last week saying that they were concerned that they would be left behind their colleagues in Europe. We do not want to put up barriers for academics in the UK. We want the UK to be at the forefront of governance of the digital world; this amendment speaks to that, and I see no reason for the Government to reject it.

Finally, I want to emphasise the importance of research. Revealing Reality did research for 5Rights called Pathways, in which it built avatars for real children and revealed the recommendation loops in action. We could see how children were being offered self-harm, suicide, extreme diets and livestream porn within moments of them arriving online. Frances Haugen has already been mentioned. She categorically proved what we have been asserting for years, namely that Instagram impacts negatively on teenage girls. As we put this regime in place, it is not adequate to rely on organisations that are willing to work in the grey areas of legality to get their research or on whistleblowers—on individual acts of courage—to make the world aware.

One of the conversations I remember happened nearly five years ago, when the then Secretary of State asked me what the most important thing about the Bill was. I said, “To bring a radical idea of transparency to the sector”. This amendment goes some way to doing just that.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- View Speech - Hansard - -

My Lords, I, too, support Amendments 233 and 234, and Amendment 233A, from the noble Lord, Lord Allan. As the noble Baroness, Lady Kidron, said, it has been made clear in the past 10 days of Committee that there is a role for every part of society to play to make sure that we see the benefits of the digital world but also mitigate the potential harms. The role that researchers and academics can play in helping us understand how the digital world operates is critical—and that is going to get ever more so as we enter a world of large language models and AI. Access to data in order to understand how digital systems and processes work will become even more important—next week, not just in 10 years’ time.

My noble friend Lord Bethell quite rightly pointed out the parallels with other regulators, such as the MHRA and the Bank of England. A number of people are now comparing the way in which the MHRA and other medical regulators regulate the development of drugs with how we ought to think about the emergence of regulation for AI. This is a very good read-across: we need to set the rules of the road for researchers and ensure, as the noble Baroness, Lady Kidron, said—nailing it, as usual—that we have the most transparent system possible, enabling people to conduct their research in the light, not in the grey zone.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

My Lords, as the noble Baroness, Lady Kidron, said, clearly, transparency is absolutely one of the crucial elements of the Bill. Indeed, it was another important aspect of the Joint Committee’s report. Like the noble Lord, Lord Knight—a fellow traveller on the committee—and many other noble Lords, I much prefer the reach of Amendments 233 and 234, tabled by the noble Lord, Lord Bethell, to Amendment 230, the lead amendment in this group.

We strongly support amendments that aim to introduce a duty for regulated platforms to enable access by approved independent researchers to information and data from regulated services, under certain conditions. Of course, there are arguments for speeding up the process under Clause 146, but this is really important because companies themselves currently decide who accesses data, how much of it and for what purposes. Only the companies can see the full picture, and the effect of this is that it has taken years to build a really solid case for this Online Safety Bill. Without a greater level of insight, enabling quality research and harm analysis, policy-making and regulatory innovation will not move forward.

I was very much taken by what the noble Baroness, Lady Harding, had to say about the future in terms of the speeding up of technological developments in AI, which inevitably will make the opening up of data, and research into it, of greater and greater importance. Of course, I also take extremely seriously my noble friend’s points about the need for data protection. We are very cognisant of the lessons of Cambridge Analytica, as he mentioned.

It is always worth reading the columns of the noble Lord, Lord Hague. He highlighted this issue last December, in the Times. He said:

“Social media companies should be required to make anonymised data available to third-party researchers to study the effect of their policies. Crucially, the algorithms that determine what you see—the news you are fed, the videos you are shown, the people you meet on a website—should not only be revealed to regulators but the choices made in crafting them should then be open to public scrutiny and debate”.


Those were very wise words. The status quo leaves transparency in the hands of big tech companies with a vested interest in opacity. The noble Lord, Lord Knight, mentioned Twitter announcing in February that it would cease allowing free research access to its application programming interface. It is on a whim that a billionaire owner can decide to deny access to researchers.

I much prefer Amendment 233, which would enable Ofcom to appoint an approved independent researcher. The Ofcom code of practice proposed in Amendment 234 would be issued for researchers and platforms, setting out the procedures for enabling access to data. I take the point made by the noble Baroness, Lady Fox, about who should be an independent accredited researcher, but I hope that that is exactly the kind of thing that a code of practice would deal with.

Just as a little contrast, Article 40 of the EU’s Digital Services Act gives access to data to a broad range of researchers—this has been mentioned previously—including civil society and non-profit organisations dedicated to public interest research. The DSA sets out in detail the framework for vetting and access procedures, creating an explicit role for new independent supervisory authorities. This is an example that we could easily follow.

The noble Lord, Lord Bethell, mentioned the whole question of skilled persons. Like him, I do not believe that this measure is adequate as a substitute for what is contained in Amendments 233 and 234. It will be a useful tool for Ofcom to access external expertise on a case-by-case basis but it will not provide for what might be described as a wider ecosystem of inspection and analysis.

The noble Lord also mentioned the fact that internet companies should not regard themselves as an exception. Independent scrutiny is a cornerstone of the pharmaceutical, car, oil, gas and finance industries. They are open to scrutiny from research; we should expect that for social media as well. Independent researchers are already given access in many other circumstances.

The case for these amendments has been made extremely well. I very much hope to see the Government, with the much more open approach that they are demonstrating today, accept the value of these amendments.

Online Safety Bill

Baroness Harding of Winscombe Excerpts
Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I will speak to Amendment 1, to which I was happy to add my name alongside that of the Minister. I too thank the noble Lord, Lord Stevenson, for tabling the original amendment, and my noble and learned friend Lord Neuberger for providing his very helpful opinion on the matter.

I am especially pleased to see that ensuring that services are safe by design and offer a higher standard of protection for children is foundational to the Bill. I want to say a little word about the specificity, as I support the noble Baroness, Lady Merron, in trying to get to the core issue here. Those of your Lordships who travel to Westminster by Tube may have seen TikTok posters saying that

“we’re committed to the safety of teens on TikTok. That’s why we provide an age appropriate experience for teens under 16. Accounts are set to private by default, and their videos don’t appear in public feeds or search results. Direct messaging is also disabled”.

It might appear to the casual reader that TikTok has suddenly decided unilaterally to be more responsible, but each of those things is a direct response to the age-appropriate design code passed in this House in 2018. So regulation does work and, on this first day on Report, I want to say that I am very grateful to the Government for the amendments that they have tabled, and “Please do continue to listen to these very detailed matters”.

With that, I welcome the amendment. Can the Minister confirm that having safety by design in this clause means that all subsequent provisions must be interpreted through that lens and will inform all the decisions of Report and those of Ofcom, and the Secretary of State’s approach to setting and enforcing standards?

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- View Speech - Hansard - -

My Lords, I too thank my noble friend the Minister for tabling Amendment 1, to which I add my support.

Very briefly, I want to highlight one word in it, to add to what the noble Baroness, Lady Kidron, has just said. The word is “activity”. It is extremely important that in Clause 1 we are setting out that the purpose is to

“require providers of services regulated by this Act to identify, mitigate and manage”

not just illegal or harmful content but “activity”.

I very much hope that, as we go through the few days on Report, we will come back to this and make sure that in the detailed amendments that have been tabled we genuinely live up to the objective set out in this new clause.

Lord Bishop of Manchester Portrait The Lord Bishop of Manchester
- View Speech - Hansard - - - Excerpts

My Lords, I too support the Minister’s Amendment 1. I remember vividly, at the end of Second Reading, the commitments that we heard from both Front-Benchers to work together on this Bill to produce something that was collaborative, not contested. I and my friends on these Benches have been very touched by how that has worked out in practice and grateful for the way in which we have collaborated across the whole House. My plea is that we can use this way of working on other Bills in the future. This has been exemplary and I am very grateful that we have reached this point.

Online Safety Bill

Baroness Harding of Winscombe Excerpts
Lord Parkinson of Whitley Bay Portrait The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)
- View Speech - Hansard - - - Excerpts

My Lords, the Government are committed to protecting children against accessing pornography online. As technology evolves, it is important that the regulatory framework introduced by the Bill keeps pace with emerging risks to children and exposure to pornography in new forms, such as generative artificial intelligence.

Part 5 of the Bill has been designed to be future-proof, and we assess that it would already capture AI-generated pornography. Our Amendments 206 and 209 will put beyond doubt that content is “provider pornographic content” where it is published or displayed on a Part 5 service by means of an automated tool or algorithm, such as a generative AI bot, made available on the service by a provider. Amendments 285 and 293 make clear that the definition of an automated tool includes a bot. Amendment 276 clarifies the definition of a provider of a Part 5 service, to make clear that a person who controls an AI bot that generates pornography can be regarded as the provider of a service.

Overall, our amendments provide important certainty for users, providers and Ofcom on the services and content in scope of the Part 5 duties. This will ensure that the new, robust duties for Part 5 providers to use age verification or age estimation to prevent children accessing provider pornographic content will also extend to AI-generated pornography. I beg to move.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- View Speech - Hansard - -

My Lords, the noble Baroness, Lady Kidron, has unfortunately been briefly detained. If you are surprised to see me standing up, it is because I am picking up for her. I start by welcoming these amendments. I am grateful for the reaction to the thought-provoking debate that we had in Committee. I would like to ask a couple of questions just to probe the impact around the edges.

Amendment 27 looks as if it implies that purely content-generating machine-learning or AI bots could be excluded from the scope of the Bill, rather than included, which is the opposite of what we were hoping to achieve. That may be us failing to understand the detail of this large body of different amendments, but I would welcome my noble friend the Minister’s response to make sure that in Amendment 27 we are not excluding harm that could be generated by some form of AI or machine-learning instrument.

Maybe I can give my noble friend the Minister an example of what we are worried about. This is a recent scenario that noble Lords may have seen in the news, of a 15 year-old who asked, “How do I have sex with a 30 year-old?”. The answer was given in forensic detail, with no reference to the fact that it would in fact be statutory rape. Would the regulated service, or the owner of the regulated service that generated that answer, be included or excluded as a result of Amendment 27? That may be my misunderstanding.

This group is on AI-generated pornography. My friend, the noble Baroness, Lady Kidron, and I are both very concerned that it is not just about pornography, and that we should make sure that AI is included in the Bill. Specifically, many of us with teenage children will now be learning how to navigate the Snap AI bot. Would harm generated by that bot be captured in these amendments, or is it only content that is entirely pornographic? I hope that my noble friend the Minister can clarify both those points, then we will be able to support all these amendments.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - - - Excerpts

My Lords, I rise briefly to welcome the fact that there is a series of amendments here where “bot” is replaced by

“bot or other automated tool”.

I point out that there is often a lot of confusion about what a bot is or is not. It is something that was largely coined in the context of a particular service—Twitter—where we understand that there are Twitter bots: accounts that have been created to pump out lots of tweets. In other contexts, on other services, there is similar behaviour but the mechanism is different. It seems to me that the word “bot” may turn out to be one of those things that was common and popular at the end of the 2010s and in the early 2020s, but in five years we will not be using it at all. It will have served its time, it will have expired and we will be using other language to describe what it is that we want to capture: a human being has created some kind of automated tool that will be very context dependent, depending on the nature of the service, and they are pumping out material. It is very clear that we want to make sure that such behaviour is in scope and that the person cannot hide behind the fact that it was an automated tool, because we are interested in the mens rea of the person sitting behind the tool.

I recognise that the Government have been very wise in making sure that whenever we refer to a bot we are adding that “automated tool” language, which will make the Bill inherently much more future-proof.

--- Later in debate ---
Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- View Speech - Hansard - -

My Lords, I also speak in support of Amendments 281, 281A and 281B, tabled by the noble Lord, Lord Russell, to which I have added my name. He and, as ever, the noble Baroness, Lady Kidron, have spoken eloquently, so I am not going to spend much time on these amendments, but I want to emphasise Amendment 281A.

In the old world of direct marketing—I am old enough to remember that, when I was a marketing director, it was about sending magazines, leaflets and letters—one spent all of one’s time working out how to build loyalty: how to get people to engage longer as a result of one’s marketing communication. In the modern digital world, that dwell time has been transformed into a behavioural science of its own, which has developed a whole set of tools. Today, we have been using not only the word “activity”, at the beginning of the Bill in the new Clause 1, but also “features” and “functionality”. The reason why Amendment 281A is important is that there is a danger that the Bill keeps returning to being just about content. Even in Clause 208 on functionality, almost every item in subsection (2) mentions content, whereas Amendment 281A tries to spell out the elements of addiction-driving functionality that we know exist today.

I am certain that brilliant people will invent some more but we know that these ones exist today. I really think that we need to put them in the Bill to help everyone understand what we mean because we have spent days on this Bill—some of us have spent years, if not decades, on this issue—yet we still keep getting trapped in going straight back to content. That is another reason why I think it is so important that we get some of these functionalities in the Bill. I very much hope that, if he cannot accept the amendment today, my noble friend the Minister will go back, reflect and work out how we could capture these specific functionalities before it is too late.

I speak briefly on Amendments 28 to 30. There is unanimity of desire here to make sure that organisations such as Wikipedia and Streetmap are not captured. Personally, I am very taken—as I often am—by the approach of the noble Baroness, Lady Kidron. We need to focus on risk rather than using individual examples, however admirable they are today. If Wikipedia chose to put on some form of auto-scroll, the risk of that service would go up; I am not suggesting that Wikipedia is going to do so today but, in the digital world, we should not assume that, just because organisations are charities or devoted to the public good, they cannot inadvertently cause harm. We do not make that assumption in the physical world either. Charities that put on physical events have to do physical risk assessments. I absolutely think that we should hold all organisations to that same standard. However, viewed through the prism of risk, Wikipedia—brilliant as it is—does not have a risk for child safety and therefore should not be captured by the Bill.

Lord Bishop of Oxford Portrait The Lord Bishop of Oxford
- View Speech - Hansard - - - Excerpts

My Lords, I broadly support all the amendments in this group but I will focus on the three amendments in the names of the noble Lord, Lord Russell, and others; I am grateful for their clear exposition of why these amendments are important. I draw particular attention to Amendment 281A and its helpful list of functions that are considered to be harmful and to encourage addiction.

There is a very important dimension to this Bill, whose object, as we have now established, is to encourage safety by design. An important aspect of it is cleaning up, and setting right, 20 years or more of tech development that has not been safe by design and has in fact been found to be harmful by way of design. As the noble Baroness, Lady Harding, just said, in many conversations and in talking to people about the Bill, one of the hardest things to communicate and get across is that this is about not only content but functionality. Amendment 281A provides a useful summary of the things that we know about in terms of the functions that cause harm. I add my voice to those encouraging the Minister and the Government to take careful note of it and to capture this list in the text of the Bill in some way so that this clean-up operation can be about not only content for the future but functionality and can underline the objectives that we have set for the Bill this afternoon.

--- Later in debate ---
Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- View Speech - Hansard - - - Excerpts

My Lords, interestingly, because I have not discussed this at all with the noble Lord, Lord Moylan, I have some similar concerns to his. I have always wanted this to be a children’s online safety Bill. My concerns generally have been about threats to adults’ free speech and privacy and the threat to the UK as the home of technological innovation. I have been happy to keep shtum on things about protecting children, but I got quite a shock when I saw the series of government amendments.

I thought what most people in the public think: the Bill will tackle things such as suicide sites and pornography. We have heard some of that very grim description, and I have been completely convinced by people saying, “It’s the systems”. I get all that. But here we have a series of amendments all about content—endless amounts of content and highly politicised, contentious content at that—and an ever-expanding list of harms that we now have to deal with. That makes me very nervous.

On the misinformation and disinformation point, the Minister is right. Whether for children or adults, those terms have been weaponised. They are often used to delegitimise perfectly legitimate if contrary or minority views. I say to the noble Baroness, Lady Kidron, that the studies that say that youth are the fastest-growing far-right group are often misinformation themselves. I was recently reading a report about this phenomenon, and things such as being gender critical or opposing the small boats arriving were considered to be evidence of far-right views. That was not to do with youth, but at least you can see that this is quite a difficult area. I am sure that many people even in here would fit in the far right as defined by groups such as HOPE not hate, whose definition is so broad.

My main concerns are around the Minister’s Amendment 172. There is a problem: because it is about protected characteristics—or apes the protected characteristics of the Equality Act—we might get into difficulty. Can we at least recognise that, even in relation to the protected characteristics as noted in the Equality Act, there are raging rows politically? I do not know how appropriate it is that the Minister has tabled an amendment dragging young people into this mire. Maya Forstater has just won a case in which she was accused of being opposed to somebody’s protected characteristics and was sacked; because her philosophical beliefs are themselves a protected characteristic, she has won the case and a substantial amount of money.

I worry when I see this kind of list. It is not just inciting hatred—in any case, what that would mean is ambiguous. It refers to abuse based on race, religion, sex, sexual orientation, disability and so on. This is a minefield for the Government to have wandered into. Whether you like it or not, it will have a chilling effect on young people’s ability to debate and discuss. If you worry that some abuse might be aimed at religion, does that mean that you will not be able to discuss Charlie Hebdo? What if you wanted to show or share the Charlie Hebdo cartoons? Will that count? Some people would say that is abusive or inciteful. This is not where the Bill ought to be going. At the very least, it should not be going there at this late stage. Under race, it says that “nationality” is one of the indicators that we should be looking out for. Maybe it is because I live in Wales, but there is a fair amount of abuse aimed at the English. A lot of Scottish friends dole it out as well. Will this count for young people who do that? I cannot get it.

My final question is in relation to proposed subsection (11). This is about protecting children, yet it lists a person who

“has the characteristic of gender reassignment if the person is proposing to undergo, is undergoing or has undergone a process (or part of a process) for the purpose of reassigning the person’s sex by changing physiological or other attributes of sex”.

Are the Government seriously accepting that children have not just proposed to reassign but have been reassigned? That is a breach of the law. That is not meant to be happening. Your Lordships will know how bad this is. Has the Department for Education seen this? As we speak, it is trying to untangle the freedom for people not to have to go along with people’s pronouns and so on.

This late in the day, on something as genuinely important as protecting children, I just want to know whether there is a serious danger that this has wandered into the most contentious areas of political life. I think it is very dangerous for a government amendment to affirm gender reassignment to and about children. It is genuinely irresponsible and goes against the guidance the Government are bringing out at the moment for us to avoid. Please can the Minister clarify what is happening with Amendment 172?

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- View Speech - Hansard - -

My Lords, I am not entirely sure how to begin, but I will try to make the points I was going to make. First, I would like to respond to a couple of the things said by the noble Baroness, Lady Fox. With the greatest respect, I worry that the noble Baroness has not read the beginning of the proposed new clause in Amendment 172, subsection (2), which talks about “Content which is abusive”, as opposed to content just about race, religion or the other protected characteristics.

One of the basic principles of the Bill is that we want to protect our children in the digital world in the same way that we protect them in the physical world. We do not let our children go to the cinema to watch content as listed in the primary priority and priority content lists in my noble friend the Minister’s amendments. We should not let them in the digital world, yet the reality is that they do, day in and day out.

I thank my noble friend the Minister, not just for the amendments that he has tabled but for the countless hours that he and his team have devoted to discussing this with many of us. I have not put my name to the amendments either because I have some concerns but, given the way the debate has turned, I start by thanking him and expressing my broad support for having the harms in the Bill, the importance of which this debate has demonstrated. We do not want this legislation to take people by surprise. The important thing is that we are discussing some fundamental protections for the most vulnerable in our society, so I thank him for putting those harms in the Bill and for allowing us to have this debate. I fear that it will be a theme not just of today but of the next couple of days on Report.

I started with the positives; I would now like to bring some challenges as well. Amendments 171 and 172 set out priority content and primary priority content. It is clear that they do not cover the other elements of harm: contact harms, conduct harms and commercial harms. In fact, it is explicit that they do not cover the commercial harms, because proposed new subsection (4) in Amendment 237 explicitly says that no amendment can be made to the list of harms that is commercial. Why do we have a perfect crystal ball that means we think that no future commercial harms could be done to our children through user-to-user and search services, such that we are going to expressly make it impossible to add those harms to the Bill? It seems to me that we have completely ignored the commercial piece.

I move on to Amendment 174, which I have put my name to. I am absolutely aghast that the Government really think that age-inappropriate sexualised content does not count as priority content. We are not necessarily talking here about a savvy 17 year-old. We are talking about four, five and six year-olds who are doomscrolling on various social media platforms. That is the real world. To suggest that somehow the digital world is different from the old-fashioned cinema, and a place where we do not want to protect younger children from age-inappropriate sexualised material, just seems plain wrong. I really ask my noble friend the Minister to reconsider that element.

I am also depressed about the discussion that we had about misinformation. As I said in Committee several times, I have two teenage girls. The reality is that we are asking today’s teenagers to try to work out what is truth and what is misinformation. My younger daughter will regularly say, “Is this just something silly on the internet?” She does not use the term “misinformation”; she says, “Is that just unreal, Mum?” She cannot always tell what is real in her social media feeds because of the degree of misinformation. Failing to recognise that misinformation is a harm for young people who do not yet know how to validate sources, which was so much easier for us when we were growing up than it is for today’s generations, is a big glaring gap, even in the content element of the harms.

I support the principle behind these amendments, and I am pleased to see the content harms named. We will come back next week to the conduct and contact harms—the functionality—but I ask my noble friend the Minister to reconsider on both misinformation and inappropriate sexualised material, because we are making a huge mistake by failing to protect our children from them.

--- Later in debate ---
Lord Moylan Portrait Lord Moylan (Con)
- Hansard - - - Excerpts

My Lords, these words have obviously appeared in the Bill in one of those unverified sections; I have clicked the wrong button, so I cannot see them. Where does it say in Amendment 172 that it has to be a consistent flow?

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - -

May I attempt to assist the Minister? This is the “amber” point described by the noble Lord, Lord Allan: “priority content” is not the same as “primary priority content”. Priority content is our amber light. Even the most erudite and scholarly description of baby eating is not appropriate for five year-olds. We do not let it go into “Bod” or any of the other programmes we all grew up on. This is about an amber warning: that user-to-user services must have processes that enable them to assess the risk of priority content and primary priority content. It is not black and white, as my noble friend is suggesting; it is genuinely amber.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

My Lords, we may be slipping back into a Committee-style conversation. My noble friend Lord Moylan rightly says that this is the first chance we have had to examine this provision, which is a concession wrung out of the Government in Committee. As the noble Lord, Lord Stevenson, says, sometimes that is the price your Lordships’ House pays for winning these concessions, but it is an important point to scrutinise in the way that my noble friend Lord Moylan and the noble Baroness, Lady Fox, have done.

I will try to reassure my noble friend and the noble Baroness. This relates to the definition of a characteristic with which we began our debates today. To be a characteristic it has to be possessed by a person; therefore, the content that is abusive and targets any of the characteristics has to be harmful to an individual to meet the definition of harm. Further, it has to be material that would come to the attention of children in the way that the noble Baronesses who kindly leapt to my defence and added some clarity have set out. So my noble friend would be able to continue to criticise the polytheistic religions of the past and their tendencies to his heart’s content, but there would be protections in place if what he was saying was causing harm to an individual—targeting them on the basis of their race, religion or any of those other characteristics—if that person was a child. That is what noble Lords wanted in Committee, and that is what the Government have brought forward.

My noble friend and others asked why mis- and disinformation were not named as their own category of priority harmful content to children. Countering mis- and disinformation where it intersects with the named categories of primary priority or priority harmful content, rather than as its own issue, will ensure that children are protected from the mis- and disinformation narratives that present the greatest risk of harm to them. We recognise that mis- and disinformation is a broad and cross-cutting issue, and we therefore think the most appropriate response is to address directly the most prevalent and concerning harms associated with it; for example, dangerous challenges and hoax health advice encouraging children to self-administer harmful substances. I assure noble Lords that any further harmful mis- and disinformation content will be captured as non-designated content where it presents a material risk of significant harm to an appreciable number of children.

In addition, the expert advisory committee on mis- and disinformation, established by Ofcom under the Bill, will have a wide remit in advising on the challenges of mis- and disinformation and how best to tackle them, including how they relate to children. Noble Lords may also have seen that the Government have recently tabled amendments to update Ofcom’s statutory media literacy duty. Ofcom will now be required to prioritise users’ awareness of and resilience to misinformation and disinformation online. This will include children and their awareness of and resilience to mis- and disinformation.

My noble friend Lady Harding of Winscombe talked about commercial harms. Harms exacerbated by the design and operation of a platform—that is, its commercial model—are covered in the Bill already through the risk assessment and safety duties. Financial harm, as used in government Amendment 237, is dealt with by a separate legal framework, including the Consumer Protection from Unfair Trading Regulations. This exemption ensures that there is no regulatory overlap.

The noble Lord, Lord Russell of Liverpool, elaborated on remarks made earlier by the noble Lord, Lord Stevenson of Balmacara, about their meeting looking at the incel movement, if it can be called that. I assure the noble Lord and others that Ofcom has a review and report duty and will be required to stay on top of changes in the online harms landscape and report to government on whether it recommends changes to the designated categories of content because of the emerging risks that it sees.

The noble Baroness, Lady Kidron, anticipated the debate we will have on Monday about functionalities and content. I am grateful to her for putting her name to so many of the amendments that we have brought forward. We will continue the discussions that we have been having on this point ahead of the debate on Monday. I do not want to anticipate that now, but I undertake to carry on those discussions.

In closing, I reiterate what I know is the shared objective across your Lordships’ House—to protect children from harmful content and activity. That runs through all the government amendments in this group, which cover the main categories of harmful content and activity that, sadly, too many children encounter online every day. Putting them in primary legislation enables children to be swiftly protected from encountering them. I therefore hope that noble Lords will be heartened by the amendments that we have brought forward in response to the discussion we had in Committee.

Online Safety Bill

Baroness Harding of Winscombe Excerpts
Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, I rise to speak to all the amendments in this group. It is a cause of great regret that, despite many private meetings with officials, government lawyers and Ministers, we have not yet come to an agreement that would explicitly include in the Bill harm that does not derive from content. I will be listening very carefully to the Minister, if he should change his mind during the debate.

The amendments in this group fall into three categories. First, there is a series of amendments in my name and those of the noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford: Amendments 35, 36, 37A and 85. I hope the Government will accept them as consequential because, in meetings last week, they would not accept that harm to children can arise from the functionality and design of services and not just from the content. Each of these amendments simply makes it clear that harm can arise in the absence of content: nothing more, nothing less. If the Minister agrees that harm may derive from the design of products and services, can he please explain, when he responds, why these amendments are not acceptable? Simply put, it is imperative that the features, functionalities or behaviours that are harmful to children, including those enabled or created by the design or operation of the service, are in scope of the Bill. This would make it utterly clear that a regulated company has a duty to design its service in a manner that does not harm children.

The Government have primary priority harmful content, priority content or non-designated harmful content, the latter being a category that is yet to be defined, but not the harm that emerges from how the regulated company designs its service. For example, there are the many hundreds of small reward loops that make up a doomscroll or make a game addictive; commercial decisions such as the one Pokémon famously made for a time, which was to end every game in a McDonald’s car park; or, more sinister still, the content-neutral friend recommendations that introduce a child to other children like them, while pushing children into siloed groups. For example, they deliberately push 13 year-old boys towards Andrew Tate—not for any content reason, but simply on the basis that 13 year-old boys are like each other and one of them has already been on that site.

The impact of a content-neutral friend recommendation has rocked our schools as female teachers and girls struggle with the attitudes and actions of young boys, and has torn through families, who no longer recognise their sons and brothers. To push hundreds of thousands of children towards Andrew Tate for no reason other than to benefit commercially from the network effect is a travesty for children and it undermines parents.

The focus on content is old-fashioned and looks backwards. The Bill is drafted as if it has particular situations and companies in mind but does not think about how fast the business moves. When we started the Bill, none of us thought about the impact of TikTok; last week, we saw a new service, Threads, go from zero to 70 million users in a single day. It is an act of stunning hubris to be so certain of the form of harm. To be unprepared to admit that some harm arises simply from design means that, despite repeated denials, this is just a content Bill. The promise of systems and processes being at the heart of the Bill has been broken.

The second set of amendments in this group are in the name of my noble friend Lord Russell. Amendments 46 and 90 further reveal the attitude of the Government, in that they are protecting the companies rather than putting them four-square in the middle of their regime. The Government specifically exempt the manner of dissemination from the safety duties. My noble friend Lord Russell’s amendment would leave that out and ensure that the manner of dissemination, which is fundamental to the harm that children experience, is included. Similarly, Amendment 240 would take out “presented by content” so that harm that is the result of the design decisions is included in the Bill.

The third set are government Amendments 281C and 281D, and Amendment 281F, in my name. For the avoidance of doubt, I am totally supportive of government Amendments 281C to 281E, which acknowledge the cumulative harms; for example, those that Molly Russell experienced as she was sent more and more undermining and harmful content. In so far as they are a response to my entreaties, and those of other noble Lords, that we ensure that cumulative harmful content is the focus of our concerns, I am grateful to the Government for tabling them. However, I note that the Government have conceded only the role of cumulative harm for content. Amendments 281D and 281E once again talk about content as the only harm to children.

The noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford have added their names to Amendment 281F, and I believe I am right in saying that, if there were not a limit of four names, a great many Peers would have added theirs also. For the benefit of the House, I will quote directly from the amendment:

“When in relation to children, references to harm include the potential impact of the design and operation of a regulated service separately and additionally from harms arising from content, including the following considerations … the potential cumulative impact of exposure to harm or a combination of harms … the potential for harm to result from features, functionalities or behaviours enabled or created by the design and operation of services … the potential for some features and functionalities within a service to be higher risk than other aspects of the service … that a service may, when used in conjunction with other services, facilitate harm to a child on a different service … the potential for design strategies that exploit a child’s developmental vulnerabilities to create harm, including validation metrics and compulsive reward loops … the potential for real time services, features and functionalities such as geolocation, livestream broadcasts or events, augmented and virtual environments to put children at immediate risk … the potential for content neutral systems that curate or generate environments, content feeds or contacts to create harm to children … that new and emerging harms may arise from artificial intelligence, machine generated and immersive environments”.


Before I continue, I ask noble Lords to consider which of those things they would not like for their children, grandchildren or, indeed, other people’s children. I have accepted that the Government will not add the schedule of harms as I first laid it: the four Cs of content, conduct, contact and commercial harms. I have also accepted that the same schedule, written in the less comfortable language of primary priority, priority and non-designated harms, has also been rejected. However, the list that I just set out, and the amendment to the duties that reflect those risks, would finally put the design of the system at the heart of the Bill. I am afraid that, in spite of all our conversations, I cannot accept the Government’s argument that all harm comes from content.

Even if we are wrong today—which we are most definitely not—in a world of AI, immersive tech and augmented reality, is it not dangerous and, indeed, foolish, to exclude harm that might come from a source other than content? I imagine that the Minister will make the argument that the features are covered in the risk assessment duties and that, unlike content, features may be good or bad so they cannot be characterised as harmful. To that I say: if the risk assessment is the only process that matters, why do the Government feel it necessary to define the child safety duties and the interpretation of harm? The truth is, they have meaning. In setting out the duty of a company to a child, why would the Government not put the company’s design decisions right at the centre of that duty?

As for the second part of the argument, a geolocation feature may of course be great for a map service but less great if it shows the real-time location of a child to a predator, and livestreaming from a school concert is very different from livestreaming from your bedroom. Just as the noble Lord, Lord Allan, explained on the first day of Report, there are things that are red lines and things that are amber; in other words, they have to be age-appropriate. This amendment does not seek—nor would it mean—that individual features or functionalities would be prevented, banned or stopped. It would mean that a company had a duty to make sure that its features and functionalities were age-appropriate and did not harm children—full stop. There would be no reducing this to content.

Finally, I want to repeat what I have said before. Sitting in the court at Molly Russell’s inquest, I watched the Meta representative contest content that included blood cascading down the legs of a young woman, messages that said, “You are worthless”, and snippets of film of people jumping off buildings. She said that none of those things met the bar of harmful content according to Meta’s terms and conditions.

Like others, I believe that the Online Safety Bill could usher in a new duty of care towards children, but it is a category error not to see harm in the round. Views on content can always differ but the outcome on a child is definitive. It is harm, not harmful content, that the Bill should measure. If the Minister does not have the power to accede, I will, with great regret, be testing the opinion of the House. I beg to move.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- View Speech - Hansard - -

My Lords, as so often in the course of the Bill, I associate myself wholeheartedly with the comments that the noble Baroness, Lady Kidron, just made. I, too, thank my noble friend the Minister and the Secretary of State for listening to our debates in Committee on the need to be explicit about the impact of cumulative harmful content. So I support Amendments 281C, 281D and 281E, and I thank them for tabling them.

--- Later in debate ---
Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- View Speech - Hansard - -

My Lords, as somebody who is only five feet and two inches, I have felt that size does not matter for pretty much all my life and have long wanted to say that in a speech. This group of amendments is really about how size does not matter; risk does. I will briefly build on the speech just given, very eloquently as usual, by the noble Lord, Lord Allan, to describe why risk matters more than size.

First, there are laws for which size does matter—small companies do not need to comply with certain systems and processes—but not those concerned with safety. I have in my mind’s eye the small village fête, where we expect a risk assessment if we are to let children ride on rides. That was not the case 100 years ago, but is today because we recognise those dangers. One of the reasons why we stumbled into thinking that size should matter in this Bill is that we are not being honest about the scale of the risk for our children. If the risk is large enough, we should not be worrying about size; we should be worrying about that risk. That is the first reason why we have to focus on risk and not size.

The second reason follows from what I have just said—the principles of the physical world should apply to the online world. That is one of the core tenets of this Bill. That means that if you recognise the real and present risks of the digital world you have to say that it does not matter that only a small number of people are affected. If it is a small business, it still has an obligation not to put people in harm’s way.

Thirdly, small becomes big very quickly—unfortunately, that has not been true for me, but it is true in the digital world as Threads has just shown us. Fourthly, we also know that in the digital world re-engineering something once it has got very big is really difficult. There is also a practical reason why you want engineers to think about the risks before they launch services rather than after the event.

We keep being told, rightly, that this is a Bill about systems and processes. It is a Bill where we want not just the outcomes that the noble Lord, Lord Allan, has referred to in terms of services in the UK genuinely being safer; we are trying to effect a culture change. I would argue one of the most important culture changes is that any bright, young tech entrepreneur has to start by thinking about the risks and therefore the safety procedures they need to put in place as they build their tech business from the ground up and not once they have reached some artificial size threshold.

Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I have to admit that it was incompetence rather than lack of will that meant I did not add my name to Amendment 39 in the name of the noble Lord, Lord Bethell, and I would very much like the Government to accept his argument.

In the meantime, I wonder whether the Minister would be prepared to make it utterly clear that proportionality does not mean a little bit of porn to a large group of children or a lot of porn to a small group of children; rather, it means that high-risk situations require effective measures and low-risk situations should be proportionate to that. On that theme, I say to the noble Lord, Lord Allan, whose points I broadly agree with, that while we would all wish to see companies brought into the fold rather than being out of the fold, it rather depends on their risk.

This brings me neatly to Amendments 43 and 87 from the noble Lord, Lord Russell, to which I managed to add my name. They make a very similar point to Amendment 39 but across safety duties. Amendment 242 in my name, to which the noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford have added their names, makes the same point—yet again—in relation to Ofcom’s powers.

All these things are pointing in the same direction as Amendment 245 in the name of the noble Baroness, Lady Morgan, which I keep on trumpeting from these Benches and which offers an elegant solution. I urge the Minister to consider Amendment 245 before day four of Report because, if the Government were to accept it, it would focus company resources, focus Ofcom resources and, as we discussed on the first day of Report, permit companies which do not fit the risk profile of the regime, and which are unable to comply with something that does not fit their model yet leaves them vulnerable to enforcement, to be treated in an appropriate way.

Collectively, the ambition is to make sure that we are treating things in proportion to the risk and that proportionate does not start meaning something else.

Online Safety Bill

Baroness Harding of Winscombe Excerpts
I have another question, on Amendment 249, which is on information notices specifically about child deaths. I do not want to broaden this out, but we need to flag that we will need some clarity around what assistance can be given to people where the death is of someone who is not a child. There will be situations that are important to families and where everyone has a huge amount of sympathy but where we are not dealing with a child. Again, it is right that we have this specific set of measures around deceased children, but we should expect that Ofcom will be asked, “What about other circumstances?” We need a reasonable answer to that: that other things are in place. I hope that the answer will be that, if it is a serious enough case, without the information notice powers Ofcom could still, under Amendment 273 as I read it, look into other deaths that involve adults, in addition to exercising the specific powers it has in relation to children. I would appreciate clarification from the Minister.
Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- View Speech - Hansard - -

My Lords, given the hour, I will be brief. I wanted to thank my noble friend the Minister and the Secretary of State, and to congratulate my friend the noble Baroness, Lady Kidron, on such an important group. It is late at night and not many of us are left in the Chamber, but this is an important thing that they have succeeded in doing together, and it is important that we mark that. It is also a hugely important thing that the bereaved families for justice have achieved, and I hope that they have achieved a modicum of calm from having made such a big difference for future families.

I will make one substantive point, referencing where my noble friend the Minister talked about future Bills. In this House and in this generation, we are building the legal scaffolding for a digital world that already exists. The noble Lord, Lord Allan of Hallam, referenced the fact that much of this was built without much thought—not maliciously but just without thinking about the real world, life and death. In Committee, I was taken by the noble Lord, Lord Knight, mentioning the intriguing possibility of using the Data Protection and Digital Information Bill to discuss data rights and to go beyond the dreadful circumstances that these amendments cover to make the passing on of your digital assets something that is a normal part of our life and death. So I feel that this is the beginning of a series of discussions, not the end.

I hope that my noble friend the Minister and whichever of his and my colleagues picks up the brief for the forthcoming Bill can take to heart how we have developed all this together. I know that today has perhaps not been our most wholly collaborative day, but, in general, I think we all feel that the Bill is so much the better for the collaborative nature that we have all brought to it, and on no more important a topic than this amendment.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

My Lords, I will be extremely brief. We have come a very long way since the Joint Committee made its recommendations to the Government, largely, I think, as a result of the noble Baroness, Lady Kidron. I keep mistakenly calling her “Baroness Beeban”; familiarity breeds formality, or something.

I thank the Minister and the Secretary of State for what they have done, and the bereaved families for having identified these issues. My noble friend Lord Allan rightly identified the sentiments as grief and anger at what has transpired. All we can do is try to do, in a small way, what we can to redress the harm that has already been done. I was really interested in his insights into how a platform will respond and how this will help it through the process of legal orders and the data protection issues involved with a public authority.

My main question to the Minister is in that context—the relationship with the Information Commissioner’s Office—because there are issues here. There is, if you like, an overlap of jurisdiction with the ICO, because the potential or actual disclosure of personal data is involved, and therefore there will necessarily have to be co-operation between the ICO and Ofcom to ensure the most effective regulatory response. I do not know whether that has emerged on the Minister’s radar, but it certainly has emerged on the ICO’s radar. Indeed, in the ideal world, there probably should be some sort of consultation requirement on Ofcom to co-operate with the Information Commissioner in these circumstances. Anything that the Minister can say on that would be very helpful.

Again, this is all about reassurance. We must make sure that we have absolutely nailed down all the data protection issues involved in the very creative way the Government have responded to the requests of the bereaved families so notably championed by the noble Baroness, Lady Kidron.

Online Safety Bill

Baroness Harding of Winscombe Excerpts
Viscount Colville of Culross Portrait Viscount Colville of Culross (CB)
- View Speech - Hansard - - - Excerpts

My Lords, the codes of practice are among the most important documents that Ofcom will produce as a result of the Bill—in effect, deciding what content we, the users of the internet, will see. The Government’s right to modify these drafts affects us all, so it is absolutely essential that the codes are trusted.

I, too, welcome the Government’s Amendments 134 to 138, which are a huge improvement on the Clause 39 that was presented in Committee. I am especially grateful that the Government have not proceeded with including economic conditions as a reason for the Secretary of State to modify draft codes, which the noble Baroness, Lady Harding, pointed out in Committee would be very damaging. But I would like the Minister to go further, which is why I put my name to Amendments 139, 140, 144 and 145.

Amendment 139 is so important at the moment. My fear is about the opt-out from publishing these directions from the Secretary of State for Ofcom to modify the draft codes, which will then allow them to be made behind closed doors between the Government and the regulator. This should not be allowed to happen. It would happen at a time when trust in the Government is low and there is a feeling that so many decisions affecting us all are taken without our knowledge. Surely it is right that there should be as much transparency as possible in exposing the pressure that the Minister is placing on the regulator. I hope that, if this amendment is adopted, it will allow Parliament to impose the bright light of transparency on the entire process, which is in danger of becoming opaque.

I am sure that no one wants a repeat of what happened under Section 94 of the Telecommunications Act 1984, which gave the Secretary of State power to give directions of a “general character” to anyone, in the “interests of national security” or international relations, without having to disclose important information to Parliament. The Minister’s power to operate in total secrecy, without any accountability to Parliament, was seen by many as wrong and undemocratic. It was subsequently repealed. Amendments 139 and 140 will prevent the creation of a similar problem.

Likewise, I support Amendment 144, which builds on the previous amendments, as another brake on the control of the Secretary of State over this important area of regulations. Noble Lords in this House know how much the Government dislike legislative ping-pong—which we will see later this evening, I suspect. I ask the Minister to transfer this dislike to limiting ping-pong between the Government and the regulator over the drafting of codes of practice. It would also prevent the Secretary of State or civil servants expanding their control of the draft codes of practice from initial parameters to slightly wider sets of parameters each time that they are returned to the Minister for consideration. It would force the civil servants and the Secretary of State to make a judgment on the limitation of content and ensure that they stick to it. As it is, the Secretary of State has two bites of the cherry. They are involved in the original shaping of the draft codes of practice and then they can respond to Ofcom’s formulation. I hope the Minister would agree that it is sensible to stop this process from carrying on indefinitely. I want the users of the digital world to have full faith that the control of online content they see is above board—and not the result of secretive government overreach.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- View Speech - Hansard - -

My Lords, not for the first time I find myself in quite a different place from my noble friend Lord Moylan. Before I go through some detailed comments on the amendments, I want to reflect that at the root of our disagreement is a fundamental view about how serious online safety is. The logical corollary of my noble friend’s argument is that all decisions should be taken by Secretaries of State and scrutinised in Parliament. We do not do that in other technical areas of health and safety in the physical world and we should not do that in the digital world, which is why I take such a different view—

--- Later in debate ---
Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - -

Perhaps the noble Lord will allow me to make my point. I really welcome the government amendments in this group. I thank my noble friend the Minister for bringing them forward and for listening hard to the debates that we had at Second Reading and in Committee. I am very pleased to see the removal of economic policy and the burdens to business as one of the reasons that a Secretary of State could issue directions. I firmly believe that we should not be putting Secretaries of State in the position of having to trade off safety for economic growth. The reality is that big tech has found it impossible to make those trade-offs too. People who work in these companies are human beings. They are looking for growth in their businesses. Secretaries of State are rightly looking for economic growth in our countries. We should not be putting people in the position of trying to make that trade-off. The right answer is to defer to our independent regulator to protect safety. I thank my noble friend and the Government very much for tabling these amendments.

I also support my noble friend Lady Stowell, as a member of the Communications and Digital Committee that she chairs so ably. She has brought forward a characteristically thoughtful and detailed set of amendments in an attempt to look around the corners of these powers. I urge my noble friend the Minister to see whether he can find a way through on the specific issues of infinite and secretive ping-pong. Taking the secretive first, my noble friend Lady Stowell has found a very clever way of making sure that it is not possible for future Governments to obscure completely any direction that they are giving, while at the same time not putting at risk any national secrets. It is a very thoughtful and precise amendment. I very much hope that my noble friend the Minister can support it.

On the infinite nature of ping-pong, which I feel is quite ironic today—I am not sure anyone in this House welcomes the concept of infinite ping-pong right now, whatever our views on business later today—friends of mine in the business world ask me what is different about working in government versus working in the business world; I have worked in both big and small businesses. Mostly it is not different: people come to work wanting to do a good job and to further the objectives of the organisation that they are part of, but one of the biggest differences in government is that doing nothing and continuing to kick the can down the road is a much more viable option in the body politic than it is in the business world. Rarely is that for the good.

One of the things you learn in business is that doing nothing is often the very worst thing you can do. My worry about the infinite nature of the ping-pong is that it relates to a technical business world that moves unbelievably fast. What we do not need is to enshrine a system that enables government essentially to avoid doing anything. That is a particularly business and pragmatic reason to support my noble friend’s amendment. I stress that it is a very mild amendment. My noble friend Lady Stowell has been very careful and precise not to put unreasonable burdens on a future Secretary of State. In “Yes Minister”-speak, the bare minimum could be quite a lot. I urge my noble friend the Minister to look positively on what are extremely constructive amendments, delivered in a very thoughtful way.

Online Safety Bill

Baroness Harding of Winscombe Excerpts
Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- View Speech - Hansard - -

My Lords, I will speak briefly on a couple of amendments and pick up from where the noble Lord, Lord Allan, just finished on Amendment 186A. I associate myself with all the comments that the noble Baroness, Lady Kidron, made on her Amendment 191A. As ever, she introduced the amendment so brilliantly that there is no need for me to add anything other than my wholehearted support.

I will briefly reference Amendment 253 from the noble Lord, Lord Clement-Jones. Both his amendment and my noble friend Lord Moylan’s point to one of the challenges about regulating the digital world, which is that it touches everything. We oscillate between wanting to compartmentalise the digital and recognising that it is interconnected to everything. That is the same challenge faced by every organisation that is trying to digitise: do you ring-fence or recognise that it touches everything? I am very supportive of the principles behind Amendment 253 precisely because, in the end, it does touch everything. It is hugely important that, even though this Bill and others still to come are creating an extraordinarily powerful single regulator in the form of Ofcom, we also recognise the interconnectivity of the regulatory landscape. The amendment is very well placed, and I hope my noble friend the Minister looks favourably on it and its heritage from the pre-legislative scrutiny committee.

I will briefly add my thoughts on Amendment 186A in this miscellaneous group. It feels very much as if we are having a Committee debate on this amendment, and I thank my noble friend Lord Moylan for introducing it. He raises a hugely important point, and I am incredibly sympathetic to the logic he set out.

In this area the digital world operates differently from the physical world, and we do not have the right balance at all between the powers of the big companies and consumer rights. I am completely with my noble friend in the spirit in which he introduced the amendment but, together with the noble Lord, Lord Allan, I think it would be better tackled in the Digital Markets, Competition and Consumers Bill, precisely because it is much broader than online safety. This fundamentally touches the issue of consumer rights in the digital world and I am worried that, if we are not careful, we will do something with the very best intentions that actually makes things slightly worse.

I worry that the terms and conditions of user-to-user services are incomprehensible to consumers today. Enshrining them as a contract in law might, in some cases, make things worse. Today, when user-to-user services have used our data for something, they are keen to tell us that we agreed to it because it was in their terms of service. My noble friend opens up a really important issue to which we should give proper attention when the Digital Markets, Competition and Consumers Bill arrives in the House. It is genuinely not too late to address that, as it is working its way through the Commons now. I thank my noble friend for introducing the amendment, because we should all have thought of the issue earlier, but it is much broader than online safety.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

My Lords, even by previous standards, this is the most miscellaneous of miscellaneous groups. We have ranged very broadly. I will speak first to Amendment 191A from the noble Baroness, Lady Kidron, which was so well spoken to by her and by the noble Baroness, Lady Harding. It is common sense, and my noble friend Lord Allan, as ever, put his finger on it: it is not as if coroners are going to come across this every day of the week; they need this kind of guidance. The Minister has introduced his amendments on this, and we need to reduce those to an understandable code for coroners and bereaved parents. I defy anybody, apart from about three Members of this House, to describe in any detail how the information notices will interlock and operate. I could probably name those Members off the top of my head. That demonstrates why we need such a code of practice. It speaks for itself.

I am hugely sympathetic to Amendment 275A in the name of the noble Baroness, Lady Finlay, who asked a series of important questions. The Minister said at col. 1773 that he would follow up with further information on the responsibility of private providers for their content. This is a real, live issue. The noble Baroness, Lady Kidron, put it right: we hope fervently that the Bill covers the issue. I do not know how many debates about future-proofing we have had on the Bill but each time, including in that last debate, we have not quite been reassured enough that we are covering the metaverse and provider content in the way we should be. I hope that this time the Minister can give us definitive chapter and verse that will help to settle the horses, so to speak, because that is exactly what the very good amendment in the name of the noble Baroness, Lady Finlay, was about.

--- Later in debate ---
For that reason, I again ask the Government, as a minimum, to accept the shorter date that was proposed or perhaps to think again before Third Reading.
Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- View Speech - Hansard - -

My Lords, I associate myself with my noble friend Lady Fraser of Craigmaddie’s incredibly well-made points. I learned a long time ago that, when people speak very softly and say they have a very small point to make, they are often about to deliver a zinger. She really did; it was hugely powerful. I will say no more than that I wholeheartedly agree with her; thank you for helping us to understand the issue properly.

I will speak in more detail about access to data for researchers and in support of my noble friend Lord Bethell’s amendments. I too am extremely grateful to the Minister for bringing forward all the government amendments; the direction of travel is encouraging. I am particularly pleased to see the movement from “may” to “must”, but I am worried that it is Ofcom’s rather than the regulated services’ “may” that moves to “must”. There is no backstop for recalcitrant regulated services that refuse to abide by Ofcom’s guidance. As the noble Baroness, Lady Kidron, said, in other areas of the Bill we have quite reasonably resorted to launching a review, requiring Ofcom to publish its results, requiring the Secretary of State to review the recommendations and then giving the Secretary of State backstop powers, if necessary, to implement regulations that would then require regulated companies to change.

I have a simple question for the Minister: why are we not following the same recipe here? Why does this differ from the other issues, on which the House agrees that there is more work to be done? Why are we not putting backstop powers into the Bill for this specific issue, when it is clear to all of us that it is highly likely that there will be said recalcitrant regulated firms that are not willing to grant access to their data for researchers?

Before my noble friend the Minister leaps to the hint he gave in his opening remarks—that this should all be picked up in the Data Protection and Digital Information Bill—unlike the group we have just discussed, this issue was discussed at Second Reading and given a really detailed airing in Committee. This is not new news; it should be dealt with in the Bill in the same way as the other issues for which we have adopted the same recipe, including a backstop. I urge my noble friend the Minister to follow the good progress so far and to complete the package, as we have in other areas.

Lord Moylan Portrait Lord Moylan (Con)
- View Speech - Hansard - - - Excerpts

My Lords, it is valuable to be able to speak immediately after my noble friend Lady Harding of Winscombe, because it gives me an opportunity to address some remarks she made last Wednesday when we were considering the Bill on Report. She suggested that there was a fundamental disagreement between us about our view of how serious online safety is—the suggestion being that somehow I did not think it was terribly important. I take this opportunity to rebut that and to add to it by saying that other things are also important. One of those things is privacy. We have not discussed privacy in relation to the Bill quite as much as we have freedom of expression, but it is tremendously important too.

Government Amendment 247A represents the most astonishing level of intrusion. In fact, I find it very hard to see how the Government think they can get away with saying that it is compatible with the provisions of the European Convention on Human Rights, which we incorporated into law some 20 years ago, thus creating a whole law of privacy that is now vindicated in the courts. It is not enough just to go around saying that it is “proportionate and necessary” as a mantra; it has to be true.

This provision says that an agency has the right to go into a private business with no warrant, and with no let or hindrance, and is able to look at its processes, data and equipment at will. I know of no other business that can be subjected to that without a warrant or some legal process in advance pertinent to that instance, that case or that business.

My noble friend Lord Bethell said that the internet has been abused by people who carry out evil things; he mentioned terrorism, for example, and he could have mentioned others. However, take mobile telephones and Royal Mail—these are also abused by people conducting terrorism, but we do not allow those communications to be intruded into without some sort of warrant or process. It does not seem to me that the fact that the systems can be abused is sufficient to justify what is being proposed.

My noble friend the Minister says that this can happen only offline. Frankly, I did not understand what he meant by that. In fact, I was going to say that I disagreed with him, but I am moving to the point of saying that I think it is almost meaningless to say that it is going to happen offline. He might be able to explain that. He also said that Ofcom will not see individual traffic. However, neither the point about being offline nor the point about not seeing individual traffic is on the face of the Bill.

When we ask ourselves what the purpose of this astonishing power is—this was referred to obliquely to some extent by the noble Baroness, Lady Fox of Buckley—we can find it in Clause 91(1), to which proposed new subsection (2A) is being added, squeezed in subordinate to it. Clause 91(1) talks about

“any information that they”—

that is, Ofcom—

“require for the purpose of exercising, or deciding whether to exercise, any of their online safety functions”.

The power could be used entirely as a fishing expedition. It could be entirely for the purpose of educating Ofcom as to what it should be doing. There is nothing here to say that it can have these powers of intrusion only if it suspects that there is criminality, a breach of the codes of conduct or any other offence. It is a fishing expedition, entirely for the purpose of

“exercising, or deciding whether to exercise”.

Those are the intrusions imposed upon companies. In some ways, I am less concerned about the companies than I am about what I am going to come to next: the intrusion on the privacy of individuals and users. If we sat back and listened to ourselves and what we are saying, could we explain to ordinary people—we are going to come to this when we discuss end-to-end encryption—what exactly can happen?

Two very significant breaches of the protections in place for privacy on the internet arise from what is proposed. First, if you allow someone into a system and into equipment, especially from outside, you increase the risk and the possibility that a further, probably more hostile party that is sufficiently well-equipped with resources—we know state actors with evil intent which are so equipped—can get in through that or similar holes. The privacy of the system itself would be structurally weakened as a result of doing this. Secondly, if Ofcom is able to see what is going on, the system becomes leaky in the direction of Ofcom. It can come into possession of information, some of which could be of an individual character. My noble friend says that it will not be allowed to release any data and that all sorts of protections are in place. We know that, and I fully accept the honesty and integrity of Ofcom as an institution and of its staff. However, we also know that things get leaked and escape. As a result of this provision, very large holes are being built into the protections of privacy that exist, yet there has been no reference at all to privacy in the remarks made so far by my noble friend.

I finish by saying that we are racing ahead and not thinking. Good Lord, my modest amendment in the last group to bring a well-established piece of legislation—the Consumer Rights Act—to bear upon this Bill was challenged on the grounds that there had not been an impact assessment. Where is the impact assessment for this? Where is even the smell test for this in relation to explaining it to the public? If my noble friend is able to expatiate at the end on the implications for privacy and attempt to give us some assurance, that would be some consolation. I doubt that he is going to give way and do the right thing and withdraw this amendment.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

My Lords, we have had some productive discussions on application stores, commonly known as “app stores”, and their role as a gateway for children accessing online services. I am grateful in particular to my noble friend Lady Harding of Winscombe for her detailed scrutiny of this area and the collaborative approach she has taken in relation to it and to her amendments, to which I will turn in a moment. These share the same goals as the amendments tabled in my name in seeking to add evidence-based duties on app stores to protect children.

The amendments in my name will do two things. First, they will establish an evidence base on the use of app stores by children and the role that app stores play in children encountering harmful content online. Secondly, following consideration of this evidence base, the amendments also confer a power on the Secretary of State to bring app stores into scope of the Bill should there be a material risk of significant harm to children on or through them.

On the evidence base, Amendment 272A places a duty on Ofcom to publish a report on the role of app stores in children accessing harmful content on the applications of regulated services. To help build a greater evidence base about the types of harm available on and through different kinds of app stores, the report will consider a broad range of these stores, which could include those available on various devices, such as smartphones, gaming devices and smart televisions. The report will also assess the use and effectiveness of age assurance on app stores and consider whether the greater use of age assurance or other measures could protect children further.

Publication of the report must be two to three years after the child safety duties come into force so as not to interfere with the Bill’s implementation timelines. This timing will also enable the report to take into account the impact of the regulatory framework that the Bill establishes.

Amendment 274A is a consequential amendment to include this report in the Bill’s broader confidentiality provisions, meaning that Ofcom will need to exclude confidential matters—for example, commercially sensitive information—from the report’s publication.

Government Amendments 236A, 236B and 237D provide the Secretary of State with a delegated power to bring app stores into the scope of regulation following consideration of Ofcom’s report. The power will allow the Secretary of State to make regulations putting duties on app stores to reduce the risks of harm presented to children from harmful content on or via app stores. The specific requirements in these regulations will be informed by the outcome of the Ofcom report I have mentioned.

As well as setting out the rules for app stores, the regulations may also make provisions regarding the duties and functions of Ofcom in regulating app stores. This may include information-gathering and enforcement powers, as well as any obligations to produce guidance or codes of practice for app store providers.

By making these amendments, our intention is to build a robust evidence base on the potential risks of app stores for children without affecting the Bill’s implementation more broadly. Should it be found that duties are required, the Secretary of State will have the ability to make robust and comprehensive duties, which will provide further layers of protection for children. I beg to move.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- View Speech - Hansard - -

My Lords, before speaking to my Amendment 239A, I thank my noble friend the Minister, the Secretary of State and the teams in both the department and Ofcom for their collaborative approach in working to bring forward this group of amendments. I also thank my cosignatories. My noble friend Lady Stowell cannot be in her place tonight but she has been hugely helpful in guiding me through the procedure, as have been the noble Lords, Lord Stevenson, Lord Clement-Jones and Lord Knight, not to mention the noble Baroness, Lady Kidron. It has been a proper cross-House team effort. Even the noble Lord, Lord Allan, who started out quite sceptical, has been extremely helpful in shaping the discussion.

I also thank the NSPCC and Barnardo’s for their invaluable advice and support, as well as Snap and Match—two companies which have been willing to stick their heads above the parapet and challenge suppliers and providers on which they are completely dependent in the shape of the current app store owners, Apple and Google.

I reassure my noble friend the Minister—and everyone else—that I have no intention of dividing the House on my amendment, in case noble Lords were worried. I am simply seeking some reassurance on a number of points where my amendments differ from those tabled by the Government—but, first, I will highlight the similarities.

As my noble friend the Minister has referred to, I am delighted that we have two packages of amendments that in both cases recognise that this was a really significant gap in the Bill as drafted. Ignoring the elements of the ecosystem that sell access to regulated services, decide age guidelines and have the ability to do age assurance was a substantial gap in the framing of the Bill. But we have also recognised together that it is very important that this is an “and” not an “or”—it is not instead of regulating user-to-user services or search but in addition to. It is an additional layer that we can bring to protect children online, and it is very important that we recognise that—and both packages do.

Online Safety Bill

Baroness Harding of Winscombe Excerpts
Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- View Speech - Hansard - -

My Lords, I promise I will be brief. I, too, welcome what the Minister has said and the amendments that the Government have proposed. This is the full package which we have been seeking in a number of areas, so I am very pleased to see it. My noble friend Lady Newlove and the noble Baroness, Lady Kidron, are not in their places, but I know I speak for both of them in wanting to register that, although the thoughtful and slow-and-steady approach has some benefits, there are also some real costs to it. The UK Safer Internet Centre estimates that there will be some 340,000 individuals in the UK who will have no recourse for action if the platforms’ complaints mechanisms do not work for them in the next two years. That is quite a large number of people, so I have one very simple question for the Minister: if I have exhausted the complaints procedure with an existing platform in the next two years, where do I go? I cannot go to Ofcom. My noble friend Lord Grade was very clear in front of the committee I sit on that it is not Ofcom’s job. Where do I go if I have a complaint that I cannot get resolved in the next two years?

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

My Lords, I declare an interest as chair of Trust Alliance Group, which operates the energy and communications ombudsman schemes, so I have a particular interest in the operation of these ADR schemes. I thank the Minister for the flexibility that he has shown in the provision about the report by Ofcom and in having backstop powers for the Secretary of State to introduce such a scheme.

Of course, I understand that the noble Baroness, Lady Newlove, and the UK Safer Internet Centre are very disappointed that this is not going to come into effect immediately, but there are advantages in not setting out the scheme at this very early point before we know what some of the issues arising are. I believe that Ofcom will definitely want to institute such a scheme, but it may be that, in the initial stages, working out the exact architecture is going to be necessary. Of course, I would have preferred to have a mandated scheme, in the sense that the report will look not at the “whether” but the “how”, but I believe that at the end of the day it will be absolutely obvious that there needs to be such an ADR scheme in order to provide the kind of redress the noble Baroness, Lady Harding, was talking about.

I also agree with the noble Baroness, Lady Morgan, that the kinds of complaints that this would cover should include fraudulent adverts. I very much hope that the Minister will be able to answer the questions that both noble Baronesses asked. As my noble friend said, will he reassure us that the department and Ofcom will not take their foot off the pedal, whatever the Bill may say?

--- Later in debate ---
Moved by
240: Clause 82, page 74, line 25, leave out “presented by content”
Member’s explanatory statement
This amendment ensures that Ofcom is empowered to consider harms presented by features, functionalities, behaviours and the design and operation of services, not just by content.
Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - -

My Lords, if I may, I shall speak very briefly, in the absence of my noble friend Lady Kidron, and because I am one of the signatories of this amendment, alongside the noble Lord, Lord Stevenson, and the right reverend Prelate the Bishop of Oxford. Amendment 240, together with a number of amendments that we will be debating today, turns on a fundamental issue that we have not yet resolved.

I came in this morning being told that we would be voting on this amendment and that other amendments later today would be consequential—I am a novice at this level of parliamentary procedure, so forgive me if I have got myself confused during the day—but I now understand that my noble friend considers this amendment to be consequential but, strangely, the amendments right at the end of the day are not. I just wanted to flag to the House that they all cover the same fundamental issue of whether harms can be unrelated to content, whether the harms of the online world can be to do with functionality—the systems and processes that drive the addiction that causes so much harm to our children.

It is a fundamental disagreement. I pay tribute to the amount of time the department, the Secretary of State and my noble friend have spent on it, but it is not yet resolved and, although I understand that I should now say that I beg leave to move the amendment formally, I just wanted to mark, with apologies, the necessity, most likely, of having to bring the same issue back to vote on later today.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - - - Excerpts

My Lords, His Majesty’s Government indeed agree that this is consequential on the other amendments, including Amendment 35, which the noble Baroness, Lady Kidron, previously moved at Report. We disagreed with them, but we lost that vote; this is consequential, and we will not force a Division on it.

We will have further opportunity to debate the fundamental issues that lie behind it, to which my noble friend Lady Harding just referred. Some of the amendments on which we may divide later were tabled by the noble Baroness, Lady Kidron, after she defeated the Government the other day, so we cannot treat them as consequential. We look forward to debating them; I will urge noble Lords not to vote for them, but we will have the opportunity to discuss them later.

--- Later in debate ---
Lord Kamall Portrait Lord Kamall (Con)
- Hansard - - - Excerpts

My Lords, I rise to speak in favour of my noble friend Lord Moylan’s amendment. Given that I understand he is not going to press it, and while I see Amendment 255 as the ideal amendment, I thank the noble Lords, Lord Stevenson and Lord Clement-Jones, for their Amendments 256, 257 and 259, and the noble Lords, Lord Clement-Jones and Lord Allan of Hallam, for Amendments 258 and 258ZA.

I will try to be as brief as I can. I think about two principles—unintended consequences and the history of technology transfer. The point about technology transfer is that once a technology is used it becomes available to other people quickly, even bad guys, whether that was intended or not. There is obviously formal technology transfer, where you have agreement or knowledge transfer via foreign investment, but let us think about the Cold War and some of the great technological developments—atomic secrets, Concorde and the space shuttle. In no time at all, the other side had that access, and that was before the advent of the internet.

If we are to open a door for access to encrypted messages, that technology will be available to the bad guys in no time at all, and they will use it against dissidents, many of whom will be in contact with journalists and human rights organisations in this country and elsewhere. Therefore, the unintended consequence may well be that in seeking to protect children in this country by accessing encrypted messages or unencrypted messages, we may well be damaging the childhoods of children in other countries when their parents, who are dissidents, are suddenly taken away and maybe the whole family is wiped out. Let us be careful about those unintended consequences.

I also welcome my noble friend Lord Parkinson’s amendments about ensuring journalistic integrity, such as Amendment 257D and others. They are important. However, we must remember that once these technologies are available, everyone has a price and that technology will be transferred to the bad guys.

Given that my noble friend Lord Moylan will not press Amendment 255, let us talk about some of the other amendments—I will make some general points rather than go into specifics, as many noble Lords have raised these points. These amendments are sub-optimal, but at least there is some accountability for Ofcom being able to use this power and using it sensibly and proportionately. One of the things that has run throughout this Bill and other Bills is “who regulates the regulators?” and ensuring that regulators are accountable. The amendments proposed by the noble Lords, Lord Stevenson and Lord Clement-Jones, and by the noble Lords, Lord Clement-Jones and Lord Allan of Hallam, go some way towards ensuring that safeguards are in place. If the Government are not prepared to have an explicit statement that they will not allow access to encrypted messages, I hope that there will be some support for the noble Lords’ amendments.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- View Speech - Hansard - -

My Lords, I promise to speak very briefly. I welcome the Government’s amendments. I particularly welcome that they appear partly to mirror some of the safeguards that are embedded in the Investigatory Powers Act 2016.

I have one question for my noble friend the Minister about the wording, “a skilled person”. I am worried that “a skilled person” is a very vague term. I have been taken all through the course of this Bill by the comparison with the Investigatory Powers Act and the need to think carefully about how we balance the importance of privacy with the imperative of protecting our children and being able to track down the most evil and wicked perpetrators online. That is very similar to the debates that we had here several years ago on the Investigatory Powers Act.

The IPA created the Technical Advisory Board. It is not a decision-making body. Its purpose is to advise the Investigatory Powers Commissioner and judicial commissioners on the impact of changing technology and the development of techniques to use investigatory powers while maintaining privacy. It is an expert panel constituted to advise the regulator—in this case, the judicial commissioner—specifically on technology interventions that must balance this really difficult trade-off between privacy and child protection. Why have we not followed the same recipe? Rather than having a skilled person, why would we not have a technology advisory panel of a similar standing, where it is clear to all who the members are? Those members would be required to produce a regular report. It might not need to be as regular as the IPA one, but it would just take what the Government have already laid one step further towards institutionalising the independent check that is really important if these Ofcom powers were ever to be used.

Baroness Stowell of Beeston Portrait Baroness Stowell of Beeston (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I added my name to some amendments on this issue in Committee. I have not done so on Report, not least because I have been so occupied with other things and have not had the time to focus on this. However, I remain concerned about this part of the Bill. I am sympathetic to my noble friend Lord Moylan’s Amendment 255, but listening to this debate and studying all the amendments in this group, I am a little confused and so have some simple questions.

First, I heard my noble friend the Minister say that the Government have no intention to require the platforms to carry out general monitoring, but is that now made explicit in any of the amendments that he has tabled? Regarding the amendments which would bring further safeguards around the oversight of Ofcom’s use of this power, like my noble friend Lady Harding, I have always been concerned that the oversight approach should be in line with that for the Investigatory Powers Act and could never understand why it was not in the original version of the Bill. Like her, I am pleased that the Government have tabled some amendments, but I am not yet convinced that they go far enough.

That leads me to the amendments that have been tabled by the noble Lords, Lord Stevenson and Lord Clement-Jones, and particularly that in the name of the noble Lord, Lord Allan of Hallam. As his noble friend Lord Clement-Jones has added his name to it, perhaps he could answer my question when he gets up. Would the safeguards that are outlined there—the introduction of the Information Commissioner—meet the concerns of the big tech companies? Do we know whether it would meet their needs and therefore lead them not to feel it necessary to withdraw their services from the UK? I am keen to understand that.

There is another thing that might be of benefit for anyone listening to this debate who is not steeped in the detail of this Bill, and I look to any of those winding up to answer it—including my noble friend the Minister. Is this an end to end-to-end encryption? Is that what is happening in this Bill? Or is this about ensuring that what is already permissible in terms of the authorities being able to use their powers to go after suspected criminals is somehow codified in this Bill to make sure it has proper safeguards around it? That is still not clear. It would be very helpful to get that clarity from my noble friend, or others.

--- Later in debate ---
Moved by
281BA: Clause 208, page 175, line 5, at end insert—
“(3A) In this Act “functionality”, in relation to a regulated service, includes the design of systems and processes that engage or impact on users, particularly algorithms.”
Member’s explanatory statement
This amendment clarifies that system design can impact on outcomes for users, in light of the requirement for systems to be safe by design.
--- Later in debate ---
Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - -

My Lords, I note that the noble Lord, Lord Stevenson, is no longer in his place, but I promise to still try to live by his admonition to all of us to speak briefly.

I will speak to Amendments 281BA, 281FA, 286A and 281F, which has already been debated but is central to this issue. These amendments aim to fix a problem we repeatedly raised in Committee and on Report. They are also in the name of the noble Baroness, Lady Kidron, and the noble Lords, Lord Stevenson and Lord Clement-Jones, and build on amendments laid in Committee by the noble Lord, Lord Russell, my noble friend Lord Bethell and the right reverend Prelate the Bishop of Oxford. This issue has broad support across the whole House.

The problem these amendments seek to solve is that, while the Government have consistently asserted that this is a systems and processes Bill, the Bill is constructed in a manner that focuses on content. Because this is a rerun of previous debates, I will try to keep my remarks short, but I want to be clear about why this is a real issue.

I am expecting my noble friend the Minister to say, as he has done before, that this is all covered; we are just seeing shadows, we are reading the Bill wrong and the harms that we are most concerned about are genuinely in the Bill. But I really struggle to understand why, if they are in the Bill, stating them clearly on the face of the Bill creates the legal uncertainty that seems to be the Government’s favourite problem with each of the amendments we have been raising today.

My noble friend—sorry, my friend—the noble Baroness, Lady Kidron, commissioned a legal opinion that looked at the statements from the Government and compared them to the text in the Bill. That opinion, like that of the many noble Lords I have just mentioned, is that the current language in the Bill about features and functionalities pertains only in so far as it relates to harmful content. All roads in this game of Mornington Crescent lead back to content.

Harmful content is set out in a schedule to the Bill, and this set of amendments ensures that the design of services, irrespective of content, is required to be safe by design. If the Government were correct in their assertion that this is already covered, then these amendments really should not pose any threat at all, and I have yet to hear the Government enunciate what the real legal uncertainty actually is in stating that harm can come from functionality, not just from content.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - - - Excerpts

My Lords, this is not just a content Bill. The Government have always been clear that the way in which a service is designed and operated, including its features and functionalities, can have a significant impact on the risk of harm to a user. That is why the Bill already explicitly requires providers to ensure their services are safe by design and to address the risks that arise from features and functionalities.

The Government have recognised the concerns which noble Lords have voiced throughout our scrutiny of the Bill, and those which predated the scrutiny of it. We have tabled a number of amendments to make it even more explicit that these elements are covered by the Bill. We have tabled the new introductory Clause 1, which makes it clear that duties on providers are aimed at ensuring that services are safe by design. It also highlights that obligations on services extend to the design and operation of the service. These obligations ensure that the consideration of risks associated with the business model of a service is a fundamental aspect of the Bill.

My noble friend Lady Harding of Winscombe worried that we had made the Bill worse by adding this. The new clause was a collaborative one, which we have inserted while the Bill has been before your Lordships’ House. Let me reassure her and other noble Lords as we conclude Report that we have not made it worse by so doing. The Bill will require services to take a safety-by-design approach to the design and operation of their services. We have always been clear that this will be crucial to compliance with the legislation. The new introductory Clause 1 makes this explicit as an overarching objective of the Bill. The introductory clause does not introduce any new concepts; it is an accurate summary of the key provisions and objectives of the Bill and, to that end, the framework and introductory statement are entirely compatible.

We also tabled amendments—which we debated last Monday—to Clause 209. These make it clear that functionalities contribute to the risk of harm to users, and that combinations of functionality may cumulatively drive up the level of risk. Amendment 281BA would amend the meaning of “functionality” within the Bill, so that it includes any system or process which affects users. This presents a number of concerns. First, such a broad interpretation would mean that any service in scope of the Bill would need to consider the risk of any feature or functionality, including ones that are positive for users’ online experience. That could include, for example, processes designed for optimising the interface depending on the user’s device and language settings. The amendment would increase the burden on service providers under the existing illegal content and child safety duties and would dilute their focus on genuinely risky functionality and design.

Second, by duplicating the reference to systems, processes and algorithms elsewhere in the Bill, it implies that the existing references in the Bill to the design of a service or to algorithms must be intended to capture matters not covered by the proposed new definition of “functionality”. This would suggest that references to systems and processes, and algorithms, mentioned elsewhere in the Bill, cover only systems, processes or algorithms which do not have an impact on users. That risks undermining the effectiveness of the existing duties and the protections for users, including children.

Amendment 268A introduces a further interpretation of features and functionality in the general interpretation clause. This duplicates the overarching interpretation of functionality in Clause 208 and, in so doing, introduces legal and regulatory uncertainty, which in turn risks weakening the existing duties. I hope that sets out for my noble friend Lady Harding and others our legal concerns here.

Amendment 281FA seeks to add to the interpretation of harm in Clause 209 by clarifying the scenarios in which harm may arise, specifically from services, systems and processes. This has a number of concerning effects. First, it states that harm can arise solely from a system and process, but a design choice does not in isolation harm a user. For example, the decision to use algorithms, or even the algorithm itself, is not what causes harm to a user—it is the fact that harmful content may be pushed to a user, or content pushed in such a manner that is harmful, for example repeatedly and in volume. That is already addressed comprehensively in the Bill, including in the child safety risk assessment duties.

Secondly, noble Lords should be aware that the drafting of the amendment has the effect of saying that harm can arise from proposed new paragraphs (a), (b) and (c)—

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- View Speech - Hansard - -

Can I just double-check what my noble friend has just said? I was lulled into a possibly false sense of security until we got to the point where he said “harmful” and then the dreaded word “content”. Does he accept that there can be harm without there needing to be content?

--- Later in debate ---
Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - -

My Lords, being the understudy for the noble Baroness, Lady Kidron, is quite a stressful thing. I am, however, reliably informed that she is currently offline in the White House, but I know that she will scrutinise everything I say afterwards and that I will receive a detailed school report tomorrow.

I am extremely grateful to my noble friend the Minister for how he has just summed up, but I would point out two things in response. The first is the circularity of the legal uncertainty. What I think I have heard is that we are trying to insert into the Bill some clarity because we do not think it is clear, but the Government’s concern is that by inserting clarity, we then imply that there was not clarity in the rest of the Bill, which then creates the legal uncertainty—and round we go. I am not convinced that we have really solved that problem, but I may be one step further towards understanding why the Government think that it is a problem. I think we have to keep exploring that and properly bottom it out.

My second point is about what I think will for evermore be known as the marshmallow problem. We have just rehearsed across the House a really heartfelt concern that, just because we cannot imagine it today, it does not mean that there will not be functionality that causes enormous harm which does not link back to a marshmallow, multiple marshmallows or any other form of content.

Those two big issues are the ones we need to keep discussing: what is really causing the legal uncertainty and how we can be confident that unimaginable harms from unimaginable functionality are genuinely going to be captured in the Bill. Provided that we can continue to do so, maybe it is entirely fitting at the end of what I think has been an extraordinarily collaborative Report, Committee and whole process of the Bill going through this House—which I have felt incredibly proud and privileged to be a part of—that we end with a commitment to continue said collaborative process. With that, I beg leave to withdraw the amendment.

Amendment 281BA withdrawn.