Baroness Stowell of Beeston debates involving the Department for Digital, Culture, Media & Sport

Tue 23rd May 2023: Online Safety Bill (Lords Chamber), Committee stage: Part 1
Tue 16th May 2023: Online Safety Bill (Lords Chamber), Committee stage: Part 1
Tue 16th May 2023: Online Safety Bill (Lords Chamber), Committee stage: Part 2
Thu 11th May 2023
Tue 9th May 2023: Online Safety Bill (Lords Chamber), Committee stage: Part 2
Tue 2nd May 2023: Online Safety Bill (Lords Chamber), Committee stage: Part 2
Thu 27th Apr 2023: Online Safety Bill (Lords Chamber), Committee stage: Part 2
Thu 27th Apr 2023: Online Safety Bill (Lords Chamber), Committee stage: Part 1
Tue 25th Apr 2023: Online Safety Bill (Lords Chamber), Committee stage: Part 1
Wed 19th Apr 2023: Online Safety Bill (Lords Chamber), Committee stage

Online Safety Bill

Baroness Stowell of Beeston Excerpts
At heart, I recognise that this is in principle no more than ensuring that the expertise and knowledge of those who have served in an appropriate parliamentary Select Committee are grafted on to the normal affirmative or negative approval mechanisms for secondary legislation, but I also think it opens up a substantial new way of doing what has, on many occasions, been merely a rubber-stamping of what can be rather significant policy changes. It also gives a good opportunity to bring Parliament and parliamentarians into the policy delivery mechanism in what seems to me to be a satisfying way. It makes sense to do this for a complex new regime in a fast-changing technological environment such as the one that the Bill is ushering in, but it might have other applications, particularly in the consideration of other legislation that is currently in the pipeline. I beg to move.
Baroness Stowell of Beeston (Con)

My Lords, it is a great pleasure to follow the noble Lord, Lord Stevenson. I am grateful to him, the noble Lord, Lord Clement-Jones, and the noble Viscount, Lord Colville of Culross, for their support for my amendments, which I will come to in a moment. Before I do, I know that my noble friend Lord Moylan will be very disappointed not to be here for the start of this debate. From the conversation I had with him last week when we were deliberating the Bill, I know that he is detained on committee business away from the House. That is what is keeping him today; I hope he may join us a bit later.

Before I get into the detail of my amendments, I want to take a step back and look at the bigger picture. I remind noble Lords that on the first day in Committee, when we discussed the purpose of the Bill, one of the points I made was that, in my view, the Bill is about increasing big tech’s accountability to the public. For too long, and I am not saying anything that is new or novel here, it has enjoyed power beyond anything that other media organisations have enjoyed—including the broadcasters, which, as we know, have been subject to regulation for a long time now. I say that because, in my mind, the fundamental problem this legislation seeks to address is the lack of accountability of social media and tech platforms to citizens and users for the power and influence they have over our lives and society, as well as their economic impact. The latter will be addressed via the Digital Markets, Competition and Consumers Bill.

I emphasise “if that is the problem”, because when we talk about this bit of the Bill and the amendments we have tabled, we have started—and I am as guilty of this as anyone else—to frame it very much as if the problem is around the powers for the Secretary of State. In my view, we need to think about why they are not, in the way they are currently proposed, the right solution to the problem that I have outlined.

I do not think what we should be doing, as some of what is proposed in the Bill tends to do, is shift the democratic deficit from big tech to the regulator, although, of course, like all regulators, Ofcom must serve the public interest as a whole, which means taking everyone’s expectations seriously in the way in which it goes about its work.

That kind of analysis of the problem is probably behind some of what the Government are proposing by way of greater powers for the Secretary of State for oversight and direction of the regulator in what is, as we have heard, a novel regulatory space. I think that the problem with some, although not all, of the new powers proposed for the Secretary of State is that they would undermine the independence of Ofcom and therefore dilute the regulator’s authority over the social media and tech platforms, and that is in addition to what the noble Lord, Lord Stevenson, has already said, which is that there is a fundamental principle about the independence of media regulators in the western world that we also need to uphold and to which the Government have already subscribed.

If that is the bigger picture, my amendments would redress the balance between the regulator and the Executive, but there remains the vital role of Parliament, which I will come back to in a moment and which the noble Lord, Lord Stevenson, has already touched on, because that is where we need to beef up oversight of regulators.

Before I get into the detail, I should also add that my amendments have the full authority of your Lordships’ Communications and Digital Select Committee, which I have the great honour of chairing. In January, we took evidence from my noble friend the Minister and his colleague, Paul Scully, and our amendments are the result of their evidence. I have to say that my noble friend on the Front Bench is someone for whom I have huge respect and admiration, but on that day when the Ministers were before us, we found as a committee that the Government’s evidence in respect of the powers that they were proposing for the Secretary of State was not that convincing.

I shall outline the amendments, starting with Amendments 113, 114, and 115. I am grateful to other noble Lords who have signed them, which demonstrates support from around the House. The Bill allows the Secretary of State to direct Ofcom to change its codes of practice on regulating social media firms for reasons of public policy. While it is legitimate for the Government to set strategic direction, this goes further and amounts to direct and unnecessary interference. The Government have suggested clarifying this clause, as we have heard, with a list of issues such as security, foreign policy, economic policy and burden to business, but it is our view as a committee that the list of items is so vague and expansive that almost anything could be included in it. Nor does it recognise the fact that the Government should respect the separation of powers between Executive and regulator in the first place, as I have already described. These amendments would therefore remove the Secretary of State’s power to direct Ofcom for reasons of public policy. Instead, the Secretary of State may write to Ofcom with non-binding observations on issues of security and child safety to which it must have regard. It is worth noting that under Clause 156 the Secretary of State still has powers to direct Ofcom in special circumstances to address threats to public health, safety and security, so the Government will not be left toothless, although I note that the noble Lord, Lord Stevenson, is proposing to remove Clause 156. Just to be clear, the committee is not proposing removing Clause 156; that is a place where the noble Lord and I propose different remedies.

Amendments 117 and 118 are about limiting the risk of infinite ping-pong. As part of its implementation work, Ofcom will have to develop codes of practice, but the Government can reject those proposals infinitely if they disagree with them. At the moment that would all happen behind closed doors. In theory, this process could go on for ever, with no parliamentary oversight. The Select Committee and I struggle to understand why the Government see this power as necessary, so our amendments would remove the Secretary of State’s power to issue unlimited directions to Ofcom on a draft code of practice, replacing it with a maximum of two exchanges of letters.

Amendment 120, also supported by the noble Lords I referred to earlier, is closely related to previous amendments. It is designed to improve parliamentary oversight of Ofcom’s draft codes of practice. Given the novel nature of the proposals to regulate the online world, we need to ensure that the Government and Ofcom have the space and flexibility to develop and adapt their proposals accordingly, but there needs to be a role for Parliament in scrutinising that work and being able to hold the Executive and regulator to account where needed. The amendment would ensure that the affirmative procedure, and not the negative procedure currently proposed in the Bill, was used to approve Ofcom’s codes of practice if they had been subject to attempts by the Secretary of State to introduce changes. This amendment is also supported by the Delegated Powers and Regulatory Reform Committee in its report.

Finally, Amendment 257 would remove paragraph (a) from Clause 157(1). This is closely related to previous amendments regarding the Secretary of State’s powers. The clause currently gives the Secretary of State the power to issue wide-ranging guidance to Ofcom about how it carries out its work. This is expansive and poorly defined, and the committee again struggled to see the necessity for it. The Secretary of State already has extensive powers to set strategic priorities for Ofcom, establish expert advisory committees, direct action in special circumstances, direct Ofcom about its codes or just write to it if my amendments are accepted, give guidance to Ofcom about its media literacy work, change definitions, and require Ofcom to review its codes and undertake a comprehensive review of the entire online safety regime. Including yet another power to give unlimited guidance to Ofcom about how it should carry out its work seems unnecessary and intrusive, so this amendment would remove it.

I hope noble Lords can see that, even after taking account of the amendments that the committee is proposing, the Secretary of State would be left with substantial and suitable powers to discharge their responsibilities properly.

Perhaps I may comment on some of the amendments to which I have not added my name. Amendment 110 from the noble Lords, Lord Stevenson and Lord Clement-Jones, and Amendment 290 from the noble Lord, Lord Stevenson, are about parliamentary oversight by Select Committees. I do not support the detail of these amendments nor the procedures proposed, because I believe they are potentially too cumbersome and could cause too much delay to various processes. As I have already said, and as the noble Lord, Lord Stevenson, said in opening, the Select Committee and I are concerned to ensure that there is adequate parliamentary oversight of Ofcom as it implements this legislation over the next few years. My committee clearly has a role in this, alongside the new DSIT Select Committee in the House of Commons and perhaps others, but we need to guard against duplication and fragmentation.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

When we publish the wording, we will rightly have an opportunity to discuss it before the debate on Report. I will be happy to discuss it with noble Lords then. On the broader points about economic policy, that is a competency of His Majesty’s Government, not an area of focus for Ofcom. If the Government had access to additional information that led them to believe that a code of practice as drafted could have a significant, disproportionate and adverse effect on the livelihoods of the British people or on the broader economy, and if it met the test for exceptional circumstances, taking action via a direction from the Secretary of State could be warranted. I will happily discuss that when my noble friend and others see the wording of the changes we will bring on Report. I am sure we will scrutinise that properly, as we should.

I was about to say that, in addition to the commitment we have already made, in the light of the debate today we will also consider whether transparency about the use of this power could be increased further, while retaining the important need for government oversight of issues that are genuinely beyond Ofcom’s remit. I am conscious that, as my noble friend Lady Stowell politely said, I did not convince her or your Lordships’ committee when I appeared before it with my honourable friend Paul Scully. I am happy to continue our discussions and I hope that we may reach some understanding on this important area.

Baroness Stowell of Beeston (Con)

I am sorry to interrupt, but may I clarify what my noble friend just said? I think he said that, although he is open to increasing the transparency of the procedure, he does not concede a change from a power of direction to a letter about guidance which Ofcom should take account of. Is he willing to consider that as well?

--- Later in debate ---
Baroness Harding of Winscombe (Con)

My Lords, I also failed to stand up before the noble Lord, Lord Allan, did. I too am always slightly nervous to speak before or after him for fear of not having the detailed knowledge that he does. There have been so many powerful speeches in this group. I will try to speak swiftly.

My role in this amendment was predefined for me by the noble Baroness, Lady Kidron, as the midwife. I have spent many hours debating these amendments with my noble friend Lord Bethell, the noble Baroness, Lady Kidron, and with many noble Lords who have already spoken in this debate. I think it is very clear from the debate why it is so important to put a definition of age assurance and age verification on the face of the Bill. People feel so passionately about this subject. We are creating the digital legal scaffolding, so being really clear what we mean by the words matters. It really matters and we have seen it mattering even in the course of this debate.

My two friends—they are my friends—the noble Baroness, Lady Kidron, and my noble friend Lord Bethell both used the word “proportionate”, with one not wanting us to be proportionate and the other wanting us to be proportionate. Yet, both have their names to the same amendment. I thought it might be helpful to explain what I think they both mean—I am sure they will interrupt me if I get this wrong—and explain why the words of the amendment matter so much.

Age assurance should not be proportionate for pornography. It should be the highest possible bar. We should do everything in our power to stop children seeing it, whether it is on a specific porn site or on any other site. We do not want our children to see pornography; we are all agreed on that. There should not be anything proportionate about that. It should be the highest bar. Whether “beyond reasonable doubt” is the right wording or it should instead be “the highest possible bar practically achievable”, I do not know. I would be very keen to hear my noble friend the Minister’s thoughts on what the right wording is because, surely, we are all clear it should be disproportionate; it should absolutely be the hardest line we can take.

Equally, age assurance is not just about pornography, as the noble Lord, Lord Allan, has said. We need to have a proportionate approach. We need a ladder where age assurance for pornography sits at the top, and where we are making sure that nine year-olds cannot access social media sites if they are age-rated for 13. We all know that we can go into any primary school classroom in the land and find that the majority of nine year-olds are on social media. We do not have good age assurance further down.

As both the noble Lord, Lord Allan, and the noble Baroness, Lady Kidron, have said, we need age assurance to enable providers to adapt the experience to make it age-appropriate for children on services we want children to use. It needs to be both proportionate and disproportionate, and that needs to be defined on the face of the Bill. If we do not define it, I fear that we will fall into the trap that the noble Lord, Lord Allan, mentioned: the cookie trap. We will have very well-intentioned work that will not protect children and will go against the very thing that we are all looking for.

In my role as the pragmatic midwife, I implore my noble friend the Minister to hear what we are all saying and to help us between Committee and Report, so that we can come back together with a clear definition of age assurance and age verification on the face of the Bill that we can all support.

Baroness Stowell of Beeston (Con)

My Lords, about half an hour ago I decided I would not speak, but as we have now got to this point, I thought I might as well say what I was going to say after all. I reassure noble Lords that in Committee it is perfectly permissible to speak after the winder, so no one is breaking any procedural convention. That said, I will be very brief.

My first purpose in rising is to honour a commitment I made last week when I spoke against the violence against women and girls code. I said that I would none the less be more sympathetic to and supportive of stronger restrictions preventing child access to pornography, so I want to get my support on the record and honour that commitment in this context.

My noble friend Lady Harding spoke on the last group about bringing our previous experiences to bear when contributing to some of these issues. As I may have said in the context of other amendments earlier in Committee, as a former regulator, I know that one of the important guiding principles is to ensure that you regulate for a reason. It is very easy for regulators to have a set of rules. The noble Baroness, Lady Kidron, referred to rules of the road for the tech companies to follow. It is very easy for regulators to examine whether those rules are being followed and, having decided that they have, to say that they have discharged their responsibility. That is not good enough. There must be a result, an outcome from that. As the noble Lord, Lord Allan, emphasised, this must be about outcomes and intended benefits.

I support making it clear in the Bill that, as my noble friend Lady Harding said, we are trying, disproportionately, to prevent children from accessing pornography. We will do all we can to ensure that it happens, and that should be because of the rules being in place. Ofcom should be clear on that. However, I also support a proportionate approach to age assurance in all other contexts, as has been described. Therefore, I support the amendments tabled by the noble Baroness, Lady Kidron, and my noble friend Lord Bethell, and the role my noble friend Lady Harding has played in arriving at a pragmatic solution.

Lord Stevenson of Balmacara (Lab)

My Lords, it is a privilege to be in your Lordships’ House, and on some occasions it all comes together and we experience a series of debates and discussions that we perhaps would never have otherwise reached, and at a level which I doubt could be echoed anywhere else in the world. This is one of those days. We take for granted that every now and again, we get one of these rapturous occasions when everything comes together, but we forget the cost of that. I pay tribute, as others have, to the noble Baroness, Lady Kidron. She has worked so hard on this issue and lots of other issues relating to this Bill and has exhausted herself more times than is right for someone of her still youthful age. I am very pleased that she is going off on holiday and will not be with us for a few days; I wish her well. I am joking slightly, but I mean it sincerely when I say that we have had a very high-quality debate. That it has gone on rather later than the Whips would have wanted is tough, because it has been great to hear and be part of. However, I will be brief.

It was such a good debate that I felt a tension, in that everybody wanted to get in and say what they wanted to say and be sure they were on the record. That can sometimes be a disaster, because everyone repeats everything, but as the noble Baroness, Lady Harding, said, we know our roles, we know what to say and when to say it, and it has come together very nicely. Again, we should congratulate ourselves on that. However, we must be careful about something which we keep saying to each other but sometimes do not do. This is a Bill about systems, not content. The more that we get into the content issues, the more difficult it is to remember what the Bill can do and what the regulator will be able to do if we get the Bill to the right place. We must be sure about that.

I want to say just a few things about where we need to go with this. As most noble Lords have said, we need certainty: if we want to protect our children, we have to be able to identify them. We should not be in any doubt about that; there is no doubt that we must do it, whatever it takes. The noble Lord, Lord Allan, is right to say that we are in the midst of an emerging set of technologies, and there will be other things coming down the line. The Bill must remain open to that; it must not be technology-specific, but we must be certain of what this part is about, and it must drill down to that. I come back to the idea of proportionality: we want everybody who is 18 or under to be identifiable as such, and we want to be absolutely clear about that. I like the idea that this should be focused on the phones and other equipment we use; if we can get to that level, it will be a step forward, although I doubt whether we are there yet.

Online Safety Bill

Baroness Stowell of Beeston Excerpts
Baroness Healy of Primrose Hill (Lab)

My Lords, I strongly support Amendment 97 in the name of the noble Baroness, Lady Morgan. We must strengthen the Bill by imposing an obligation on Ofcom to develop and issue a code of practice on violence against women and girls. This will empower Ofcom and guide services in meeting their duties in regard to women and girls, and encourage them to recognise the many manifestations of online violence that disproportionately affect women and girls.

Refuge, the domestic abuse charity, has seen a growing number of cases of technology-facilitated domestic abuse in recent years. As other noble Lords have said, this tech abuse can take many forms but social media is a particularly powerful weapon for perpetrators, with one in three women experiencing online abuse, rising to almost two in three among young women. Yet the tech companies have been too slow to respond. Many survivors are left waiting weeks or months for a response when they report abusive content, if indeed they receive one at all. It appears that too many services do not understand the risks and nature of VAWG. They do not take complaints seriously and they think that this abuse does not breach community standards. A new code would address this with recommended measures and best practice on the appropriate prevention of and response to violence against women and girls. It would also support the delivery of existing duties set out in the Bill, such as those on illegal content, user empowerment and child safety.

I hope the Minister can accept this amendment, as it would be in keeping with other government policies, such as in the strategic policing requirement, which requires police forces to treat violence against women and girls as a national threat. Adding this code would help to meet the Government’s national and international commitments to tackling online VAWG, such as the tackling VAWG strategy and the Global Partnership for Action on Gender-Based Online Harassment and Abuse.

The Online Safety Bill is a chance to act on tackling the completely unacceptable levels of abuse of women and girls by making it clear through Ofcom that companies need to take this matter seriously and make systemic changes to the design and operation of their services to address VAWG. It would allow Ofcom to add this as a priority, as mandated in the Bill, rather than leave it as an optional extra to be tackled at a later date. The work to produce this code has already been done thanks to Refuge and other charities and academics who have produced a model that is freely available and has been shared with Ofcom. So it is not an extra burden and does not need to delay the implementation of the Bill; in fact, it will greatly aid Ofcom.

The Government are to be congratulated on their amendment to include controlling or coercive behaviour in their list of priority offences. I would like to congratulate them further if they can accept this valuable Amendment 97.

Baroness Stowell of Beeston (Con)

My Lords, I start by commending my noble friend Lady Morgan on her clear introduction to this group of amendments. I also commend the noble Baroness, Lady Kidron, on her powerful speech.

From those who have spoken so far, we have a clear picture of the widespread nature of some of the abuse and offences that women experience when they go online. I note from what my noble friend Lady Morgan said that there is widespread support from a range of organisations outside the Committee for this group of amendments. She also made an important and powerful point about the potential chilling effect of this kind of activity on women, including women in public life, being able to exercise their right to freedom of expression.

I feel it is important for me to make it clear that—this is an obvious thing—I very much support tough legal and criminal sanctions against any perpetrator of violence or sexual abuse against women. I really do understand and support this, and hear the scale of the problem that is being outlined in this group of amendments.

Mine is a dissenting voice, in that I am not persuaded by the proposed solution to the problem that has been described. I will not take up a lot of the Committee’s time, but any noble Lords who were in the House when we were discussing a group of amendments on another piece of legislation earlier this year may remember that I spoke against making misogyny a hate crime. The reason why I did that then is similar, in that I feel somewhat nervous about introducing a code of conduct which is directly relevant to women. I do not like the idea of trying to address some of these serious problems by separating women from men. Although I know it is not the intention of a code such as this or any such measures, I feel that it perpetuates a sense of division between men and women. I just do not like the idea that we live in a society where we try to address problems by isolating or categorising ourselves into different groups of people, emphasising the sense of weakness and being victims of any kind of attack or offence from another group, and assuming that everybody who is in the other group will be a perpetrator of some kind of attack, criticism or violence against us.

My view is that, in a world where we see some of this serious activity happening, we should do more to support young men and boys to understand the proper expectations of them. When we get to the groups of amendments on pornography and what more we can do to prevent children’s access to it, I will be much more sympathetic. Forgive me if this sounds like motherhood and apple pie, but I want us to try to generate a society where basic standards of behaviour and social norms are shared between men and women, young and old. I lament how so much of this has broken down, and a lot of the problems we see in society are the fault of political and—dare I say it?—religious leaders not doing more to promote some of those social norms in the past. As I said, I do not want us to respond to the situation we are in by perpetuating more divisions.

I look forward to hearing what my noble friend the Minister has to say, but I am nervous about the solution proposed in the amendments.

Baroness Fox of Buckley (Non-Afl)

My Lords, it gives me great pleasure to follow the noble Baroness, Lady Stowell of Beeston, not least because she became a dissenting voice, and I was dreading that I might be the only one.

First, I think it important that we establish that those of us who have spent decades fighting violence against women and girls are not complacent about it. The question is whether the physical violence we describe in the Bill is the same as the abuse being described in the amendments. I worry about conflating online incivility, abuse and vile things said with physical violence, as is sometimes done.

I note that Refuge, an organisation I have a great deal of respect for, suggested that the user empowerment duties, which place the burden on women users to filter their own online experience, were the same as asking women to take control of their own safety and protect themselves from violence offline. I thought that was unfair, because user empowerment duties and deciding what you filter out can be women using their agency.

Online Safety Bill

Baroness Stowell of Beeston Excerpts
Of course, the Minister’s conclusion is that there is no need to amend the Bill because we have parliamentary procedure and draft regulations, and because Ofcom will be consulted and so on. That is all fair enough. As the noble Baroness, Lady Morgan, said, this is a probing amendment. If we have done something to speed up the process, all well and good, but the essence of this is to get something cracking. I hope that the debate has at least had some impact, but this is still incredibly vague. We do not really know what role is envisaged for the IWF. The Minister has heard around the Committee the regard in which the IWF is held. He has heard our desire to see that it is an integral part of the protection process and the procedures under the Bill, and to see it work with Ofcom.
Baroness Stowell of Beeston (Con)

My Lords, I have held back from contributing to this group, because it is not really my group and I have not really engaged in the topic at all. I have been waiting to see whether somebody who is engaged in it would raise this point.

The one factual piece of information that has not been raised in the debate is the fact that the IWF, of which I too am a huge admirer—I have huge respect for the work that it does; it does some fantastic work—is a registered charity. That may lead to some very proper questions about what its role should be in any kind of formal relationship with a statutory regulator. I noticed that no one is proposing in any of these amendments that it be put on the face of the Bill, which, searching back into my previous roles and experience, I think I am right to say would not be proper anyway. But even in the context of whatever role it might have along with Ofcom, I genuinely urge the DCMS and/or Ofcom to ensure that they consult the Charity Commission, not just the IWF, on what is being proposed so that it is compatible with its other legal obligations as a charity.

Lord Stevenson of Balmacara (Lab)

If I might follow up that comment, I agree entirely with what the noble Baroness has just said. It is very tricky for an independent charity to have the sort of relationship addressed in some of the language in this debate. Before the Minister completes his comments and sits down again, I ask him: if Ofcom were to negotiate a contracted set of duties with the IWF—indeed, with many other charities or others who are interested in assisting with this important work—could that be done directly by Ofcom, with powers that it already has? I think I am right to say that it would not require parliamentary approval. It is only if we are talking about co-regulation, which again raises other issues, that we would go through a process that requires what sounded like the affirmative procedure—the one that was used, for example, with the Advertising Standards Authority. Is that right?

The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, His Majesty’s Government are committed to defending the invaluable role of our free media. We are clear that our online safety legislation must protect the vital role of the press in providing people with reliable and accurate information. That is why this Bill includes strong protections for recognised news publishers. The Bill does not impose new duties on news publishers’ content, which is exempt from the Bill’s safety duties. In addition, the Bill includes strong safeguards for news publisher content, set out in Clause 14. In order to benefit from these protections, publishers will have to meet a set of stringent criteria, set out in Clause 50.

I am aware of concerns in your Lordships’ House and another place that the definition of news publishers is too broad and that these protections could therefore create a loophole to be exploited. That is why the Government are bringing forward amendments to the definition of “recognised news publisher” to ensure that sanctioned entities cannot benefit from these protections. I will shortly explain these protections in detail but I would like to be clear that narrowing the definition any further would pose a critical risk to our commitment to self-regulation of the press. We do not want to create requirements which would in effect put Ofcom in the position of a press regulator. We believe that the criteria set out in Clause 50 are already strong, and we have taken significant care to ensure that established news publishers are captured, while limiting the opportunity for bad actors to benefit. 

Government Amendments 126A and 127A propose changes to the criteria for recognised news publishers. These criteria already exclude any entity that is a proscribed organisation under the Terrorism Act 2000 or the purpose of which is to support a proscribed organisation under that Act. We are clear that sanctioned news outlets such as RT, formerly Russia Today, must not benefit from these protections either. The amendments we are tabling today will therefore tighten the recognised news publisher criteria further by excluding entities that have been designated for sanctions imposed by both His Majesty’s Government and the United Nations Security Council. I hope noble Lords will accept these amendments, in order to ensure that content from publishers which pose a security threat to this country cannot benefit from protections designed to defend a free press.

In addition, the Government have also tabled amendments 50B, 50C, 50D, 127B, 127C and 283A, which are aimed at ensuring that the protections for news publishers in Clause 14 are workable and do not have unforeseen consequences for the operation of category 1 services. Clause 14 gives category 1 platforms a duty to notify recognised news publishers and offer a right of appeal before taking action against any of their content or accounts.

Clause 14 sets out the circumstances in which companies must offer news publishers an appeal. As drafted, it states that platforms must offer this before they take down news publisher content, before they restrict users’ access to such content or where they propose to “take any other action” in relation to publisher content. Platforms must also offer an appeal if they propose to take action against a recognised news publisher’s account by giving them a warning, suspending or banning them from using a service or in any way restricting their ability to use a service.

These amendments provide greater clarity about what constitutes “taking action” in relation to news publisher content, and therefore when category 1 services must offer an appeal. They make it clear that a platform must offer this before they take down such content, add a warning label or take any other action against content in line with any terms of service that allow or prohibit content. This will ensure that platforms are not required to offer publishers a right of appeal every time they propose to carry out routine content curation and similar routine actions. That would be unworkable for platforms and would be likely to inhibit the effectiveness of the appeal process.

As noble Lords know, the Bill has a strong focus on user empowerment and enabling users to take control of their online experience. The Government have therefore tabled amendments to Clause 52 to ensure that providers are required only to offer publishers a right of appeal in relation to their own moderation decisions, not where a user has voluntarily chosen not to view certain types of content. For example, if a user has epilepsy and has opted not to view photo-sensitive content, platforms will not be required to offer publishers a right of appeal before restricting that content for the user in question.

In addition, to ensure that the Bill maintains strong protections for children, the amendments make it clear that platforms are not required to offer news publishers an appeal before applying warning labels to content viewed by children. The amendments also make it clear that platforms would be in breach of the legislation if they applied warning labels to content encountered by adults without first offering news publishers an appeal; as I have said, that requirement does not extend to warning labels on content encountered by children. I beg to move.

Baroness Stowell of Beeston (Con)

My Lords, I welcome the amendments the Government have tabled, but I ask the Minister to clarify the effect of Amendment 50E. I declare an interest as chair of the Communications and Digital Select Committee, which has discussed Amendment 50E and the labelling of content for children with the news media organisations. This is a very technical issue, but from what my noble friend was just saying, it seems that content that would qualify for labelling for child protection purposes, and which therefore does not qualify for a right of appeal before the content is so labelled, is not content that would normally be encountered by adults but might happen to appeal to children. I would like to be clear that we are not giving the platforms scope for adding labels to content that they ought not to be adding labels to. That aside, as I say, I am grateful to my noble friend for these amendments.

Lord Clement-Jones (LD)

My Lords, like the noble Baroness, Lady Stowell, I have no major objection and support the Government’s amendments. In a sense the Minister got his retaliation in first, because we will have a much more substantial debate on the scope of Clause 14. At this point I welcome any restriction on Clause 14 in the way that the Minister has stated.

Yet to come we have the whole issue of whether an unregulated recognised news publisher, effectively unregulated by the PRP’s arrangements, should be entitled to complete freedom in terms of below-the-line content, where there is no moderation and it does not have what qualifies as independent regulation. Some debates are coming down the track and—just kicking the tyres on the Minister’s amendments—I think the noble Baroness, Lady Stowell, made a fair point, which I hope the Minister will answer.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

My Lords, I am sorry; in my enthusiasm to get this day of Committee off to a swift start, I perhaps rattled through that rather quickly. On Amendment 50E, which my noble friend Lady Stowell asked about, I make clear that platforms will be in breach of their duties if, without applying the protection, they add warning labels to news publishers’ content that they know will be seen by adult users, regardless of whether that content particularly appeals to children.

As the noble Lord, Lord Clement-Jones, and others noted, we will return to some of the underlying principles later on, but the Government have laid these amendments to clarify category 1 platforms’ duties to protect recognised news publishers’ content. They take some publishers out of scope of the protections and make it clearer that category 1 platforms will have only to offer news publishers an appeal before taking punitive actions against their content.

The noble Baroness, Lady Fox, asked about how we define “recognised news publisher”. I am conscious that we will debate this more in later groups, but Clause 50 sets out a range of criteria that an organisation must meet to qualify as a recognised news publisher. These include the organisation’s “principal purpose” being the publication of news, it being subject to a “standards code” and its content being “created by different persons”. The protections for organisations are focused on publishers whose primary purpose is reporting on news and current affairs, recognising the importance of that in a democratic society. I am grateful to noble Lords for their support.

Baroness Stowell of Beeston (Con)

What my noble friend said is absolutely fine with me, and I thank him very much for it. It might be worth letting the noble Baroness, Lady Fox, know that Amendment 127 has now been moved to the group that the noble Lord, Lord Clement-Jones, referred to. I thought it was worth offering that comfort to the noble Baroness.

Online Safety Bill

Baroness Stowell of Beeston Excerpts
Moved by
46: After Clause 12, insert the following new Clause—
“Adult risk assessment duties
(1) This section sets out the duties about risk assessments in respect of adult users which apply in relation to Category 1 services.
(2) A duty to carry out a suitable and sufficient adults’ risk assessment.
(3) A duty to take appropriate steps to keep an adults’ risk assessment up to date, including when OFCOM make any significant change to a risk profile that relates to services of the kind in question.
(4) Before making any significant change to any aspect of a service’s design or operation, a duty to carry out a further suitable and sufficient adults’ risk assessment relating to the impacts of that proposed change.
(5) An “adults’ risk assessment” of a service of a particular kind means an assessment of the following matters, taking into account the risk profile that relates to services of that kind—
(a) the user base;
(b) the level of risk of adults who are users of the service encountering, by means of the service, each kind of content specified in section 12(10) to (12), taking into account (in particular) algorithms used by the service, and how easily, quickly and widely content may be disseminated by means of the service;
(c) the level of risk of functionalities of the service, including user empowerment tools, which facilitate the presence, identification, dissemination, and likelihood of users encountering or being alerted to, content specified in section 12(10) to (12);
(d) the extent to which user empowerment tools might result in interference with users’ right to freedom of expression within the law (see section 18);
(e) how the design and operation of the service (including the business model, governance, use of proactive technology, measures to promote users’ media literacy and safe use of the service, and other systems and processes) may reduce or increase the risks identified.”
Member’s explanatory statement
This and other amendments in the name of Baroness Stowell relate to risk assessments for adults in relation to platforms’ new duties to provide user empowerment tools. They would require platforms to provide public risk assessments in their terms of service and be transparent about the effect of user empowerment tools on users’ freedom of expression.
Baroness Stowell of Beeston (Con)

My Lords, in introducing this group, I will speak directly to the three amendments in my name—Amendments 46, 47 and 64. I will also make some general remarks about the issue of freedom of speech and of expression, which is the theme of this group. I will come to these in a moment.

The noble Lord, Lord McNally, said earlier that I had taken my amendments out of a different group— I hope from my introductory remarks that it will be clear why—but, in doing so, I did not realise that I would end up opening on this group. I offer my apologies to the noble Lord, Lord Stevenson of Balmacara, for usurping his position in getting us started.

I am grateful to the noble Baronesses, Lady Bull and Lady Featherstone, for adding their names. The amendments represent the position of the Communications and Digital Select Committee of your Lordships’ House. In proposing them, I do so with that authority. My co-signatories are a recent and a current member. I should add sincere apologies from the noble Baroness, Lady Featherstone, for not being here this evening. If she is watching, I send her my very best wishes.

When my noble friend Lord Gilbert of Panteg was its chair, the committee carried out an inquiry into freedom of speech online. This has already been remarked on this evening. As part of that inquiry, the committee concluded that the Government’s proposals in the then draft Bill—which may have just been a White Paper at that time—for content described as legal but harmful were detrimental to freedom of speech. It called for changes. Since then, as we know, the Government have dropped legal but harmful and instead introduced new user empowerment tools for adults to filter out harmful content. As we heard in earlier groups this evening, these would allow people to turn off or on content about subjects such as eating disorders and self-harm.

Some members of our committee might favour enhanced protection for adults. Indeed, some of my colleagues have already spoken in support of amendments to this end in other groups. Earlier this year, when the committee looked at the Bill as it had been reintroduced to Parliament, we agreed that, as things stood, these new user empowerment tools were a threat to freedom of speech. Whatever one’s views, there is no way of judging their impact or effectiveness—whether good or bad.

As we have heard already this evening, the Government have dropped the requirement for platforms to provide a public risk assessment of how these tools would work and their impact on freedom of speech. To be clear, for these user empowerment tools to be effective, the platforms will have to identify the content that users can switch off. This gives the platforms great power over what is deemed harmful to adults. Amendments 46, 47 and 64 are about ensuring that tech platforms are transparent about how they balance the principles of privacy, safety and freedom of speech for adults. These amendments would require platforms to undertake a risk assessment and publish a summary in their terms of service. This would involve them being clear about the effect of user empowerment tools on the users’ freedom of expression. Without such assessments, there is a risk that platforms would do either too much or too little. It would be very difficult to find out how they are filtering content and on what basis, and how they are addressing the twin imperatives of ensuring online safety without unduly affecting free speech.

To be clear, these amendments, unlike amendments in earlier groups, are neither about seeking to provide greater protection to adults nor about trying to reopen or revisit the question of legal but harmful. They are about ensuring transparency to give all users confidence about how platforms are striking the right balance. While their purpose is to safeguard freedom of speech, they would also bring benefits to those adults who wanted to opt in to the user empowerment tool because they would be able to assess what it was they were choosing not to see.

It is because of their twin benefits—indeed, their benefit to everyone—that we decided formally, as a committee, to recommend these amendments to the Government and for debate by your Lordships’ House. That said, the debate earlier suggests support for a different approach to enhancing protection for adults, and we may discover through this debate a preference for other amendments in this group to protect freedom of speech—but that is why we have brought these amendments forward.

--- Later in debate ---
Baroness Stowell of Beeston (Con)

My Lords, your Lordships will want me to be brief, bearing in mind the time. I am very grateful for the support I received from my noble friends Lady Harding and Lady Fraser and the noble Baronesses, Lady Kidron and Lady Bull, for the amendments I tabled. I am particularly grateful to the noble Baroness, Lady Bull, for the detail she added to my description of the amendments. I can always rely on the noble Baroness to colour in my rather broad-brush approach to these sorts of things.

I am pleased that the noble Lord, Lord Stevenson, made his remarks at the beginning of the debate. That was very helpful in setting the context that followed. We have heard a basic theme come through from your Lordships: a lack of certainty that the Government have struck the right balance between privacy protection and freedom of expression. I never stop learning in your Lordships’ House. I was very pleased to learn from the new Milton—my noble friend Lord Moylan—that freedom of expression is a fundamental right. Therefore, the balance between that and the other things in the Bill needs to be considered in a way I had not thought of before.

What is clear is that there is a lack of confidence from all noble Lords—irrespective of the direction they are coming from in their contributions to this and earlier debates— either that the balance has been properly struck or that some of the clauses seeking to address freedom of speech in the Bill are doing so in a way that will deliver the outcome and overall purpose of this legislation as brought forward by the Government.

I will make a couple of other points. My noble friend Lord Moylan’s amendments about the power of Ofcom in this context were particularly interesting. I have some sympathy for what he was arguing. As I said earlier, the question of power and the distribution of it between the various parties involved in this new regime will be one we will look at in broad terms certainly in later groups.

On the amendments of the noble Lord, Lord Stevenson, on Clauses 13, 14 and so on and the protections and provisions for news media, I tend towards the position of my noble friend Lord Black, against what the noble Lord, Lord Stevenson, argued. As I said at the beginning, I am concerned about the censorship of our news organisations by the tech firms. But I also see his argument, and that of the noble Viscount, Lord Colville, that it is not just our traditional legacy media that provides quality journalism now—that is an important issue for us to address.

I am grateful to my noble friend the Minister for his round-up and concluding remarks. Although it is heartening to hear that he and the Bill team will consider the amendment from the noble and learned Lord, Lord Hope, in this group, we are looking—in the various debates today, for sure—for a little more responsiveness and willingness to consider movement by the Government on various matters. I hope that he is able to give us more encouraging signs of this, as we proceed through Committee and before we get to further discussions with him—I hope—outside the Chamber before Report. With that, I of course withdraw my amendment.

Amendment 46 withdrawn.
Lord Clement-Jones (LD)

My Lords, this has been a really interesting debate. I started out thinking that we were developing quite a lot of clarity. The Government have moved quite a long way since we first started debating senior manager liability, but there is still a bit of fog that needs dispelling—the noble Baronesses, Lady Kidron and Lady Harding, have demonstrated that we are not there yet.

I started off by saying yes to this group, before I got to grips with the government amendments. I broadly thought that Amendment 33, tabled by the noble Lord, Lord Stevenson, and Amendment 182, tabled by the noble Lord, Lord Bethell, were heading in the right direction. However, I was stopped short by Trustpilot’s briefing, which talked about a stepped approach regarding breaches and so on—that is a very strong point. It says that it is important to recognise that not all breaches should carry the same weight. In fact, it is even more than that: certain things should not even be an offence, unless you have been persistent or negligent. We have to be quite mindful as to how you formulate criminal offences.

I very much liked what the noble Lord, Lord Bethell, had to say about the tech view of its own liability. We have all seen articles about tech exceptionalism, and, for some reason, that seems to have taken quite a hold—so we have to dispel that as well. That is why I very much liked what the noble Lord, Lord Curry, said. It seemed to me that that was very much part of a stepped approach, while also being transparent to the object of the exercise and the company involved. That fits very well with the architecture of the Bill.

The noble Baroness, Lady Harding, put her finger on it: the Bill is not absolutely clear. In the Government’s response to the Joint Committee’s report, we were promised that, within three to six months, we would get that senior manager liability. On reading the Bill, I am certainly still a bit foggy about it, and it is quite reassuring that the noble Baroness, Lady Harding, is foggy about it too. Is that senior manager liability definitely there? Will it be there?

The Joint Committee made two other recommendations which I thought made a lot of sense: the obligation to report on risk assessment to the main board of a company, and the appointment of a safety controller, which the noble Lord, Lord Knight, mentioned. Such a controller would make it very clear that, as with the GDPR, you would have a senior manager on whom the duty could be fixed.

Like the noble Baroness, Lady Harding, I would very much like to hear from the Minister on the question of personal liability, as well as about Ofcom. It is important that any criminal prosecution is mediated by Ofcom; that is cardinal. You cannot just create criminal offences where you can have a prosecution without the intervention of Ofcom. That is extraordinarily important.

I have just a couple of final points. The noble Baroness, Lady Fox, comes back quite often to this point about regulation being the enemy of innovation. It very much depends what kind of innovation we are talking about. Technology is not necessarily neutral. It depends how the humans who deploy it operate it. In circumstances such as this, where we are talking about children and about smaller platforms that can do harm, I have no qualms about having regulation or indeed criminal liability. That is a really important factor. We are talking about a really important area.

I very strongly support Amendment 219. It deals with a really important aspect which is completely missing from the Bill. I have a splendid briefing here, which I am not going to read out, but it is all about Mastodon being one example of a new style of federated platform in which the app or hub for a network may be category 1 owing to the size of its user base but individual subdomains or networks sitting below it could fall under category 2 status. I am very happy to give a copy of the briefing to the Minister; it is a really well-written brief, and demonstrates entirely some of the issues we are talking about here.

I reassure the noble Lord, Lord Knight, that I think the amendment is very well drafted. It is really quite cunning in the way that it is done.

Baroness Stowell of Beeston Portrait Baroness Stowell of Beeston (Con)
- View Speech - Hansard - -

My Lords, I wonder whether I can make a brief intervention—I am sorry to do so after the noble Lord, Lord Clement-Jones, but I want to intervene before my noble friend the Minister stands up, unless the Labour Benches are about to speak.

I have been pondering this debate and have had a couple of thoughts. Listening to the noble Lord, Lord Clement-Jones, I am reminded of something which was always very much a guiding light for me when I chaired the Charity Commission and was therefore working in a regulatory space: regulation is never an end in itself; you regulate for a reason.

I was struck by the first debate we had on day one of Committee about the purpose of the Bill. If noble Lords recall, I said in that debate that, for me, the Bill at its heart was about enhancing the accountability of the platforms and the social media businesses. I felt that the contribution from my noble friend Lady Harding was incredibly important. What we are trying to do here is to use enforcement to drive culture change: not to force the organisations never to think about profit, but to move them away from profit-making towards focusing on child safety in the way in which they go about their work. That is really important when we start to consider the whole issue of enforcement.

It struck me at the start of this discussion that we have to be clear what our general approach and mindset is about this part of our economy that we are seeking to regulate. We have to be clear about the crimes we think are being committed or the offences that need to be dealt with. We need to make sure that Ofcom has the powers to tackle those offences and that it can do so in a way that meets Parliament’s and the public’s expectations of us having legislated to make things better.

I am really asking my noble friend the Minister, when he comes to respond on this, to give us a sense of clarity on the whole question of enforcement. At the moment, it is insufficiently clear. Even if we do not get that level of clarity today, when we come back later on and look at enforcement, it is really important that we know what we are trying to tackle here.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I will endeavour to give that clarity, but it may be clearer still if I flesh some points out in writing in addition to what I say now.

--- Later in debate ---
Lord McNally Portrait Lord McNally (LD)
- Hansard - - - Excerpts

My Lords, as a former Deputy Leader of this House, if I were sitting on the Front Bench, I would have more gumption than to try to start a debate only 10 minutes before closing time. But I realise that the wheels grind on—perhaps things are no longer as flexible as they were in my day—so noble Lords will get my speech. The noble Lord, Lord Grade, who is at his post—it is very encouraging to see the chair of Ofcom listening to this debate—and I share a love of music hall. He will remember Eric Morecambe saying that one slot was like the last slot at the Glasgow Empire on a Friday night. That is how I feel now.

A number of references have been made to those who served on the Joint Committee and what an important factor it has been in their thinking. I have said on many occasions that one of the most fulfilling times of my parliamentary life was serving on the Joint Committee for the Communications Act 2003. The interesting thing was that we had no real idea of what was coming down the track as far as the internet was concerned, but we did set up Ofcom. At that time, a lot of the pundits and observers were saying, “Murdoch’s lawyers will have these government regulators for breakfast”. Well, they did not. Ofcom has turned into a regulator to which—at some stages this has slightly worried me—the Government turn with almost any problem facing them, saying, “We’ll give it to Ofcom”. It has certainly proved that it can regulate across a vast area and with great skill. I have every confidence that the noble Lord, Lord Grade, will take that forward.

Perhaps it is to do with the generation I come from, but I do not have this fear of regulation or government intervention. In some ways, the story of my life is that of government intervention. If I am anybody’s child, I am Attlee’s child—not just because of the reforms of the Labour Party but also because of the reforms of the coalition Government, the Butler Education Act and the bringing in of the welfare state. So I am not afraid of government and Parliament taking responsibility for addressing real dangers.

In bringing forward this amendment, along with my colleague the noble Lord, Lord Lipsey, who cannot be here today, I am referring to legislation that is 20 years old. That is a warning to newcomers; it could be another 20 years before parliamentary time is found for a Bill of this complexity, so we want to be sure that we get its scope right.

The Minister said recently that the Bill is primarily a child safety Bill, but it did not start off that way. Five years ago, the online harms White Paper was seen as a pathfinder and trailblazer for broader legislation. Before we accept the argument that the Bill is now narrowed down to more specific terms, we should think about whether there are other areas that still need to be covered.

These amendments are in the same spirit as those in the names of the noble Baronesses, Lady Stowell, Lady Bull, and Lady Featherstone. We seek to reinstate an adult risk assessment duty because we fear that the change in title signals a reduction in scope and a retreat from the protections which earlier versions of the Bill intended to provide.

It was in this spirit, and to enable us to get ahead of the game, that in 2016 I proposed a Private Member’s Bill on this subject: the Online Harms Reduction Regulator (Report) Bill, which asked Ofcom to publish, in advance of the anticipated legislation, assessments of what action was needed to reduce harm to users and wider society from social networks. I think we can all agree that, if that work had been done in advance of the main legislation, such evidence would be very useful now.

I am well aware that there are those who, in the cause of some absolute concepts of freedom, believe that to seek to broaden the scope of the Bill takes us into the realms of the nanny state. But part of the social contract which enables us to survive in this increasingly complex world is that the ordinary citizen, who is busy struggling with the day-to-day challenges of normal life, does trust his Government and Parliament to keep an anticipatory weather eye on what is coming down the track and what dangers lie therein for them.

When there have been game-changing advances in technology in the past, it has often taken a long time for societies to adapt and adjust. The noble Lord, Lord Moylan, referred to the invention of the printing press. That caused the Reformation, the Industrial Revolution and around 300 years of war, so we have to be careful how we handle these technological changes. Instagram was founded in 2010, and the iPhone 4 was released then too. One eminent social psychologist wrote:

“The arrival of smartphones rewired social life.”


It is not surprising that liberal democracies, with their essentially 18th-century construct of democracy, struggle to keep up.

The record of big tech over the last 20 years has, yes, included an amazing leap in access to information. However, that quantum leap has come with a social cost in almost every aspect of our lives. Nevertheless, I refuse to accept the premise that these technologies are too global and too powerful in their operation to come within the reach of any single jurisdiction or the rule of law. I am more impressed by efforts by big tech companies to identify and deal with real harms than I am by threats to quit this or that jurisdiction if they do not get the light-touch regulation they want so as to be able to maximise profits.

We know by their actions that some companies and individuals simply do not care about their social responsibilities or the impact of what they sell and how they sell it on individuals and society as a whole. That is why the social contract in our liberal democracies means a central role for Parliament and government in bringing order and accountability into what would otherwise become a jungle. That is why, over the last 200 years, Parliament has protected its citizens from the bad behaviour of employers, banks, loan sharks, dodgy salesmen, insanitary food, danger at work and so on. In this new age, we know that companies large and small, British and foreign, can, through negligence, indifference or malice, drive innocent people into harmful situations. The risks that people face are complex and interlocking; they cannot be reduced to a simple list, as the Government seek to do in Clause 12.

When I sat on the pre-legislative committee in 2003, we could be forgiven for not fully anticipating the tsunami of change that the internet, the world wide web and the iPhone were about to bring to our societies. That legislation did, as I said, establish Ofcom with a responsibility to promote media literacy, which it has only belatedly begun to take seriously. We now have no excuse for inaction or for drawing up legislation so narrowly that it fails to deal with the wide risks that might befall adults in the synthetic world of social media.

We have tabled our amendments not because they will solve every problem or avert every danger but because they would be a step in the right direction and so make this a better Bill.

Baroness Stowell of Beeston Portrait Baroness Stowell of Beeston (Con)
- View Speech - Hansard - -

I am very grateful to the noble Lord, Lord McNally, for namechecking me and the amendments I have tabled with the support of the noble Baronesses, Lady Featherstone and Lady Bull, although I regret to inform him that they are not in this group. I understand where the confusion has come from. They were originally in this group, but as it developed I felt that my amendments were no longer in the right place. They are now in the freedom of expression group, which we will get to next week. What he has just said has helped, because the amendments I am bringing forward are not similar to the ones he has tabled. They have a very different purpose. I will not pre-empt the debate we will have when we get to freedom of expression, but I think it is only proper that I make that clear. I am very grateful to the noble Lord for the trail.

Debate on Amendment 33B adjourned.

Online Safety Bill

Baroness Stowell of Beeston Excerpts
Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

I ask the Committee to have a level of imagination here because I have been asked to read the speech of the noble Viscount, Lord Colville—

Baroness Stowell of Beeston Portrait Baroness Stowell of Beeston (Con)
- View Speech - Hansard - -

I do not know who advised the noble Baroness—and forgive me for getting up and getting all former Leader on her—but this is a practice that we seem to have adopted in the last couple of years and that I find very odd. It is perfectly proper for the noble Baroness to deploy the noble Viscount’s arguments, but to read his speech is completely in contravention of our guidance.

Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

I beg the pardon of the Committee. I asked about it and was misinformed; I will do as the noble Baroness says.

The noble Viscount, Lord Colville, is unable to be with us. He put his name to Amendments 273, 275, 277 and 280. His concern is that the Bill sets the threshold for illegality too low and that in spite of the direction provided by Clause 170, the standards for determining illegality are too vague.

I will make a couple of points on that thought. Clause 170(6) directs that a provider must have

“reasonable grounds to infer that all elements necessary for the commission of the offence, including mental elements, are present or satisfied”,

but that does not mean that the platform has to be certain that the content is illegal before it takes it down. This is concerning when you take it in combination with the question of what, or who, will make judgments on illegality.

If a human moderator makes the decision, how much information they gather in order to make that judgment will depend on the resources and time available to them. Unlike in a court case, where a wide range of information and context can be gathered, those resources are very rarely available for decisions about content online; human moderators have a vast amount of content to get through.

If an automated system makes the judgment, it is very well established that algorithms are not good at context—the Communications and Digital Committee took evidence on this repeatedly when I was on it. AI simply uses the information available in the content itself to make a decision, which can lead to significant missteps. Clause 170(3) provides the requirement for the decision-makers to judge whether there is a defence for the content. In the context of algorithms, it is very unclear how they will come to such a judgment from the content itself.

I understand that these are probing amendments, but I think the concern is that the vagueness of the definition will lead to too much content being taken down. This concern was supported by Parliament’s Joint Committee on Human Rights, which wrote to the former Culture Secretary, Nadine Dorries, on that matter. I apologise again.

--- Later in debate ---
Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Hansard - - - Excerpts

That is very helpful.

I am concerned that removing so-called illegal content for the purpose of complying with the regulatory system covers not only that which reaches conviction in a criminal court but possibly anything that a platform determines could be illegal, and therefore it undermines our own legal system. As I have said, that marks a significant departure from the rule of law. It seems that the state is asking or mandating private companies to make determinations about what constitutes illegality.

The obligations on a platform to determine what constitutes illegality could obviously become a real problem, particularly in relation to limitations on free expression. As we have already heard, the Public Order Act 1986 criminalises, for example, those who stir up hatred through the use of words, behaviour or written material. That is contentious in the law offline. By “contentious”, I mean that it is a matter of difficulty that requires the full rigour of the criminal justice system, understanding the whole history of established case law. That is all necessary to secure a conviction under that law for offences of this nature.

Now we appear to be saying that, without any of that, social media companies should make the decision, which is a nerve-racking situation to be in. We have already heard the slippery phrase “reasonable grounds to infer”. If that were the basis on which you were sent to prison—if they did not have to prove that you were guilty but they had reasonable grounds to infer that you might be, without any evidence—I would be worried, yet reasonable grounds to infer that the content could be illegal is the basis on which we are asking for those decisions to be made. That is significantly below the ordinary burden of proof required to determine that an illegal act has been committed. Under this definition, I fear that platforms will be forced to over-remove and censor what ultimately will be entirely lawful speech.

Can the Minister consider what competency social media companies have to determine what is lawful? We have heard some of the dilemmas from somebody who was in that position—let alone the international complications, as was indicated. Will all these big tech companies have to employ lots of ex-policemen and criminal lawyers? How will it work? It seems to me that there is a real lack of qualifications in that sphere—that is not a criticism, because those people decided to work in big tech, not in criminal law, and yet we are asking them to pursue this. That is a concern.

I will also make reference to what I think are the controversies around government Amendments 136A and 136B to indicate the difficulties of these provisions. They concern illegal activity—such as “assisting unlawful immigration”, illegal entry, human trafficking and similar offences—but I am unsure as to how this would operate. While it is the case that certain means of entry to the UK are illegal, I suddenly envisage a situation where a perfectly legitimate political debate—for example, about the small boats controversy—would be taken down, and where people advocating a position against the Government’s new Illegal Migration Bill could be accused of supporting illegality. What exactly will be made illegal in those amendments to the Online Safety Bill?

The noble Baroness, Lady Buscombe, made a fascinating speech about an interesting group of amendments. Because of the way the amendments are grouped, I feel that we have moved to a completely different debate, so I will not go into any detail on this subject. Anonymous trolling, Twitter storms and spreading false information are incredibly unpleasant. I am often the recipient of them—at least once a week—so I know personally that you feel frustrated that people tell lies and your reputation is sullied. However, I do not think that these amendments offer the basis on which that activity should be censored, and I will definitely argue against removing anonymity clauses—but that will be in another group. It is a real problem, but I do not think that the solution is contained in these amendments.

Baroness Stowell of Beeston Portrait Baroness Stowell of Beeston (Con)
- View Speech - Hansard - -

My Lords, my contribution will be less officious than my intervention earlier in this group. In the last couple of years since I returned to the House—as I describe it—having spent time at the Charity Commission, I have noticed a new practice emerging of noble Lords reading out other people’s speeches. Every time I had seen it happen before, I had not said anything, but today I thought, “I can’t sit here and not say anything again”. I apologise for my intervention.

I am grateful to my noble friend Lord Moylan for bringing forward his amendments and for introducing them in the incredibly clear way he did; they cover some very complex and diverse issues. I know that there are other amendments in the group which might be described as similar to his.

There are a couple of things I want to highlight. One interesting thing about the debate on this group is the absence of some of our legal friends—I apologise to my noble friend Lady Buscombe, who is of course a very distinguished lawyer. The point I am making is that we are so often enriched by a lot of legal advice and contributions on some of the more challenging legal issues that we grapple with, but we do not have that today, and this is a very difficult legal issue.

It is worth highlighting again, as has been touched on a little in some of the contributions, the concern, as I understand it, with how the Bill is drafted in relation to illegal content and the potential chilling effect of these clauses on social media platforms. As has already been said, there is a concern that it might lead them to take a safety-first approach in order to avoid breaking the law and incurring the sanctions and fines that come with the Bill, which Ofcom will have the power to apply. That is the point we are concerned with here. It is the way in which this is laid out, and people who are much better equipped than I am have already explained the difference between evidence and reasonable grounds to infer.

What the noble Lord, Lord Allan, hit on in his contribution is also worth taking into account, and that is the role of Ofcom in this situation. One of the things I fear, as we move into an implementation phase and the consequences of the Bill start to impact on the social media firms, is the potential for the regulator to be weaponised in a battle on the cultural issues that people are becoming increasingly exercised about. I do not have an answer to this, but I think it is important to understand the danger of where we might get to in the expectations of the regulator if we create a situation where the social media platforms are acting in a way that means people are looking for recourse, or for a place to further an argument and a battle, which will not be helpful at all.

I am not entirely sure, given my lack of legal expertise—this is why I would have been very grateful for some legal expertise on this group—whether what my noble friend is proposing in his amendments is the solution, but I think we need to be very clear that this is a genuine problem. I am not sure, as things stand in the Bill, that we should be comfortable that it is not going to create problems. We need to find a way to be satisfied that this has been dealt with properly.

Lord Bethell Portrait Lord Bethell (Con)
- View Speech - Hansard - - - Excerpts

It is a great honour to follow my noble friend. I completely agree with her that this is a powerful discussion and there are big problems in this area. I am grateful also to my noble friend Lord Moylan for raising this in the first place. It has been a very productive discussion.

I approach the matter from a slightly different angle. I will not talk about the fringe cases—the ones where there is ambiguity, difficulty of interpretation, or responsibility or regulatory override, all of which are very important issues. The bit I am concerned about is where primary priority content that clearly demonstrates some kind of priority offence is not followed up by the authorities at all.

The noble Lord, Lord Allan, referred to this point, although he did slightly glide over it, as though implying, if I understood him correctly, that this was not an area of concern because, if a crime had clearly been committed, it would be followed up on. My fear and anxiety is that the history of the internet over the last 25 years shows that crimes—overt and clear crimes that are there for us to see—are very often not followed up by the authorities. This is another egregious example of where the digital world is somehow exceptionalised and does not have real-world rules applied to it.

Online Safety Bill

Baroness Stowell of Beeston Excerpts
Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Hansard - - - Excerpts

The noble Lord has concluded with my conclusion, which was to say that those services will be driven out, but not because they are irresponsible around horrible, dangerous messages. They do not read our messages because they are private. However, if we ever receive anything that makes us feel uncomfortable, they should be put under pressure to act. Many of them already do and are actually very responsible, but that is different from demanding that they scan our messages and we breach that privacy.

Baroness Stowell of Beeston Portrait Baroness Stowell of Beeston (Con)
- View Speech - Hansard - -

My Lords, that last exchange was incredibly helpful. I am grateful to the noble Lord, Lord Allan, for what he just said and the way in which he introduced this group. I want to make only a few brief remarks.

I have put my name to two amendments in this group: Amendment 202 in the name of the noble Lord, Lord Stevenson, which seeks to ensure that Ofcom will be subject to the same kind of requirements and controls as exist under the Regulation of Investigatory Powers Act before issuing a technology notice

“to a regulated service which offers private messaging with end-to-end encryption”;

and Amendment 285, also in the name of the noble Lord, Lord Stevenson, and that of the noble Lord, Lord Clement-Jones. This amendment would make sure that no social media platforms or private end-to-end messaging services have an obligation generally to monitor what is going on across their platforms. When I looked at this group and the various amendments in it, those were the two issues that I thought were critical. These two amendments seemed to approach them in the most simple and straightforward manner.

Like other noble Lords, my main concern is that I do not want search and social media platforms to have an obligation to become what we might describe as thought police. I do not want private messaging firms to start collecting and storing the content of our messages so that they have what we say ready to hand over in case they are required to do so. What the noble Lord, Lord Allan, just said is an important point to emphasise. Some of us heard from senior representatives from WhatsApp a few weeks ago. I was quite surprised to learn how much they are doing in this area to co-operate with the authorities; I felt very reassured to learn about that. I in no way want to discourage that, because they are doing an awful lot of good stuff.

Basically, this is such a sensitive matter, as has been said, that it is important for the Government to make their policy intentions clear in the Bill. If they do not intend to require general monitoring, that needs to be made explicit. It is also important that, if Ofcom is to be given new investigatory powers or powers to insist on things through these technology notices, it is clear that its powers do not go beyond those that are already set out in law. As we have heard from noble Lords, there is widespread concern about this matter not just from the social media platforms and search engines themselves but from news organisations, journalists and those lobby groups that often speak out on liberty-type matters. These topics go across a wide range of interest groups, so I very much hope that my noble friend the Minister will be able to respond constructively and open-mindedly on them.

Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, I was not intending to intervene on this group because my noble friend Lord Stevenson will address these amendments in their entirety, but listening in to this public conversation about this group of amendments has stimulated a question that I want both to put on the record and to give the Minister time to reflect on.

If we get the issues of privacy and encrypted messaging wrong, it will push more people into using VPN—virtual private network—services. I went into the app store on my phone to search for VPN software. There is nothing wrong with such software—our parliamentary devices have it to do general monitoring and make sure that we do not use services such as TikTok—but it is used to circumvent much of the regulatory regime that we are seeking to put together through this Bill. When I search for VPNs in the app store, the first one that comes up that is not a sponsored, promoted advertisement has an advisory age limit of four years old. Several of them are the same; some are 17-plus but most are four-plus. Clearly, the app promotes itself very much on the basis that it offers privacy and anonymity, which are the key features of a VPN. However, a review of it says, “I wouldn’t recommend people use this because it turns out that this company sends all its users’ data to China so that it can do general monitoring”.

I am not sure how VPNs are being addressed by the Bill, even though they seem really pertinent to the issues of privacy and encryption. I would be interested to hear whether, and if so how, we are bringing the misuse of VPNs into scope for regulation by Ofcom.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I will certainly take that point away and I understand, of course, that different Acts require different duties of the same platforms. I will take that away and discuss it with colleagues in other departments who lead on investigatory powers.

Baroness Stowell of Beeston Portrait Baroness Stowell of Beeston (Con)
- Hansard - -

Before my noble friend moves on, when he is reviewing that back in the office, could he also satisfy himself that the concerns coming from the journalism and news organisations in the context of RIPA are also understood and have been addressed? That is another angle which, from what my noble friend has said so far, I am not sure has really been acknowledged. That is not a criticism but it is worth him satisfying himself on it.

Online Safety Bill

Baroness Stowell of Beeston Excerpts
Lord Bishop of Oxford Portrait The Lord Bishop of Oxford
- View Speech - Hansard - - - Excerpts

My Lords, it is a pleasure to follow the two noble Baronesses. I remind the Committee of my background as a board member of the Centre for Data Ethics and Innovation. I also declare an indirect interest, as my oldest son is the founder and studio head of Mediatonic, which is now part of Epic Games and is the maker of “Fall Guys”, which I am sure is familiar to your Lordships.

I speak today in support of Amendments 2 and 92 and the consequent amendments in this group. I also support the various app store amendments proposed by the noble Baroness, Lady Harding, but I will not address them directly in these remarks.

I was remarkably encouraged on Wednesday by the Minister’s reply to the debate on the purposes of the Bill, especially by the priority that he and the Government gave to the safety of children as its primary purpose. The Minister underlined this point in three different ways:

“The main purposes of the Bill are: to give the highest levels of protection to children … The Bill will require companies to take stringent measures to tackle illegal content and protect children, with the highest protections in the Bill devoted to protecting children … Children’s safety is prioritised throughout this Bill”.—[Official Report, 19/4/23; col. 724.]


The purpose of Amendments 2 and 92 and consequent amendments is to extend and deepen the provisions in the Bill to protect children against a range of harms. This is necessary for both the present and the future. It is necessary in the present because of the harms to which children are exposed through a broad range of services, many of which are not currently in the Bill’s scope. Amendment 2 expands the scope to include any internet service that meets the child user condition and enables or promotes harmful activity and content as set out in the schedule provided. Why would the Government not take this step, given the aims and purposes of the Bill to give the highest protection to children?

Every day, the diocese of Oxford educates some 60,000 children in our primary and secondary schools. Almost all of them have or will have access to a smartphone, either late in primary, hopefully, or early in secondary school. The smartphone is a wonderful tool to access educational content, entertainment and friendship networks, but it is also a potential gateway for companies, children and individuals to access children’s inner lives, in secret, in the dead of night and without robust regulation. It therefore exposes them to harm. Sometimes that harm is deliberate and sometimes unintentional. This power for harm will only increase in the coming years without these provisions.

The Committee needs to be alert to generational changes in technology. When I was 16, in secondary school in Halifax, I did a computer course in the sixth form. We had to take a long bus ride to the computer building at Huddersfield University. The computer filled several rooms in the basement. The class learned how to program using punch cards. The answers to our questions came back days later, on long screeds of printed paper.

When my own children were teenagers and my oldest was 16, we had one family computer in the main living room of the house. The family was able to monitor usage. Access to the internet was possible, but only through a dial-up modem. The oldest of my grandchildren is now seven, and many of his friends already have smartphones. In a few years, he will certainly carry a connected device in his pocket and, potentially, have access to the entire internet 24/7.

I want him and millions of other children to have the same protection online as he enjoys offline. That means recognising that harms come in a variety of shapes and sizes. Some are easy to spot, such as pornography. We know the terrible damage that porn inflicts on young lives. Some are more insidious and gradual: addictive behaviours, the promotion of gambling, the erosion of confidence, grooming, self-harm and suicidal thoughts, encouraging eating disorders, fostering addiction through algorithms and eroding the barriers of the person.

The NSPCC describes many harms to children on social networks that we are all now familiar with, but it also highlights online chat, comments on livestream sites, voice chat in games and private messaging among the vectors for harm. According to Ofcom, nine in 10 children in the UK play video games, and they do so on devices ranging from computers to mobile phones to consoles. Internet Matters says that most children’s first interaction with someone they do not know online is now more likely to be in a video game such as “Roblox” than anywhere else. It also found that parents underestimate the frequency with which their children are contacted by strangers online.

The Gambling Commission has estimated that 25,000 children in the UK aged between 11 and 16 are problem gamblers, with many of them introduced to betting via computer games and social media. Families have been left with bills, sometimes of more than £3,000, after uncontrolled spending on loot boxes.

Online companies, we know, design their products with psychological principles of engagement firmly in view, and then refine their products by scraping data from users. According to the Information Commissioner, more than 1 million underage children could have been allowed to use TikTok alone, with the platform collecting and using their personal data.

As the noble Baroness, Lady Kidron, has said, we already have robust and tested definitions of scope in the ICO’s age-appropriate design code—definitions increasingly taken up in other jurisdictions. To give the highest protection to children, we need to build on these secure definitions in this Bill and find the courage to extend robust protection across the internet now.

We also need to future-proof this Bill. These key amendments would ensure that any development, any new kind of service not yet imagined which meets the child user condition and enables or promotes harmful activity and content, would be in scope. This would give Ofcom the power to develop new guidance and accountabilities for the applications that are certain to come in the coming years.

We have an opportunity and a responsibility, as the Minister has said, to build the highest protection into this Bill. I support the key amendments standing in my name.

Baroness Stowell of Beeston Portrait Baroness Stowell of Beeston (Con)
- View Speech - Hansard - -

My Lords, first, I beg the indulgence of the Committee to speak briefly at this juncture. I know that no one from the Lib Dem or Labour Benches has spoken yet, but I need to dash over to the Moses Room to speak to some amendments I am moving on the Bill being considered there. Secondly, I also ask the Committee that, if I do not get back in time for the wind-ups, I be forgiven on this occasion.

I simply wanted to say something briefly in support of Amendments 19, 22, 298 and 299, to which I have added my name. My noble friend Lady Harding has already spoken to them comprehensively, so there is little I want to add; I just want to emphasise a couple of points. But first, if I may, I will pick up on something the right reverend Prelate said. I think I am right in saying that the most recent Ofcom research shows that 57% of seven year-olds such as his grandchild have their own phone, and by the time children reach the age of 12 they pretty much all have their own phone. One can only imagine that the age at which children possess their own device is going to get lower.

Turning to app stores, with which these amendments are concerned, currently it is the responsibility of parents and developers to make sure that children are prevented from accessing inappropriate content. My noble friend’s amendments do not dilute in any way the responsibility that should be held by those two very important constituent groups. All we are seeking to do is ensure that app stores, which are currently completely unregulated, take their share of responsibility for making sure that those seeking to download and then use such apps are in the age group the apps are designed for.

As has already been very powerfully explained by my noble friend and by the noble Baroness, Lady Kidron, different age ratings are being given by the two different app stores right now. It is important for us to understand, in the context of the Digital Markets, Competition and Consumers Bill, which is being introduced to Parliament today—I cannot tell noble Lords how long we have waited for that legislation and how important it is, not least because it will open up competition, particularly in app stores—that the more competition there is across app stores, the doorways through which children go to purchase or download apps, the more important it is that there is consistency and some regulation. That is why I support my noble friend and was very happy to add my name to her amendments.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - - - Excerpts

My Lords, I have had a helpful reminder about declarations of interest. I once worked for Facebook; I divested myself of any financial interest back in 2020, but of course a person out there may think that what I say today is influenced by the fact that I previously took the Facebook shilling. I want that to be on record as we debate the Bill.

Baroness Stowell of Beeston Portrait Baroness Stowell of Beeston (Con)
- View Speech - Hansard - -

My Lords, I have not engaged with this amendment in any particular detail—until the last 24 hours, in fact. I thought that I would come to listen to the debate today and see if there was anything that I could usefully contribute. I have been interested in the different points that have been raised so far. I find myself agreeing with some points that are perhaps in tension or conflict with each other. I emphasise from the start, though, my complete respect for the Joint Committee and the work that it did in the pre-legislative scrutiny of the Bill. I cannot compare my knowledge and wisdom on the Bill with that of those who, as has already been said, have spent so much intensive time thinking about it in the way that they did at that stage.

Like my noble friend Lady Harding, I always have a desire for clarity of purpose. It is critical for the success of any organisation, or anything that we are trying to do. As a point of principle, I like the idea of setting out at the start of this Bill its purpose. When I looked through the Bill again over the last couple of weeks in preparation for Committee, it was striking just how complicated and disjointed a piece of work it is, and how very difficult it is to follow.

There are many reasons why I am sympathetic towards the amendment. I can see why bringing together at the beginning of the Bill what are currently described as “Purposes” might help it to meet its overall aims. But that brings me to some of the points that the noble Baroness, Lady Fox, has just made. The Joint Committee’s report recommends that the objectives of the Bill

“should be that Ofcom should aim to improve online safety for UK citizens by ensuring that service providers”—

it then set out objectives aimed at Ofcom, rather than purposes of the Bill itself.

I was also struck by what the noble Lord, Lord Allan, said about what we are looking for. Are we looking for regulation of the type that we would expect of airlines, or of the kind we would expect from the car industry? If we are still asking that question, that is very worrying. I think we are looking for something akin to the car industry model as opposed to the airline model. I would be very grateful if my noble friend the Minister was at least able to give us some assurance on that point.

If I were to set out a purpose of the Bill at the beginning of the document, I would limit myself to what is currently in proposed new subsection (1)(g), which is

“to secure that regulated internet services operate with transparency and accountability in respect of online safety”.

That is all I would say, because that, to me, is what this Bill is trying to do.

The other thing that struck me when I looked at this—I know that there has been an approach to this legislation that sought to adopt regulation that applies to the broadcasting world—was the thought, “Somebody’s looked at the BBC charter and thought, well, they’ve got purposes and we might adopt a similar sort of approach here.” The BBC charter and the purposes set out in it are important and give structure to the way the BBC operates, but they do not give the kind of clarity of purpose that my noble friend Lady Harding is seeking—which I too very much support and want to see—because there is almost too much there. That is my view on where to start when setting out a very simple statement of purpose for this Bill.

Baroness Benjamin Portrait Baroness Benjamin (LD)
- View Speech - Hansard - - - Excerpts

My Lords, this day has not come early enough for me. I am pleased to join others in embarking on the Committee stage of the elusive Online Safety Bill, where we will be going on an intrepid journey, as we have heard so far. Twenty years ago, while I was on the Ofcom content board, I pleaded for the internet to be regulated, but was told that it was mission impossible. So this is a day I feared might not happen, and I thank the Government for making it possible.

I welcome Amendment 1, in the names of the noble Lords, Lord Stevenson, Lord Clement-Jones, and others. It does indeed encapsulate the overarching purpose of the Bill. But it also sets out the focus of what other amendments will be needed if the Bill is to achieve the purpose set out in that amendment.

The Bill offers a landmark opportunity to protect children online, and it is up to us to make sure that it is robust, effective and evolvable for years to come. In particular, I welcome subsection (1)(a) and (b) of the new clause proposed by Amendment 1. Those paragraphs highlight an omission in the Bill. If the purposes set out in them are to be met, the Bill needs to go much further than it currently does.

Yes, the Bill does not go far enough on pornography. The amendment sets out a critical purpose for the Bill: children need a “higher level of protection”. The impact that pornography has on children is known. It poses a serious risk to their mental health and their understanding of consent, healthy sex and relationships. We know that children as young as seven are accessing pornographic content. Their formative years are being influenced by hardcore, abusive pornography.

As I keep saying, childhood lasts a lifetime, so we need to put children first. This is why I have dedicated my life to the protection of children and their well-being. This includes protection from pornography, where I have spent over a decade campaigning to prevent children easily accessing online pornographic content.

I know that others have proposed amendments that will be debated in due course which meet this purpose. I particularly support the amendments in the names of the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell. Those amendments meet the purpose of the Bill by ensuring that children are protected from pornographic content wherever it is found through robust, anonymous age verification that proves the user’s age beyond reasonable doubt.

Online pornographic content normalises abusive sexual acts, with the Government’s own research finding

“substantial evidence of an association between the use of pornography and harmful sexual attitudes and behaviours towards women”

and children. This problem is driven largely by the types of content that are easily available online. Pornography is no longer the stereotype that we might imagine from the 1970s and 1980s. It is now vicious, violent and pervasive. Content that would be prohibited offline is readily available online for free with just a few clicks. The Online Safety Bill comes at a crucial moment to regulate online pornography. That is why I welcome the amendment introducing a purpose to the Bill that ensures that internet companies “comply with UK law”.

We have the Obscene Publications Act 1959 and UK law does not allow the offline distribution of material that sexualises children—such as “barely legal” pornography, where petite-looking adult actors are made to look like children—content which depicts incest and content which depicts sexual violence, including strangulation. That is why it is important that the Bill makes that type of material illegal online as well. Such content poses a high risk to children as well as women and girls. There is evidence that such content acts as a gateway to more hardcore material, including illegal child sexual abuse material. Some users spiral out of control, viewing content that is more and more extreme, until the next click is illegal child sexual abuse material, or even going on to contact and abuse children online and offline.

My amendment would require service providers to exclude from online video on-demand services any pornographic content that would be classified as more extreme than R18 and that would be prohibited offline. This would address the inconsistency between online and offline regulation of pornographic content—

--- Later in debate ---
Viscount Stansgate Portrait Viscount Stansgate (Lab)
- View Speech - Hansard - - - Excerpts

If I may, I will prevail upon the noble Lord, Lord Clement-Jones, to wait just another few seconds before beginning his winding-up speech. I have found this an extremely interesting and worthwhile debate, and there seems to be an enormous amount of consensus that the amendment is a good thing to try to achieve. It is also true that this is a very complex Bill. My only point in rising is to say to the Minister—who is himself about to speak, telling us why the Government are not going to accept Amendment 1—that, as a result of the very long series of debates we are going to have on this Bill over a number of days, perhaps the Government might still be able, at the end of this very long process, to rethink the benefits of having an amendment of this kind at the beginning of the Bill. I hope that, just because he is going to ask for the amendment to be withdrawn today, he will not lose sight of the benefits of such an amendment.

Baroness Stowell of Beeston Portrait Baroness Stowell of Beeston (Con)
- View Speech - Hansard - -

My Lords, just before the noble Lord, Lord Clement-Jones, gets to wind up, I wanted to ask a question and make a point of clarification. I am grateful for the contribution from the noble Baroness, Lady Chakrabarti; that was a helpful point to make.

My question, which I was going to direct to the noble Lord, Lord Stevenson—although it may be one that the noble Lord, Lord Clement-Jones, wants to respond to if the noble Lord, Lord Stevenson, is not coming back—is about the use of the word “purpose” versus “objective”. The point I was trying to make in referring to the Joint Committee’s report was that, when it set out the limbs of this amendment, it was referring to them as objectives for Ofcom. What we have here is an amendment that is talking about purposes of the Bill, and in the course of this debate we have been talking about the need for clarity of purpose. The point I was trying to make was not that I object to the contents of this amendment, but that if we are looking for clarity of purpose to inform the way we want people to behave as a result of this legislation, I would make it much shorter and simpler, which is why I pointed to subsection (g) of the proposed clause.

It may be that the content of this amendment—and this is where I pick up the point the noble Baroness, Lady Chakrabarti, was making—is not objectionable, although I take the point made by the noble Baroness, Lady Fox. However, the noble Baroness, Lady Chakrabarti, is right: at the moment, let us worry less about the specifics. Then, we can be clearer about what bits of the amendment are meant to be doing what, rather than trying to get all of them to offer clarity of purpose. That is my problem with it: there are purposes, which, as I say, are helpful structurally in terms of how an organisation might go about its work, and there is then the clarity of purpose that should be driving everything. The shorter, simpler and more to the point we can make that, the better.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

My Lords, I thank the noble Baroness. I hope I have not appeared to rush the proceedings, but I am conscious that there are three Statements after the Bill. I thank the noble Lord, Lord Stevenson, for tabling this amendment, speaking so cogently to it and inspiring so many interesting and thoughtful speeches today. He and I have worked on many Bills together over the years, and it has been a real pleasure to see him back in harness on the Opposition Front Bench, both in the Joint Committee and on this Bill. Long may that last.

It has been quite some journey to get to this stage of the Bill; I think we have had four Digital Ministers and five Prime Ministers since we started. It is pretty clear that Bismarck never said, “Laws are like sausages: it’s best not to see them being made”, but whoever did say it still made a very good point. The process leading to today’s Bill has been particularly messy, with Green and White Papers; a draft Bill; reports from the Joint Committee and Lords and Commons Select Committees; several versions of the Bill itself; and several government amendments anticipated to come. Obviously, the fact that the Government chose to inflict last-minute radical surgery on the Bill to satisfy what I believe are the rather unjustified concerns of a small number in the Government’s own party made it even messier.

It is extremely refreshing, therefore, to start at first principles, as the noble Lord, Lord Stevenson, has done. He has outlined them and the context in which we should see them—namely, we should focus essentially on the systems, what is readily enforceable and where safety by design and transparency are absolutely the essence of the purpose of the Bill. I share his confidence in Ofcom and its ability to interpret those purposes. I say to the noble Baroness, Lady Stowell, that I am not going to dance on the heads of too many pins about the difference between “purpose” and “objective”. I think it is pretty clear what the amendment intends, but I do have a certain humility about drafting; the noble Baroness, Lady Chakrabarti, reminded us of that. Of course, one should always be open to change and condensation of wording if we need to do that. But we are only at Amendment 1 in Committee, so there is quite a lot of water to flow under the bridge.

It is very heartening that there is a great deal of cross-party agreement about how we must regulate social media going forward. These Benches—and others, I am sure—will examine the Bill extremely carefully and will do so in a cross-party spirit of constructive criticism, as we explained at Second Reading. Our Joint Committee on the draft Bill exemplified that cross-party spirit, and I am extremely pleased that all four signatories to this amendment served on the Joint Committee and readily signed up to its conclusions.

Right at the start of our report, we made a strong case for the Bill to set out these core objectives, as the noble Lord, Lord Stevenson, has explained, so as to provide clarity—that word has been used around the Committee this afternoon—for users and regulators about what the Bill is trying to achieve and to inform the detailed duties set out in the legislation. In fact, I believe that the noble Lord, Lord Stevenson, has improved on that wording by including a duty on the Secretary of State, as well as Ofcom, to have regard to the purposes.

We have heard some very passionate speeches around the Committee for proper regulation of harms on social media. The case for that was made eloquently to the Joint Committee by Ian Russell and by witnesses such as Edleen John of the FA and Frances Haugen, the Facebook whistleblower. A long line of reports by Select Committees and all-party groups have rightly concluded that regulation is absolutely necessary given the failure of the platforms even today to address the systemic issues inherent in their services and business models.

The introduction to our Joint Committee report makes it clear that without the original architecture of a duty of care, as the White Paper originally proposed, we need an explicit set of objectives to ensure clarity for Ofcom when drawing up the codes and when the provisions of the Bill are tested in court, as they inevitably will be. Indeed, in practice, the tests that many of us will use when judging whether to support amendments as the Bill passes through the House are inherently bound up with these purposes, several of which many of us mentioned at Second Reading. Decisions may need to be made on balancing some of these objectives and purposes, but that is the nature of regulation. I have considerable confidence, as I mentioned earlier, in Ofcom’s ability to do this, and those seven objectives—as the right reverend Prelate reminded us, the rule of seven is important in other contexts—set that out.

In their response to the report published more than a year ago, the Government repeated at least half of these objectives in stating their own intentions for the Bill. Indeed, they said:

“We are pleased to agree with the Joint Committee on the core objectives of the Bill”,


and, later:

“We agree with all of the objectives the Joint Committee has set out, and believe that the Bill already encapsulates and should achieve these objectives”.


That is exactly the point of dispute: we need this to be explicit, and the Government seem to believe that it is implicit. Despite agreeing with those objectives, at paragraph 21 of their response the Government say:

“In terms of the specific restructure that the Committee suggested, we believe that using these objectives as the basis for Ofcom’s regulation would delegate unprecedented power to a regulator. We do not believe that reformulating this regulatory framework in this way would be desirable or effective. In particular, the proposal would leave Ofcom with a series of high-level duties, which would likely create an uncertain and unclear operating environment”.


That is exactly the opposite of what most noble Lords have been saying today.

It has been an absolute pleasure to listen to so many noble Lords across the Committee set out their ambitions for the Bill and their support for this amendment. It started with the noble Baroness, Lady Kidron, talking about this set of purposes being the “North Star”. I pay tribute to her tireless work, which drove all of us in the Joint Committee on in an extremely positive way. I am not going to go through a summing-up process, but what my noble friend had to say about the nature of the risk we are undertaking and the fact that we need to be clear about it was very important. The whole question of clarity and certainty for business and the platforms, in terms of making sure that they understand the purpose of the Bill—as the noble Baroness, Lady Harding, and many other noble Lords mentioned—is utterly crucial.

If noble Lords look at the impact assessment, they will see that the Government seem to think the cost of compliance is a bagatelle—but, believe me, it will not be. It will be a pretty expensive undertaking to train people in those platforms, across social media start-ups and so on to understand the nature of their duties.