(5 months, 2 weeks ago)
Lords Chamber
I would like to add something to the constitutional points which were made by my noble friend Lord Lipsey. The appropriateness of dealing with this issue in wash-up is clearly in contention. The noble Lord, Lord Hunt of Wirral, said yesterday that the abolition of Section 40 is a clear commitment of the 2019 Conservative Party manifesto. I am afraid it is not clear: the sentence starts by saying that, but it then sets conditions. It provides additional text that confuses the issue and raises points which were dealt with in yesterday’s debate. I have read yesterday’s debate, and clearly questions have been raised about the accuracy of the information in that particular quote from the manifesto—I see the Minister disagrees. Claiming that it is clear is incorrect.
The second issue arises from the Salisbury convention about manifesto commitments. It is quite clear that this cannot be an essential commitment because the Government have had more than four years to deal with the matter, and they failed to do so. Bringing it up in the wash-up period is an insult to this House and an exploitation of the arrangements which have been made.
My Lords, I want only to reflect slightly on some of the comments that have been made about the tone of the debate, and in particular the attack on the noble Lord, Lord Pannick. I have been shocked by the tone of the debate against the points made by the noble Lord, Lord Pannick. The argument seems to be that this is a Tory conspiracy theory, that the Tories are in bed with the press barons, and that there is all sorts of skulduggery going on. I am genuinely shocked that this is being allowed to pass. I want to at least mention that there are some of us who worry about an authorised regulator, and the politicisation of regulation, who are not in bed with press barons. I spend most of my time reading newspapers that write rubbish about me, so I am not keen on press barons—let me put it that way. I also happen to believe in media independence and freedom, which is an important point.
I want the Bill to get on and pass, but, first, there was an earlier discussion about local TV news channels. For the sake of the public, and of accuracy: one noble Lord sitting close to me said that GM News—he meant GB News—is constantly played on local TV news channels. It is not GB News; it is TalkTV that is played on all those channels, including in Liverpool. I get it and watch it, and that is what is played.
Secondly, the accusation was made that Ofcom, because it is run by evil Tories, is not doing anything about GB News and the way that it presents itself. It is worth reading the papers and the press on this, because then we would all know that Ofcom is in fact accused of overregulating GB News for exactly the things that the noble Lord mentioned it was ignoring—to such an extent that GB News is beginning a formal legal process against Ofcom, which it considers to be overly political in its involvement in the channel’s editorial independence.
I make these statements of factual accuracy because what is at stake is not which political party you are in. We talked about Reithian principles before, but has anyone explored Reith’s politics at any point? He was not a socialist or a Lib Dem, but I agree that Reithian principles matter. I do not care who is arguing for press freedom or who is trying to overturn Section 40. I do not think that it is a conspiracy to establish an independent regulator for the media, which is not a threat to the British public. The threat to the British public is a politicised, misinformed, ill-informed discussion that tries to suggest that the only people who care about press freedom are working with press barons. That is nonsense.
(7 months, 3 weeks ago)
Lords Chamber
My Lords, I thank the noble Lord, Lord McNally, for this important discussion. Already, it has raised meaty topics. Yesterday, in the debate on the opposition of the noble Baroness, Lady Stowell, to foreign state ownership of the media, even those antagonistic to the politics of the Telegraph and the Spectator spoke passionately in support of media plurality.
While newspapers cover a wide range of political stances, in broadcasting there is a lot less viewpoint diversity, I would say, and we must ensure that any regulation does not narrow choice further. I am especially thinking of attitudes to three-year-old GB News. Love it or loathe it, the channel is surely a valuable shake-up of the media landscape, yet it has attracted disproportionate hostility from influential voices. It is, however, popular with growing audiences, 60% of whom are based in the north. Some 3.5 million viewers watch the TV channel monthly; a further 3.5 million access its social media and 20 million its website. This month, GB News has had more views than Sky News 48% of the time and more than BBC News 29% of the time. So why are some so determined to scupper a popular channel?
Even before its launch, a liberal NGO, Stop Funding Hate, lobbied advertisers to boycott the channel, using corporate cash as a tool for censorship. More recently, big-name media players all over X have constantly urged their followers to complain about GBN to Ofcom, seemingly keen to regulate the channel out of existence. A year ago, GBN comprised 1.3% of total broadcast complaints to Ofcom. Now it is 11.3%. That rise owes less to content than to politicised malice.
One complaint is the use of MPs as presenters. I am not sure how I feel about that, but some perspective is required. The channel has 30 main presenters who host their own shows, of whom only two are serving MPs, appearing collectively for five hours a week out of a total of 126. What is more, as the noble Lord, Lord Vaizey, explained, GBN did not invent the model: LBC has been doing it for years. Beyond David Lammy, in the past there have been LBC’s “Call Clegg”, “Ask Boris” and even “Phone Farage”.
I am not a cheerleader for GB News but a critical friend. Programmes such as Andrew Doyle’s “Free Speech Nation” and Michael Portillo’s culture show are the very best of UK public service broadcasting, but some shows are less to my taste. I am also a critical friend of all other broadcasters, such as the BBC; I have just been on “Politics Live”, but I have a love-hate relationship with much of the Beeb’s political output. We should not hold back from criticising channels when it is deserved, but that is not the same as trying to destroy them. I want a level regulatory playing field; otherwise, double standards might distort the focus of regulation.
In January, Jewish staff working for the BBC lodged formal complaints internally about anti-Semitism, including in coverage of the conflict. We have had BBC newsreaders ludicrously avoiding calling Hamas a terrorist organisation. As the noble Lord, Lord Pickles, noted in the Chamber on Tuesday, there are serious concerns about anti-Israel bias in the World Service Arabic division—never mind that one-man challenge to impartiality, Gary Lineker, who retweeted a bigoted demand that FIFA should ban the whole Israeli football team from international tournaments, with no consequences.
In contrast, the former BBC senior broadcast journalist Cath Walton recently wrote in the Critic about how BBC managers demanded that she delete a tweet criticising the term “cis women” within an hour of it being posted, followed by a lengthy disciplinary process in which her gender-critical views were treated as wrongthink. Ms Walton’s article was prompted by recent instances where the BBC’s lack of impartiality on sex and gender has led to audiences being seriously misled. Recently, BBC viewers were informed that men can breastfeed—spoiler: they cannot—with a non-binary-identifying expert alleging, unchallenged, that the hormone-induced discharge from a trans woman’s nipples is better for babies than a mother’s breast milk. What misogynistic claptrap. Where are BBC Verify and Ofcom when you need them?
Sometimes, in the name of impartiality, facts are described as opinions due to institutionalised ideological partisanship. The BBC recently upheld a complaint against Radio 4’s Justin Webb which ruled that he broke impartiality rules when explaining a story with the factually accurate remark,
“trans women, in other words males”.
Finally, there are the sins of omission. Why has the BBC been absent and silent in covering the scandal of the safeguarding risks associated with puberty blockers for the young? Now that NHS England has banned them for teens, the BBC commissioned its LGBTQ+ correspondent to tell the story, not the science reporters to discuss the medical scandals. I am glad to say that GB News has been following, covering and leading on this for years.
(8 months ago)
Lords Chamber
The Government consulted on decriminalisation of TV licence evasion in 2020, and we published our response in 2021. The appropriate time to make this decision is as part of the BBC funding model review, when we can look at how we can secure the sustainable funding for the corporation that everyone wants to see.
My Lords, the urgency with which people are suggesting this be looked at is only increased by the fact that we are going through a very serious financial time and it is likely that there will be more, not fewer, people in this difficult situation. First, can the Minister take note of what the original Question said—that there is a danger of this being abused by private companies that are incentivised to prosecute women in this situation? Secondly, we are spending a lot of time doing the Victims and Prisoners Bill—well, some of us are—and it seems so wrong to criminalise people who are vulnerable and victims. This has nothing to do with BBC bashing; it has everything to do with recognising that women are being discriminated against unfairly for something which, given the scale of the problems facing society, the Government should really just deal with now.
I hope I can reassure the noble Baroness that the number of women being prosecuted is falling. In the year ending June 2022, 35,000 women were prosecuted; in the year ending June 2023, 29,000 were. So the number is coming down, but the disparity between the sexes is indeed stark, with women still making up around three-quarters of people prosecuted. That is why we are glad the BBC has looked at this and has set out actions, and why we are looking at it as part of our future consideration of how the BBC should be funded.
(1 year, 3 months ago)
Lords Chamber
My Lords, as the noble Lords, Lord Stevenson and Lord Clement-Jones, have already said, the Communications and Digital Select Committee did indeed recommend a new Joint Committee of both Houses to look specifically at the various aspects of Ofcom’s implementation of what will be the Online Safety Act and the ongoing regulation of digital matters. It is something I still have a lot of sympathy for. However, there has not been much appetite for such a Joint Committee at the other end of the Corridor. I do not necessarily think we should give up on that, and I will come back to it in a moment, but in its place I am not keen on what is proposed in Amendment 239, because my fear is that, as laid out, it appears a bit too burdensome and would probably introduce too much delay in implementation.
To return to the bigger question, I think that we as parliamentarians need to reflect on our oversight of regulators, to which we are delegating significant new powers and which we are requiring to adopt a much more principles-based approach to regulation to cope with the fast pace of change in the technological world. We have to reflect on whether our current set-up is adequate for the way in which that is changing. What I have in mind is very much a strategic level of oversight, rather than scrutinising operational decisions, although, notwithstanding what the noble Lord has said, something specific in terms of implementation of the Bill and other new legislation is an area I would certainly wish to explore further.
The other aspect of this is making sure that our regulators keep pace not just with technology, applying the new powers we give them in a way which meets our original intentions, but also with the new political dynamics. Earlier today in your Lordships’ Chamber, there was a Question about how banks are dealing with political issues, and that raises questions about how the FCA is regulating the banking community. We must not forget that the Bill is about regulating content, and that makes it ever more sensitive. We need to keep reminding ourselves about this; it is very new and very different.
As has been acknowledged, there will continue to be a role for the Communications and Digital Select Committee, which I have the great privilege of chairing, in overseeing Ofcom. My noble friend Lord Grade and Dame Melanie Dawes appeared before us only a week ago. There is a role for the SIT Committee in the Commons; there is probably also some kind of ongoing role for the DCMS Select Committee, although I am not sure. In a way, the fractured nature of that oversight makes it all the more critical that we join up a bit more. So I will take it upon myself to give this more thought and speak to the respective chairs of those committees in the other place, but I think that at some point we will need to consider, in some other fora, the way in which we are overseeing the work of regulators.
At some point, I think we will need to address the specific recommendations in the pre-legislative committee’s report, which were very much in line with what my own committee thought was right for the future of digital regulatory oversight, but on this occasion, I will not be supporting the specifics of Amendment 239.
My Lords, very briefly, I was pleased to see this, in whatever form it takes, because one thing that has come up consistently as we finish off the Bill is the problem of potential unintended consequences: some of us have asked whether age gating will lead to a huge invasion of the privacy of adults rather than just narrowly protecting children, or whether the powers given to Ofcom will turn it into the most important and powerful regulator in the country, if not in Europe. In a highly complex Bill, is it possible for us to keep our eye on this a bit more than just by whingeing on the sidelines?
The noble Baroness, Lady Stowell, makes a very important point about the issue in relation to the FCA and banking. Nobody intended that to be the outcome of the rules on politically exposed persons, or PEPs, for example, and nobody intended, when they suggested encouraging banks to have values such as ESG or EDI—equality, diversity and inclusion—that it would lead to ordinary citizens of this country being threatened with having their banking turned off. It is too late then to say retrospectively, “That wasn’t what we ever intended”.
My Lords, I have put my name to and support Amendment 255, laid by the noble Lord, Lord Moylan, which straightforwardly means that a notice may not impose a requirement relating to a service that would require the provider to weaken or remove end-to-end encryption. It is very clear. I understand that the other amendments introduce safeguards, which is better than nothing. It is not what I would like, but I will support them if they are pushed to a vote. I think that the Government should seriously consider not going anywhere near getting rid of encryption in this Bill, and should reconsider it by the time we get to Third Reading.
As the noble Lord, Lord Moylan, explained, this is becoming widely known about now, and it is causing some concern. If passed, this Bill, as it is at the moment, gives Ofcom far-reaching powers to force services, such as WhatsApp, to install software that would scan all our private messages to see whether there is evidence of terrorism or child sexual exploitation and abuse content and would automatically send a report to third parties, such as law enforcement, if it suspects wrongdoing—all without the consent or control of the sender or the intended recipient.
I would just like to state that encryption is a wonderful innovation. That is why more than 40 million people in the UK use it every day. It ensures that our private messages cannot be viewed, compromised or altered by anyone else, not even the providers of chat services. It would really require somebody handing them over to a journalist and saying, “You can have my WhatsApp messages”, for anyone else to read them; beyond that, you cannot read them.
One of the interesting things that we have discussed throughout the passage of the Bill is technologies, their design and functionality, and making sure they are not harmful. Ironically, it is the design and function of encryption that actually helps to keep us safe online. That is why so many people talk about civil libertarians, journalists and brave dissenters using it. For the rest of us, it is a tool to protect our data and private communications in the digital realm. I just want to note the irony that the technologies being proposed for client-side scanning are the technologies that are potentially harmful because, as people have noted, they are the equivalent of putting video cameras in our homes to listen in to every conversation and send reports to the police if we discuss illicit topics. As I have said before, while child sexual abuse is horrendous and vile, we know that it happens largely in the home and, as yet, the Government have not advocated that we film in everybody’s home in order to stop child sexual abuse. We should do almost anything and everything that we can, but I think this is the wrong answer.
Focusing on encryption just makes no sense. The Government have made exemptions, recognising the importance, in a democracy, of private correspondence: so exempted in the Bill are text messages, MSN, Zoom, oral commentaries and email. It seems perverse to demonise encryption in this way. I also note that there are exemptions for anything sent on message apps by law enforcement or public sector or emergency responders. I appreciate that some government communications are said to be done over apps such as WhatsApp. It seems then that the target of this part of the Bill is UK private citizens and residents and that the public are seen as the people who must be spied on.
In consequence, I do not think it surprising that more than 80 national or international civil society organisations have said that this would make the UK the first liberal democracy to require the routine scanning of people’s private chat messages. What does the Minister say to the legal opinion from the technology barrister Matthew Ryder KC, commissioned by Index on Censorship precisely on this part of the Bill? He compares this to law enforcement wiretapping without a warrant and says that the Bill will grant Ofcom a wider remit of surveillance powers over the public than GCHQ has.
Even if the Minister is not interested in lawyers or civil libertarians, surely we should be listening to the advice of science and technology experts in relation to complex technological solutions. Which experts have been consulted? I noted that Matthew Hodgson, the boss of the encrypted messaging app Element, has said wryly that
“the Government has not consulted with UK tech firms, only with huge multinational corporations and companies that want to sell software that scans messages, who are unsurprisingly telling lawmakers that it is possible to scan messages without breaking encryption”.
The problem is that it is not possible to scan those messages without breaking encryption. It is actually misinformation to say that. That is why whole swathes of leading scientists and technologists from across the globe have recently put out an open letter explaining why and how it is not true. They explained that it creates really dangerous side-effects that can be harmful in the way that the noble Lord, Lord Moylan, explained, in terms of security, and makes the online world less safe for many of us. Existing scanning technologies are flawed and ineffective and scanning will nullify the purpose of encryption. I refer noble Lords to the work of the Internet Society and the academic paper Bugs in Our Pockets: The Risks of Client-Side Scanning for more details on all the peer-reviewed work.
I understand that, given the horrific nature of child sexual abuse—and, of course, terrorism, but I shall concentrate on child sexual abuse because the Bill is so concerned with it—it can be tempting for the Government to hope that there is a technological silver bullet to eradicate it. But the evidence suggests otherwise. One warning from scientists is that scanning billions of pieces of content could lead to millions of false positives and that could not only frame innocent users but could mean that the police become overwhelmed, diverting valuable resources away from real investigations into child sexual abuse.
A study by the Max Planck Institute for the Study of Crime of a similar German law, in force from 2008 to 2010, found that the German police having access to huge amounts of data did not have any deterrent effect, did not assist in clearing up crimes or increase convictions, but did waste a lot of police time. So it is important that this draconian invasion of privacy is not presented as necessary for protecting children. I share the exasperation of Signal’s president Meredith Whittaker, who challenged the Secretary of State and pointed out that there are some double standards here: for example, slashing early intervention programmes over the past decade did not help protect children, and chronically underfunding and underresourcing child social care does not help.
My own bugbear is that when I, having talked to social workers and colleagues, raised the dangers to child protection when we closed down schools in lockdown, those concerns were brushed to one side. When I and others raised the horrors of the young girls who had been systematically raped by grooming gangs, and whom the authorities had ignored for many, many years, I was told to stop talking about it. There are real threats to children that we ignore. I do not want us in this instance to use that very emotive discussion to attack privacy.
I also want to stress that there is no complacency here. Law enforcement agencies in the UK already possess a wide range of powers to seize devices and compel passwords and even covertly to monitor and hack accounts to identify criminal activity. That is good. Crucially, private messaging services can and do—I am sure they could do more—work in a wide range of ways to tackle abuse and keep people safe without the need to scan or read people’s messages.
To answer my noble friend Lady Stowell first, it depends on the type of service. It is difficult to give a short answer that covers the range of services that we want to ensure are covered here, but we are seeking to keep this and all other parts of the Bill technology neutral so that, as services develop, technology changes and criminals, unfortunately, seek to exploit that, technology companies can continue to innovate to keep children safe while protecting the privacy of their users. That is a long-winded answer to my noble friend’s short question, but necessarily so. Ofcom will need to make its assessments on a case-by-case basis and can require a company to use its best endeavours to innovate if no effective and accurate technology is currently available.
While I am directing my remarks towards my noble friend, I will also answer a question she raised earlier on general monitoring. General monitoring is not a legally defined concept in UK law; it is a term in European Union law that refers to the generalised monitoring of user activity online, although its parameters are not clearly defined. The use of automated technologies is already fundamental to how many companies protect their users from the most abhorrent harms, including child sexual abuse. It is therefore important that we empower Ofcom to require the use of such technology where it is necessary and proportionate and ensure that the use of these tools is transparent and properly regulated, with clear and appropriate safeguards in place for users’ rights. The UK’s existing intermediary liability regime remains in place.
Amendment 255 from my noble friend Lord Moylan seeks to prevent Ofcom imposing any requirement in a notice that would weaken or remove end-to-end encryption. He is right that end-to-end encryption should not be weakened or removed. The powers in the Bill will not do that. These powers are underpinned by proportionality and technical feasibility; if it is not proportionate or technically feasible for companies to identify child sexual exploitation and abuse content on their platform while upholding users’ right to privacy, Ofcom cannot require it.
I agree with my noble friend and the noble Baroness, Lady Fox, that encryption is a very important and popular feature today. However, with technology evolving at a rapid rate, we cannot accept amendments that would risk this legislation quickly becoming out of date. Naming encryption in the Bill would risk that happening. We firmly believe that the best approach is to focus on strong safeguards for upholding users’ rights and ensuring that measures are proportionate to the specific situation, rather than on general features such as encryption.
The Bill already requires Ofcom to consider the risk that technology could result in a breach of any statutory provision or rule of law concerning privacy and whether any alternative measures would significantly reduce the amount of illegal content on a service. As I have said in previous debates, Ofcom is also bound by the Human Rights Act not to act inconsistently with users’ rights.
Will the Minister write to noble Lords who have been here in Committee and on Report in response to the fact that it is not just encryption companies saying that the demands of this clause will lead to the breaching of encryption, even though the Minister and the Government keep saying that it will not? As I have indicated, a wide range of scientists and technologists are saying that, whatever is said, demanding that Ofcom insists that technology notices are used in this way will inadvertently lead to the breaking of encryption. It would be useful if the Government at least explained scientifically and technologically why those experts are wrong and they are right.
I am very happy to put in writing what I have said from the Dispatch Box. The noble Baroness may find that it is the same, but I will happily set it out in further detail.
I should make it clear that the Bill does not permit law enforcement agencies to access information held on platforms, including access to private channels. The National Crime Agency will be responsible for receiving reports from in-scope services via secure transmission, processing these reports and, where appropriate, disseminating them to other UK law enforcement bodies and our international counterparts. The National Crime Agency will process only information provided to it by the company; where it determines that the content is child sexual abuse content and meets the threshold for criminality, it can request further information from the company using existing powers.
I am glad to hear that my noble friend Lord Moylan does not intend to divide on his amendment. The restrictions it sets out are not ones we should impose on the Bill.
Amendments 256, 257 and 259 in the name of the noble Lord, Lord Stevenson of Balmacara, would require a notice to be approved by a judicial commissioner appointed under the Investigatory Powers Act 2016 and would remove Ofcom’s power to require companies to use best endeavours to develop or source new technology to address child sexual exploitation and abuse content.
My Lords, I just want to make some brief comments in support of the principle of what the noble Lord, Lord Knight, is aiming at in this amendment.
The Bill is going to have a profound impact on children in the United Kingdom. We hope that the most profound impact will be that it will significantly advance their interests in terms of safety online. But it will also potentially have a significant impact on what they can access online and the functionality of different services. They are going to experience new forms of age assurance, about which they may have very strong views. For example, the use of their biometric data to estimate their age will be there to protect them, but they may still have strong views about that.
I have said many times that there may be some measures in the Bill that will encourage services to become 18-plus only. That is not adult in the sense of adult content. Ordinary user-to-user social media services may look at the obligations and say, “Frankly, we would much rather restrict ourselves to users from the UK who identify as being 18-plus, rather than have to take on board all the associated liabilities in respect of children”—not because they are irresponsible, but precisely because they are responsible, and they can see that there is a lot of work to do in order to be legally and safely available to those under 18. For all those reasons, it is really important that the child advocacy body looks at things such as the United Nations Convention on the Rights of the Child and the rights of children to access information, and that it is able to take a view on them.
The reason I think that is important—as any politician who has been out and spoken in schools will know—is that very often children are surprising in terms of what they see as their priorities. We make assumptions about their priorities, which can often be entirely wrong. There has been some really good work done on this. There was a project called EU Kids Online, back in the days of the EU, which used to look at children right across the European Union and ask them what their experience of being online was like and what was important to them. There are groups such as Childnet International, which for years has been convening groups of children and taking them to places such as the Internet Governance Forum. That always generates a lot of information that we here would not have thought of, about what children feel is really important to them about their online experience.
For all those reasons, it really would be helpful to institutionalise this in the new regime as some kind of body that looks in the round at children’s interests—their interest in staying safe, but also their interest in being able to access a wide variety of online services and to use the internet as they want to use it. I hope that that strengthens the case the noble Lord, Lord Knight, has made for such a body to exist in some kind of coalition-like format.
My Lords, I am afraid that I have some reservations about this amendment. I was trying not to, but I have. The point that the noble Lord, Lord Allan of Hallam, made about the importance of listening to young people is essential—in general, not being dictated to by them, but understanding the particular ways that they live their lives; the lived experience, to use the jargon. Particularly in relation to a Bill that spends its whole time saying it is designed to protect young people from harm, it might be worth having a word with them and seeing what they say. I mean in an ongoing way—I am not being glib. That seems very sensible.
I suppose my concern is that this becomes a quango. We have to ask who is on it, whether it becomes just another NGO of some kind. I am always concerned about these kinds of organisations when they speak “on behalf of”. If you have an advocacy body for children that says, “We speak on behalf of children”, that makes me very anxious. You can see that that can be a politically very powerful role, because it seems to have the authority of representing the young, whereas actually it can be entirely fictitious and certainly not democratic or accountable.
The key thing we discussed in Committee, which the noble Lord, Lord Knight of Weymouth, is very keen on—and I am too—is that we do not inadvertently deny young people important access rights to the internet in our attempt to protect them. That is why some of these points are here. The noble Baroness, Lady Kidron, was very keen on that. She wants to protect them but does not want to end up with them being denied access to important parts of the internet. That is all good, but I just think this body is wrong.
The only other thing to draw noble Lords’ attention to—I am not trying to be controversial, but it is worth noting—is that child advocacy is currently in a very toxic state because of some of the issues around who represents children. As we speak, there is a debate about, for example, whether the NSPCC has been captured by Stonewall. I make no comment because I do not know; I am just noting it. We have had situations where a child advocacy group such as Mermaids is now discredited because it is seen to have been promoting chest binders for young people, to have gone down the gender ideology route, which some people would argue is child abuse of a sort, advocating that young women remove their breasts—have double mastectomies. This is all online, by the way.
I know that some people would say, “Oh, you’re always going on about that”, but I raise it because it is a very real and current discussion. I know a lot of people who work in education, with young people or in children’s rights organisations, and they keep telling me that they are tearing themselves apart. I just wondered whether the noble Lord, Lord Knight, might note that there is a danger of walking into a minefield here—which I know he does not mean to walk into—by setting up an organisation that could end up being the subject of major culture wars rows or, even worse, one of those dreaded quangos that pretends it is representing people but does not.
(1 year, 3 months ago)
Lords Chamber
My Lords, the noble Lord, Lord Allan of Hallam, hinted at the fact that there has been a plethora of government amendments on Report and, to be honest, it has been quite hard fully to digest most of them, let alone scrutinise them. I appreciate that the vast majority have been drawn up with opposition Lords, who might have found it a bit easier. But some have snuck in and, in that context, I want to raise some problems with the amendments in this group, which are important. I, too, am especially worried about the government amendment on facilitating remote access to services and to equipment used by services. I am really grateful to the noble Lords, Lord Allan of Hallam and Lord Clement-Jones, for tabling Amendment 247B, because I did not know what to do—and they did it. At least it raises the issue to the level of it needing to be taken seriously.
The biggest problem that I had when I originally read this provision was that facilitating remote access to services, and to as yet undefined equipment used by a service, seems like a very big decision, and potentially disproportionate. It certainly has the potential for regulatory overreach, and it creates real risks around privacy. It feels as though it has not even been flagged up strongly enough by the Government with regard to what it could mean.
I listened to what the Minister said, but I still do not fully understand why this is necessary. Have the Government considered the privacy and security implications that have already been discussed? Through Amendment 252A, the Government now have the power to enter premises for inspection—it rather feels as if there is the potential for raids, but I will put that to one side. They can go in, order an audit and so on. Remote access as a preliminary way to gather information seems heavy-handed. Why not leave it as the very last thing to do in a dialogue between Ofcom and a given platform? We have yet to hear a proper justification of why Ofcom would need this as a first-order thing to do.
The Bill does not define exactly what
“equipment used by the service”
means. Does it extend to employees’ laptops and phones? If it extends to external data centres, have the Government assessed the practicalities of that and the substantial security implications, as have been explained, for the services, the data centre providers and those of us whose data they hold?
I am also concerned that this will force companies to look very hard at their internal privacy and security controls to deal with the possibility of this, and that it will place a disproportionate burden on smaller and mid-sized businesses that do not have the resources available to the biggest technology companies. I keep raising this because in other parts of government there is a constant attempt to say that the UK will be the centre of technological innovation and that we will be a powerhouse in new technologies, yet I am concerned that so much of the Bill could damage that innovation. That is worth considering.
It seems to me that Amendment 252A on the power to observe at the premises ignores decentralised projects and services—the very kind of services that can revolutionise social media in a positive way. Not every service is like Facebook, but this amendment misses that point. For example, you will not be able to walk into the premises of the UK-based Matrix, the provider of the encrypted chat service Element that allows users to host their own service. Similarly, the non-profit Mastodon claims to be the largest decentralised social network on the internet and to be built on open-web standards precisely because it does not want to be bought or owned by a billionaire. So many of these amendments seem not to take those issues into account.
I also have a slight concern about researcher access to data. When we discussed this in Committee, the tone was very much—as it is in these amendments now—that these special researchers need to be able to find out what is going on in these big, bad tech companies that are trying to hide away dangerous information from us. Although we are trying to ensure that there is protection from harms, we do not want to demonise the companies so much that, every time they raise privacy issues or say, “We will provide data but you can’t access it remotely” or “We want to be the ones deciding which researchers are allowed to look at our data”, we assume that they are always up to no good. That sends the wrong message if we are to be a tech-innovative country or if there is to be any working together.
Before the Minister sits down, to quote the way the Minister has operated throughout Report, there is consensus across the House that there are some concerns. The reason why there are concerns outside and inside the House on this particular amendment is that it is not entirely clear that those protections exist, and there are worries. I ask the Minister whether, rather than just writing, it would be possible to take this back to the department, table a late amendment and say, “Look again”. That has been done before. It is certainly not too late: if it was not too late to have this amendment then it is certainly not too late to take it away again and to adopt another amendment that gives some safeguarding. Seriously, it is worth looking again.
I had not quite finished; the noble Baroness was quick to catch me before I sat down. I still have some way to go, but I will certainly take on board all the points that have been made on this group.
The noble Lord, Lord Knight, asked about Schedule 12. I will happily write with further information on that, but Schedule 12 is about UK premises, so it is probably not the appropriate place to deal with this, as we need to be able to access services in other countries. If there is a serious security risk then it would not necessarily be proportionate. I will write to him with further details.
My Lords, I want to look at how the Government’s expansion of Ofcom’s duties to prioritise media literacy has become linked to this group, and at the way in which Amendment 274B does this. It is very much linked with misinformation and disinformation. According to the amendment, there has to be an attempt to establish
“accuracy and authenticity of content”
and to
“understand the nature and impact of disinformation and misinformation, and reduce their and others’ exposure to it”.
I was wondering about reducing users’ exposure to misinformation and disinformation. That gives me pause, because I worry that reducing exposure will obviously mean the removal or censorship of material. I just want to probe some assumptions. Is it the presumption that incorrect or seemingly untrue or erroneous information is the predominant cause of real harm if it is not suppressed? Is there not a risk of harm in suppressing ideas too? Apart from the fact that heretical scientific and political theories were historically seen as misinformation and now are conventional wisdom, is there a danger that suppression in the contemporary period would create mistrust and encourage conspiratorial thinking—people saying, “What have you got to hide?”—and so on?
I want to push this by probing Amendment 269AA in the name of the noble Lord, Lord Clement-Jones, which itself is a probing amendment as to why Ofcom’s misinformation and disinformation committee is not required to consider the provenance of information to help empower users to understand whether content is real or true and so on, rather than the wording at the moment, “accuracy and authenticity”. When I saw the word “provenance”, I stopped for a moment. In all the debates going on in society about misinformation and disinformation, excellent provenance cannot necessarily guarantee truth.
I was shocked to discover that the then Wellcome Trust director, Jeremy Farrar, who is now the chief scientist at the World Health Organization, claimed that the Wuhan lab leak and the manmade theories around Covid were highly improbable. We now know that there were emails from Jeremy Farrar—I was shocked because I am a great fan of the Wellcome Trust and Jeremy Farrar’s work in general—in which there was a conscious bending of the truth that led to the editing of a scientific paper and a letter in the Lancet that proved to have been spun in a way to give wrong information. When issues such as the Wuhan lab leak were raised by Matt Ridley, recently of this parish—I do not know whether his provenance would count—they were dismissed as some kind of racist conspiracy theory. I am just not sure that it is that clear that you can get provenance right. We know from the Twitter files that the Biden Administration leaned on social media companies to suppress the Hunter Biden laptop story that was in the New York Post, which was described as Russian disinformation. We now know that it was true.
Therefore, I am concerned that, in attempting to be well-meaning, this amendment, which says we should have better media literacy, does not end up giving in to these lazy labels of disinformation and misinformation, as if we all know what the truth is and all we need is fact-checkers, provenance and authenticity. Disinformation and misinformation have been weaponised, which can cause some serious problems.
Can the Minister clarify whether the clause on media literacy is a genuine, positive attempt at encouraging people to know more, or will itself become part of an information war that is going on offline and which will not help users at all but only confuse things?
My Lords, I will add to my noble friend’s call for us to consider whether Clause 158 should be struck from the Bill as an unnecessary power for the Secretary of State to take. We have discussed powers for the Secretary of State throughout the Bill, with some helpful improvements led by the noble Baroness, Lady Stowell. This one jars in particular because it is about media literacy; some of the other powers related to whether the Secretary of State could intervene on the codes of practice that Ofcom would issue. The core question is whether we trust Ofcom’s discretion in delivering media literacy and whether we need the Secretary of State to have any kind of power to intervene.
I single out media literacy because the clue is in the name: literacy is a generic skill that you acquire about dealing with the online world; it is not about any specific text. Literacy is a broader set of skills, yet Clause 158 has a suggestion that, in response to specific forms of content or a specific crisis happening in the world, the Secretary of State would want to take this power to direct the media literacy efforts. To take something specific and immediate to direct something that is generic and long-term jars and seems inappropriate.
I have a series of questions for the Minister to elucidate why this power should exist at all. It would be helpful to have an example of what kind of “public statement notice”—to use the language in the clause—the Government might want to issue that Ofcom would not come up with on its own. Part of the argument we have been presented with is that, somehow, the Government might have additional information, but it seems quite a stretch that they could come up with that. In an area such as national security, my experience has been that companies often have a better idea of what is going on than anybody in government.
Thousands of people out there in the industry are familiar with APT 28 and APT 29 which, as I am sure all noble Lords know, are better known by their names Fancy Bear and Cozy Bear. These are agents of the Russian state that put out misinformation. There is nothing that UK agencies or the Secretary of State might know about them that is not already widely known. I remember talking about the famous troll factory run by Prigozhin, the Internet Research Agency, with people in government in the context of Russian interference—they would say “Who?” and have to go off and find out. In dealing with threats such as these, you certainly want the people in the companies and Ofcom to produce a media literacy campaign which tells you about these troll agencies and how they operate and gives warnings to the public, but I struggle to see why you need the Secretary of State to intervene as opposed to allowing Ofcom’s experts to work with company experts and come up with a strategy to deal with those kinds of threat.
The other example cited of an area where the Secretary of State might want to intervene is public health and safety. It would be helpful to be specific; had they had it, how would the Government have used this power during the pandemic in 2020 and 2021? Does the Minister have examples of what they were frustrated about and would have done with these powers that Ofcom would not do anyway in working with the companies directly? I do not see that they would have had secret information which would have meant that they had to intervene rather than trusting Ofcom and the companies to do it.
Perhaps there has been an interdepartmental workshop between DHSC, DCMS and others to cook up this provision. I assume that Clause 158 did not come from nowhere. Someone must have thought, “We need these powers in Clause 158 because we were missing them previously”. Are there specific examples of media literacy campaigns that could not be run, where people in government were frustrated and therefore wanted a power to offer it in future? It would be really helpful to hear about them so that we can understand exactly how the Clause 158 powers will be used before we allow this additional power on to the statute book.
In the view of most people in this Chamber, the Bill as a whole quite rightly grants the Government and Ofcom, the independent regulator, a wide range of powers. Here we are looking specifically at where the Government will, in a sense, overrule the independent regulator by giving it orders to do something it had not thought of doing itself. It is incumbent on the Government to flesh that out with some concrete examples so that we can understand why they need this power. At the moment, as noble Lords may be able to tell, these Benches are not convinced that they do.
My Lords, I will be very brief. The danger with Clause 158 is that it discredits media literacy as something benign or anodyne; it will become a political plaything. I am already sceptical, but if ever there was anything to add to this debate then it is that.
We established in our last debate that the notion of a recognised news publisher will go much broader than a broadcaster. I put it to the Minister that we could end up in an interesting situation where one bit of the Bill says, “You have to protect content from these people because they are recognised news publishers”. Another bit, however, will allow the Secretary of State to say that, to deal with this crisis, we are going to give a media literacy direction that says, “Please get rid of all the content from this same news publisher”. That is an anomaly that we risk setting up with these different provisions.
On the previous group, I raised the issue of legal speech that was labelled as misinformation and removed in the extreme situation of a public health panic. This was seemingly because the Government were keen that particular public health information was made available. Subsequently, we discovered that those things were not necessarily untrue and should not have been removed. Is the Minister arguing that this power is necessary for the Government to direct that certain things are removed on the basis that they are misinformation—in which case, that is a direct attempt at censorship? After we have had a public health emergency in which “facts” have been contested and shown to not be as black and white or true as the Government claimed, saying that the power will be used only in extreme circumstances does not fill me with great confidence.
I am happy to make it clear, as I did on the last group, that the power does not allow Ofcom to require platforms to remove content, only to set out what they are doing in response to misinformation and disinformation—to require platforms to make a public statement about what they are doing to tackle it. In relation to regulating news providers, we have brought the further amendments forward to ensure that those subject to sanctions cannot avail themselves of the special provisions in the Bill. Of course, the Secretary of State will be mindful of the law when issuing directions in the exceptional circumstances that these clauses set out.
My Lords, I put my name to this very important amendment—all the more important because of the previous discussions we have had about the difficulties around misinformation or potential government interference in decisions about what is online and what is not online. The noble Lord, Lord Moylan, is right to indicate that this is a very modest and moderate amendment; it addresses the problems of government influence or government moderation, or at least allows those of us who are concerned about it to keep our eye on it and make sure that the country and Parliament know what is going on.
The original idea of disinformation came from an absolutely rightful concern about foreign disinformation between states. People were rightly concerned about security; we all should be, and nobody wants to be taken in in that way. But there has been a worry, in a wide range of countries, about agencies designed to combat those threats increasingly turning inward against the public. Although that might not be exactly what has happened in the UK, we should note that Meta CEO Mark Zuckerberg recently admitted that the US Government asked Facebook to suppress true information. In a recent interview, he said that the scientific establishment
“asked for a bunch of things to be censored that, in retrospect, ended up being more debatable or true”.
We should all be concerned about this. It is not just a matter for those of us who are worried about free speech or raise the issue. If we are genuinely worried about misinformation or fake news, we have to make sure that we are equally concerned if it comes from other sources, not just from malign players.
The noble Lord, Lord Moylan, mentioned the American court case Missouri v Biden. In his 155-page ruling, Judge Doughty depicted quite a dystopian scene when he said that, during the pandemic, the US Government seem
“to have assumed a role similar to an Orwellian ‘Ministry of Truth’”.
I do not think we want to emulate the worst of what is happening in the US here.
The judge there outlined a huge complex of government agencies and officials connected with big tech and an army of bureaucrats hired to monitor websites and flag and remove problematic posts. It is not like that in the UK, but some of us were quite taken aback to discover that the Government ran a counter-disinformation policy forum during the lockdown, which brought tech giants together to discuss how to deal with Covid misinformation, as it was said. There was a worry about political interference then.
I do not think that this is just paranoia. Since then, Big Brother Watch’s investigative work has shown that the UK Government had a secret unit that worked with social media companies to monitor and prevent speech critical of Covid lockdown policies, in the shape of the Counter Disinformation Unit, which was set up by Ministers to deal with groups and individuals who criticised policies such as lockdowns, school closures, vaccine mandates or what have you.
Like the noble Lord, Lord Moylan, I do not want to get stuck on what happened during lockdown. That was an exceptional, extreme situation. None the less, the Counter Disinformation Unit—which works out of the Minister’s own department, the DCMS—is still operating. It seems to be able to get content fast-tracked for possible moderation by social media firms such as Facebook and Twitter. It used an AI firm to search social media posts—we need to know the details of that.
I think, therefore, that the transparency which the Government and the Minister have constantly stressed is hugely important for the credibility of the Bill, and that means transparency about the likes of the Counter Disinformation Unit and any government attempts at interfering in what we are allowed to see, read or have access to online.
Before the Minister sits down, I think that it is entirely appropriate for him to say—I have heard it before—“Oh no, nothing was taken down. None of this is believable. No individuals were targeted”. However, that is not the evidence I have seen, and it might well be that I have been shown misinformation. But that is why the Minister has to acknowledge that one of the problems here is that indicated by Full Fact—which, as we know, is often endorsed by government Ministers as fact-checkers. It says that because the Government are avoiding any scrutiny for this unit, it cannot know. It becomes a “he said, she said” situation. I am afraid that, because of the broader context, it would make the Minister’s life easier, and be clearer to the public—who are, after all, worried about this—if he accepted the ideas in the amendment of the noble Lord, Lord Moylan. We would then be clear and it would be out in the open. If the FOIs and so on that have been constantly put forward were answered, would that not clear it up?
I have addressed the points made by the noble Baroness and my noble friend already. She asks the same question again and I can give her the same answer. We are operating openly and transparently here, and the Bill sets out further provisions for transparency and accountability.
My Lords, it is all quite exciting now, is it not? I can say “hear, hear!” a lot; everyone is talking about freedom of expression. I cannot tell noble Lords how relieved and pleased I was both to hear the speeches and to see Amendment 228 from the noble Lord, Lord Allan of Hallam, and the noble Viscount, Lord Colville of Culross, who both explained well why this is so important. I am so glad that, even late in our discussions on Report, it has returned as an important issue.
We have already discussed how in many cases, especially when it comes to what is seen as illegal speech, decisions about illegality are very complicated. They are complicated in the law courts and offline, and that is when they have the full power of lawyers, the criminal justice system and so on trying to make decisions. Leaving it up to people who, through no fault of their own, are not qualified but who work in a social media company to try to make that decision in a climate of quite onerous obligations—and having phrases such as “reasonable grounds to infer”—will lead to lawful expression being overmoderated. Ultimately, online platforms will use an abundance of caution, which will lead to a lot of important speech—perfectly lawful if not worthy speech; the public’s speech and the ability to speak freely—being removed. That is not a trivial side issue; it will discredit the Bill, if it has not done so already.
Whenever noble Lords make contributions about why a wide range of amendments and changes are needed—particularly in relation to protecting children, harm and so on—they constantly tell us that the Bill should send an uncompromising message. The difficulty I have is with the danger that the Bill will send an uncompromising message that freedom of expression is not important. I urge the Minister to look carefully at the amendment, because the message should be that, while the Bill is trying to tackle online harm and to protect children in particular—which I have never argued against—huge swathes of it might inadvertently silence people and deprive them of the right to information that they should be able to have.
My Amendment 229—I am not sure why it is in this group, but that is nothing new in the way that the groupings have worked—is about lawful speech and about what content is filtered by users. I have already argued for the replacement of the old legal but harmful duty, and the new duty of user empowerment is welcome: at face value, it puts users in the driving seat and allows adults to judge for themselves what they want and do not want to see. But—and it is a large but—that will work only if users and providers agree about when content should be filtered and what content is filtered.
As with all decisions on speech, as I have just mentioned, in the context particularly of a heightened climate of confusion and sensitivity regarding identity politics and the cancel-culture issues that we are all familiar with, there are some problems with the way that things stand in the Bill. I hope I am using the term “reasonable grounds to infer” in a better way than it is used in terms of illegality. My amendment specifies that companies need to have reasonable grounds to infer that content is abusive or inciting hatred when filtering out content in those user empowerment tools. Where a user chooses to filter out hateful content based on race, on being a woman or whatever, it should catch only content that genuinely falls under those headings. There is a risk that, without this amendment, technologies or individuals working for companies could operate in a heavy-handed way in filtering out legitimate content.
I shall give a couple of examples. Say that someone chooses to filter out abusive content targeting the protected characteristic of race. I imagine that they would have a reasonable expectation that that filter would target aggressive, unpleasant content demeaning to a person because of their race, but does the provider agree? Will it interpret my filtering choice in the most restrictive way possible, in a bid to protect my safety, or treat my sensibilities as having a low threshold for what it might consider to be abuse?
The race issue illustrates where we get into difficulties. Will the filterers take their cue from the document that has just been revealed—compiled by the Diocese of St Edmundsbury and Ipswich, released by the anti-racist campaigning group Don’t Divide Us, and in use in 87 schools? Under the heading of racism, it says that “passive racism” includes agreeing that
“There are two sides to every story”,
denying white privilege or starting a sentence with “Not all white people”. “Veiled racism” in this document—which, as I say, is being used in schools by the Church of England—includes a “Euro-centric curriculum” and “cultural appropriation”. “Racist discrimination” includes “anti-immigration policies”, which, as I pointed out before, would mean that some people would call the Government’s own Bill tonight racist.
The reason why I mention that is that you might think, “I am going to have racism filtered out”, but if there is too much caution then you will have filtered out perfectly legitimate discussions of immigration and cultural appropriation. You will be protected, but if, for example, the filterer follows certain universities that have deemed the novels of Walter Scott, the plays of William Shakespeare or the writing of Enid Blyton racist, then you can see that we have some real problems. When universities have said that there is misogynistic bullying and sexual abuse in “The Great Gatsby” and Ovid’s “Metamorphoses”, I just want to make sure that we do not end up in a situation where there is oversensitivity by the filterers. Perhaps the filtering will take place by algorithm, machine learning and artificial intelligence, but the EHRC has noted that algorithms simply cannot cope with the context, cultural difference and complexity of language within the billions of items of content produced every day.
Amendment 229 ensures that there is a common standard—a standard of objective reasonableness. It is not perfect at all; I understand that reasonableness itself is open to interpretation. However, it is an attempt to ensure that the Government’s concept of user empowerment is feasible, by at least aspiring to a basic shared understanding between users and providers as to what will be filtered and what will not, and by acting as a check against providers’ filter mechanisms removing controversial or unpopular content in the name of protecting users. Just as I said about sending a message, if the Government could indicate to the companies that they must bear freedom of expression in mind and not be oversensitive, risk-averse or overcautious, we might begin to get some balance. Otherwise, an awful lot of lawful material that is not even harmful will be removed.
My Lords, I support Amendment 228. I spoke on this issue on the longer amendment in Committee. To decide whether something is illegal at high volume and high speed, without the entire apparatus of the justice system—in which a great deal of care is taken to decide whether something is illegal—is very worrying. It strikes me as amusing that someone commented earlier that they like a “must” instead of a “maybe”. In this case, where the Bill says that a provider should treat the content as content of the kind in question accordingly, I caution that something a little softer is needed, not a cliff edge that ends up in horrors around illegality, where someone who has acted in self-defence is accused of a crime of violence, as happens to many women, and so on and so forth. I do not want to labour the point. I just urge a gentle landing rather than, as it is written, a cliff edge.
(1 year, 3 months ago)
Lords Chamber
My Lords, a lot of positive and interesting things have been said that I am sympathetic to, but this group of amendments raises concerns about a democratic deficit: if too much of the Bill is either delegated to the Secretary of State or open to interference in relation to the Secretary of State and Ofcom, who decides what those priorities are? I will ask for a couple of points of clarification.
I am glad to see that the term “public policy” has been replaced, because what did that mean? Everything. But I am not convinced that saying that the Secretary of State can decide not just on national security but on public safety and public health is reassuring in the present circumstances. The noble Lord, Lord Allan, has just pointed out what it feels like to be leaned on. We had a very recent example internationally of Governments leaning on big tech companies in relation to Covid policies, lockdowns and so on, and removing material that was seen to contradict official public health advice—often public health advice that turned out not to be accurate at all. There should at least have been a lot more debate about what were political responses to a terrible virus. Noble Lords will know that censorship became a matter of course during that time, and Governments interfering in or leaning on big tech directly was problematic. I am not reassured that the Government hold to themselves the ability to lean on Ofcom around those issues.
It is also worth remembering that the Secretary of State already has a huge amount of power to designate, as we have discussed previously. They can designate what constitute priority illegal offences and priority content harmful to children, and that can all change beyond what we have discussed here. We have already seen a constant expansion of what those harms can be, and having those lists changed using only secondary legislation, without accountability to Parliament or public scrutiny, really worries me. It is likely to give a green light to every identity group and special-interest NGO to demand that the list of priority harms and so on be expanded, and to make the Secretary of State’s job of resisting “something must be done” moral panics all the more difficult. If that is going to happen, we should have parliamentary scrutiny of it; it cannot just be allowed to happen elsewhere.
It is ironic that the Secretary of State is more democratic, because they are elected, than an unelected regulator. I just feel that there is a danger in so much smoke and mirrors. When the Minister very kindly agreed to see the noble Lord, Lord Moylan, and me, I asked in a rather exasperated way why Ofcom could not make freedom of expression a priority, with codes of practice so that it would have to check on freedom of speech. The Minister said, “It’s not up to me to tell Ofcom what to do”, and I thought, “The whole Bill is telling Ofcom what to do”. That did not seem to make any sense.
I had another exchange with the present Secretary of State—again, noble Lords will not be surprised to hear that it was not a sophisticated intervention on my part—in which I said, “Why can’t the Government force the big tech companies to put freedom of expression in their terms and conditions or terms of service?” The Minister said, “They are private companies; we’re not interfering in what they do”. So you just end up thinking, “The whole Bill is telling companies that they’re going to be compelled to act in relation to harm and safety, but not on freedom of expression”. What that means is that you feel all the time as though the Government are saying that they are outsourcing this to third parties, which means that you cannot hold anyone to account.
The civil liberties campaigner Guy Herbert compared this to what is happening with the banks at the moment. They are being blamed by the Government and held to account for things such as their treatment of politically exposed persons and for terms and conditions that over-concentrate on values such as EDI and ESG, which may be leading to citizens of this country having their bank accounts closed. The Government say that they will tell the regulator that it has to act and that the banks cannot behave in this way, but this all came from legislation—it is not as though the regulator was doing it off its own bat. Maybe the regulator over-interpreted the legislation, and the banks then over-interpreted it again and over-removed.
The obvious analogy for me is that there is a danger here that we will not be able to hold anyone to account for overremoval of legitimate democratic discussion from the online world, because everyone is pointing the finger at everyone else. At the very least, the amendments are trying to say that any changes beyond what we have discussed so far on this Bill must come before Parliament. That is very important for any kind of democratic credibility to be attached to this legislation.
My Lords, I too express my admiration to the noble Baroness, Lady Stowell, for her work on this group with the Minister and support the amendments in her name. To pick up on what the noble Baroness, Lady Harding, said about infinite ping-pong, it can be used not only to avoid making a decision but as a form of power and of default decision-making—if you cannot get the information back, you are where you are. That is a particularly important point and I add my voice to those who have supported it.
I have a slight concern that I want to raise in public, so that I have said it once, and get some reassurance from the Minister. New subsection (B1)(d) in Amendment 134 concerns the Secretary of State directing Ofcom to change codes that may affect
“relations with the government of a country outside the United Kingdom”.
Many of the companies that will be regulated sit in America, which has been very forceful about protecting its sector. Without expanding on this too much, when it was suggested that senior managers would face some sort of liability in international fora, various parts of the American Government and state apparatus certainly made their feelings clearly known.
I am sure that the channels between our Government and the US are much more straightforward than any that I have witnessed, but it is absolutely definite that more than one Member of your Lordships’ House was approached about senior management liability and told, “This is a worry to us”. I believe that where we have landed is very good, but I would like the Minister to say what the limits of that power are and to acknowledge that it could get into a bit of a muddle between government relations and the economic outcomes we were talking about—the ones we were celebrating having been taken off the list. That was the thing that slightly worried me in the government amendments, which, in all other ways, I welcome.
My Lords, I will speak very briefly to this amendment; I know that the House is keen to get on to other business today. I very much welcome the amendment that the Government have tabled. My noble friend the Minister has always said that they want to keep women and girls safe online. As has been referred to elsewhere, the importance of making our digital streets safer cannot be overestimated.
As my noble friend said, women and girls experience a disproportionate level of abuse online. That is now recognised in this amendment, although this is only the start, not the end, of the matter. I thank my noble friend and the Secretary of State for their engagement on this issue. I thank the chief executive and the chair of Ofcom. I also thank the noble Baroness, Lady Kidron, the right reverend Prelate the Bishop of Gloucester, who I know cannot be here today, and the noble Lord, Lord Knight, who signed the original amendment that we discussed in Committee.
My noble friend has already talked about the campaigners outside the Chamber who wanted there to be specific mention of women and girls in the Bill. I thank Refuge, the 100,000 people who signed the End Violence Against Women coalition’s petition, BT, Glitch, Carnegie UK, Professor Lorna Woods, the NSPCC and many others who made the case for this amendment.
As my noble friend said, this is Ofcom guidance. It is not necessarily a code of practice, but it is still very welcome because it is broader than just the specific offences that the Government have legislated on, which I also welcome. As he said, this puts all the things that companies, platforms and search engines should be doing to protect women and girls online in one specific place. My noble friend mentioned holistic protection, which is very important.
There is no offline/online distinction these days. Women and girls should feel safe everywhere. I also want to say, because I know that my noble friend has had a letter, that this is not about saying that men and boys should not be safe online; it is about recognising the disproportionate levels of abuse that women and girls suffer.
I welcome the fact that, in producing this guidance, Ofcom will have to consult with the Domestic Abuse Commissioner and the Victims’ Commissioner and more widely. I look forward, as I am sure do all the organisations I just mentioned, to working with Ofcom on the first set of guidance that it will produce. It gives me great pleasure to have signed the amendment and to support its introduction.
My Lords, I know that we do not have long and I do not want to be churlish. I am not that keen on this amendment, but I want to ask a question in relation to it.
I am concerned that there should be no conflation in the best practice guidance between the actual, practical problems of, for example, victims of domestic abuse being stalked online or threatened with physical violence—which is a threat to their safety, and I understand that—and abuse. Abuse is horrible to be on the receiving end of, but it is important for freedom of thought and freedom of speech that we do not erase the distinction between words and actions. It is important not to overreact, or to frighten young women by telling them that being shouted at is the same as being physically abused.
I am very grateful to everyone for the support they have expressed for this amendment both in the debate now and by adding their names to it. As I said, I am particularly grateful to my noble friend Lady Morgan, with whom we have worked closely on it. I am also grateful for her recognition that men and boys also face harm online, as she rightly points out. As we discussed in Committee, this Bill seeks to address harms for all users but we recognise that women and girls disproportionately face harm online. As we have discussed with the noble Baroness, Lady Merron, women and girls with other characteristics such as women of colour, disabled women, Jewish women and many others face further disproportionate harm and abuse. I hope that Amendment 152 demonstrates our commitment to giving them the protection they need, making it easy and clear for platforms to implement protections for them across all the wide-ranging duties they have.
The noble Baroness, Lady Burt of Solihull, asked why it was guidance and not a code of practice. Ofcom’s codes of practice will set out how companies can comply with the duties and will cover how companies should tackle the systemic risks facing women and girls online. Stipulating that Ofcom must produce specific codes for multiple different issues could, as we discussed in Committee, create duplication between the codes, causing confusion for companies and for Ofcom.
As Ofcom said in its letter to your Lordships ahead of Report, it has already started the preparatory work on the draft illegal content and child sexual abuse and exploitation codes. If it were required to create a separate code relating to violence against women and girls, this preparatory work would need to be revised, so there would be the unintended—and, I think, across the House, undesired—consequence of slowing down the implementation of these vital protections. I am grateful for the recognition that we and Ofcom have had on that point.
Instead, government Amendment 152 will consolidate all the relevant measures across codes of practice, such as on illegal content, child safety and user empowerment, in one place, assisting platforms to reduce the risk of harm that women and girls disproportionately face.
On timing, at present Ofcom expects that this guidance will be published in phase 3 of the implementation of the Bill, which was set out in Ofcom’s implementation plan of 15 June. This is when the duties in Part 4 of the Bill, relating to terms of service and so on, will be implemented. The guidance covers the duties in Part 4, so for guidance to be comprehensive and have the most impact in protecting women and girls, it is appropriate for it to be published during phase 3 of the Bill’s implementation.
The noble Baroness, Lady Fox, mentioned the rights of trans people and the rights of people to express their views. As she knows, gender reassignment and religious or philosophical belief are both protected characteristics under the Equality Act 2010. Sometimes those are in tension, but they are both protected in the law.
With gratitude to all the noble Lords who have expressed their support for it, I commend the amendment to the House.
The Minister did not quite grasp what I said, but I will not keep the House. Would he be prepared to accept recommendations for a broader consultation—or to whom do I address them? It is important that groups such as the Women’s Rights Network and others, which suffer abuse because they say “I know what a woman is”, are consulted in a discussion on women and abuse, because that would be appropriate.
I am sorry—yes, the noble Baroness made a further point on consultation. I want to reassure her and other noble Lords that Ofcom has the discretion to consult whatever body it considers appropriate, alongside the Victims’ Commissioner, the Domestic Abuse Commissioner and the others I mentioned. Those consultees may not all agree. It is important that Ofcom takes a range of views, but it is able to consult whomever it considers appropriate. As I mentioned previously, Ofcom and its officers can be scrutinised in Parliament through Select Committees and in other ways. The noble Baroness could take it up directly with Ofcom, or avail herself of those routes for parliamentary scrutiny if she felt that her pleas were falling on deaf ears.
My Lords, I am completely opposed to Amendments 159 and 160, but the noble Lords, Lord Faulks and Lord Black, and the noble Viscount, Lord Colville, have explained the issues perfectly. I am fully in agreement with what they said. I spoke at length in Committee on that very topic. This is a debate we will undoubtedly come back to in the media Bill. I, for one, am extremely disappointed that the Labour Party has said that it will not repeal Section 40. I am sure that these issues will get an airing elsewhere. As this is a speech-limiting piece of legislation, as was admitted earlier this week, I do not want any more speech limiting. I certainly do not want it to be a media freedom-limiting piece of legislation on top of that.
I want to talk mainly about the other amendments, Amendments 158 and 161, but approach them from a completely different angle from the noble Lord, Lord Allan of Hallam. What is the thinking behind saying that the only people who can clip content from recognised news publishers are the news publishers? The Minister mentioned in passing that there might be a problem of editing them, but it has become common practice these days for members of the public to clip from recognised news publishers and make comments. Is that not going to be allowed? That was the bit that completely confused me. It is too prescriptive; I can see all sorts of people getting caught by that.
The point that the noble Lord, Lord Allan of Hallam, made about what constitutes a recognised news publisher is where the issue gets quite difficult. The point was made about the “wrong” organisations, but I want to know who decides what is right and wrong. We might all nod along when it comes to Infowars and RT, but there are lots of organisations that would potentially fail that test. My concern is that they would not be able to appeal when they are legitimate news organisations, even if not to everybody’s taste. Because I think that we already have too much speech limiting in the Bill, I do not want any more. This is important.
When it comes to talking about the “wrong” organisations, I noticed that the noble Lord, Lord McNally, referred to people who went to Rupert Murdoch’s parties. I declare my interests here: I have never been invited or been to a Rupert Murdoch party—although do feel free, I say, if he is watching—but I have read about them in newspapers. For some people in this Chamber, the “wrong” kind of news organisation is, for example, the Times or one with the wrong kind of owner. The idea that we will all agree or know which news publishers are the “wrong” kind is not clear, and I do not think that the test is going to sort it out.
Will the Minister explain what organisations can do if they fail the recognised news publisher test to appeal and say, “We are legitimate and should be allowed”? Why is there this idea that a member of the public cannot clip a recognised news publisher’s content without falling foul? Why would they not be given some exemption? I genuinely do not understand that.
My Lords, I shall speak very briefly. I feel a responsibility to speak, having spoken in Committee on a similar group of amendments when the noble Lords, Lord Lipsey and Lord McNally, were not available. I spoke against their amendments then and would do so again. I align myself with the comments of my noble friend Lord Black, the noble Lord, Lord Faulks, and the noble Viscount, Lord Colville. As the noble Baroness, Lady Fox, just said, they gave a comprehensive justification for that position. I have no intention of repeating it, or indeed repeating my arguments in Committee, but I think it is worth stating my position.
(1 year, 3 months ago)
Lords Chamber
My Lords, I thank the Minister for engaging with the amendment in my name and that of the noble Baroness, Lady Benjamin, in Committee, to ensure parity between the regulation of online and offline pornography. We did not table it for Report because of the welcome news of the Government’s review. At this point, I would like to give my backing to all that my noble friend Lord Bethell said and would like to thank him for his great encouragement and enthusiasm on our long journey, as well as the noble Baroness, Lady Kidron. I would particularly like to mention the noble Baroness, Lady Benjamin, who, as my noble friend Lord Bethell mentioned, must be very frustrated today at not being able to stand up and benefit us with her passion on this subject, which has kept a lot of us going.
I have some questions and comments about the review, but first I want to stand back and state why this review is so necessary. Our society must ask how pornography was able to proliferate so freely, despite all the warnings of the danger and consequences of this happening when the internet was in its infancy. Human appetites, the profit motive and the ideology of cyberlibertarianism flourished freely in a zeitgeist where notions of right and wrong had become deeply unfashionable. Pre-internet, pornography was mainly on top shelves, in poky and rather sordid sex shops, or in specialist cinemas. There was recognition that exposure to intimate sex acts should never be accidental but always the result of very deliberate decisions made by adults—hence the travesty of leaving children exposed to the danger of stumbling across graphic, violent and frequently misogynistic pornography by not bringing Part 3 of the Digital Economy Act 2017 into force.
I have talked previously in this House about sociology professor Christie Davies’ demoralisation of society thesis: what happens when religiously reinforced moralism, with its totemic notion of free will, is ditched along with God. Notions of right and wrong become subjective, individually determined, and a kind of blindness sets in; how else can we explain why legislators ignored the all-too-predictable effects of unrestrained access to pornography on societal well-being, including but not limited to harms to children? For this Bill to be an inflection point in history, this review, birthed out of it, must unashamedly call out the immorality of what has gone before. How should we define morality? Well, society simply does not work if it is governed by self-gratification and expressive individualism. Relationships—the soil of society—including intimate sexual relationships, are only healthy if they are self-giving, rather than self-gratifying. These values did not emerge from the Enlightenment but from the much deeper seam of our Judeo-Christian foundations. Pornography is antithetical to these values.
I turn to the review’s terms of reference. Can the Minister confirm that the lack of parity between online and offline regulation will be included in the legal gaps it will address? Can he also confirm that the review will address gaps in evidence? As I said in Committee, a deep seam of academic research already exists on the harmful effects of the ubiquity of pornography. The associations with greater mental ill health, especially among teenagers, are completely unsurprising; developing brains are being saturated with dark depictions of child sexual abuse, incest, trafficking, torture, rape, violence and coercion. As I mentioned earlier, research shows that adults whose sexual arousal is utterly dependent on pornography can be catastrophically impaired in their ability to form relationships with flesh-and-blood human beings, let alone engage in intimate physical sex.
Will the review also plug gaps in areas that remain under-researched and controversial and where vested interests abound? On that point, whoever chairs this review will have to be ready, willing and able to take on powerful, ideologically motivated and profit-driven lobbies.
Inter alia, we need to establish through research the extent to which some young women are driven to change their gender because of hyper-sexualised, porn-depicted female stereotypes. Anecdotally, some individuals have described their complete inability to relate to their natal sex. It can be dangerous and distasteful to be a woman in a world of pornified relationships which expects them to embrace strangulation, degradation and sexual violence. One girl who transitioned described finding such porn as a child: “I am ashamed that I was fascinated by it and would seek it out. Despite this interest in watching it, I hated the idea of myself actually being in the position of the women. For a while, I even thought I was asexual. Sex is still scary to me, complicated”.
Finally, the Government’s announcement mentions several government departments but does not make it clear that the review will also draw in the work of the DfE and the DHSC—the departments for children’s and adult mental health—for reasons I have already touched on. Can the Minister confirm that the remit will include whatever areas of government responsibility are needed, so that the review is genuinely broad enough to look across society at how to protect not just children but adults?
My Lords, I rise to speak to Amendment 184 in my name—
My Lords, the guidance in the Companion states that Peers who were not present for the opening of this debate last week should not speak in the debate today, so I will have to ask the noble Baroness to reserve her remarks on this occasion.
My Lords, I am rather disappointed that, while this is a large group on freedom of expression, it is dominated by amendments by myself and the noble Lord, Lord Moylan. I welcome the noble Baroness, Lady Fraser of Craigmaddie, and the noble Lord, Lord Stevenson of Balmacara, dipping their toes in the free-expression water here and I am glad that the Minister has added his name to their amendment, although it is a shame that he did not add his name to one of mine.
Earlier today we heard a lot of congratulations to the Government for listening. I have to say, it depends who you are, because the Government have not listened to all of us. It is notable that, of the hundreds of new government concessions that have taken the form of amendments on Report, none relates to free speech. Before I go through my amendments, I want to note that, when the noble Lord, Lord Moylan, and I raise concerns about free speech, we can be treated as slightly eccentric. There has been a generally supportive and generous mood from the regulars in this House, and I understand that, but I worry that free speech is being seen as peripheral.
This country, our country, that we legislate for and in, has a long history of boasting that it is the home of liberty and adopts the liberal approach that being free is the default position: that free speech and the plurality and diversity of views it engenders are the cornerstone of democracy in a free society and that any deviation from that approach must require extraordinary and special justification. A comprehensive piece of law, such as the one we are dealing with, that challenges many of those norms, deserves thorough scrutiny through the prism of free speech.
When I approached this Bill, which I had been following long before I arrived in this House, I assumed that there would be packed Benches—as there are on the Illegal Migration Bill—and that everybody, including all these Law Lords, would be in quoting the European Court of Human Rights on Article 8 and Article 10. I assumed there would be complaints about Executive power grabs and so on. But it has been a bit sparse.
That is okay; I can live with that, even if it is a bit dispiriting. But I am concerned when the Government say that the mood of the Committee has been reflected in their amendments, because it has not been a very large Committee. Many of the amendments that I, the noble Lord, Lord Moylan, and others tabled on free expression represent the concerns of a wide range of policy analysts, civil rights groups, academics, lawyers, free speech campaigners and industry representatives. They were put forward in good faith—and I continue in that spirit—to suggest ways of mitigating some of the grave threats to free speech in this Bill, with constructive ideas about how to tackle flaws, and to raise some of the problems of unintended consequences. I have at times felt that those concerns were batted away with a certain indifference. The Minister is very affable and charming, but none the less it can be a bit disappointing.
Anyway, I am here to bat again. I hope that the Government now will listen very closely and consider how to avoid the UK ending up with the most restrictive internet speech laws of any western democracy at the end of this. I have a lot of different amendments in my name in this group. I wholeheartedly support the amendments in the name of the noble Lord, Lord Moylan, requiring Ofcom to assess the impact of its codes on free speech, but I will not speak to them.
I will talk about my amendments, starting with Amendments 77, 78, 79, 80 and 81. These require platforms to have particular regard to freedom of expression not just when implementing safety measures and policies but when writing their terms of service. This is to ensure that freedom of expression is not reduced to an abstract, secondary “have regard to” notion but is visible in the drafting of terms of service. It would mean that users know their rights in clear and concrete terms. For example, a platform should be expected to justify how a particular term of service, on something such as religious hatred, will be balanced with consideration of freedom of expression and conscience, in order to allow discussions over different beliefs to take place. Users need to be able to point to specific provisions in the terms of service setting out their free speech protections.
This is all about parity between free speech and safety. Although the Government have attempted some balance via Clause 18—and I welcome this—to mitigate the damage the Bill does to individual rights of free expression, it is a rather weak, poor cousin. If companies are compelled to prevent and minimise so-called harmful content via operational safety duties, these amendments say that there should be parity for free expression: companies should be compelled to do the same for freedom of expression, through a clear and positive duty, rather than through Clause 64, which is framed rather negatively.
Amendment 188 takes on the issue of terms of service from a different direction, attempting to ensure that duties with regard to safety must not be allowed to restrict lawful expression or that protected by Article 10 of the European Convention on Human Rights. That states that interference in free speech rights is not lawful unless it is a last resort. I note, in case anyone is reading the amendment carefully, and for Hansard, that the amendment cites Article 8—a rather Freudian slip on my part that was not corrected by the Table Office. That is probably because privacy rights are also threatened by the Bill, but I meant Article 10 of course.
Amendment 188 addresses a genuine dilemma in terms of Ofcom enforcing safety duties via terms and conditions. These will transform private agreements between companies and users into statutory duties under Clause 65. This could mean that big tech companies would be exercising public law functions by state-backed enforcement of the suppression of lawful speech. One worry is that platforms’ terms of service are not neutral; they can change due to external political or commercial pressures. We have all been following with great interest what is happening at Twitter. They are driven by values which can be at odds with UK laws. So I hope the Minister will answer the query that this amendment poses: how is the UK able to uphold its Article 10 obligations if state regulators are legally instructed to enforce terms of service attitudes to free speech, even when they censor far more than UK domestic law requires?
Amendment 162 has a different focus and removes offences under Section 5 of the Public Order Act from the priority offences to be regulated as priority illegal content, as set out in Schedule 7. This amendment is prompted by a concern that the legislation enlists social media companies to act as a private online police force and to adjudicate on the legality of online content. This is especially fraught in terms of the legal limits on speech, where illegality is often contested and contentious—offline as well as online.
The inclusion of Section 5 would place a duty on service providers to take measures to prevent individuals ever encountering content that includes
“threatening or abusive words or behaviour, or disorderly behaviour”
that is likely to cause “harassment, alarm or distress”. It would also require service providers to minimise the length of time such content is present on the service.
I am not sure whether noble Lords have been following the dispute that broke out over the weekend. There is a film doing the rounds on social media of a trans speaker, Sarah Jane Baker, at the Trans Pride event screaming pretty hysterically, “If you see a TERF, punch them in the effing face”—and I am being polite. You would think that that misogynistic threat would be the crime people might be concerned about, yet some apologists for Trans Pride claim that those women—TERFs such as myself—who are outraged, and have treated the speech as exactly what it says, are the ones who are stirring up hate.
Now, that is a bit of a mess, but asking service providers, or indeed algorithms, to untangle such disputes can surely lead only to the over-removal of online expression, or even more of a muddle. As the rule of law charity Justice points out, this could also catch content that depicts conflict or atrocities, such as those taking place in the Russia-Ukraine war. Justice asks whether the inclusion of Section 5 of the POA could lead to the removal of posts by individuals sharing stories of their own abuse or mistreatment on internet support forums.
Additionally, under Schedule 7 to the Bill, versions of Section 5 could also be regulated as priority illegal conduct, meaning that providers would have to remove or restrict content that, for instance, encourages what is called disorderly behaviour likely to cause alarm. Various organisations are concerned that this could mean that content portraying protest activity that might be considered disorderly by some would be removed unless it condemned that activity, or even that content which encouraged people to attend protests would be in scope.
I am not a fan of Section 5 of the Public Order Act—which criminalises threatening or abusive words or behaviour likely to cause harassment, alarm or distress—at the best of times, but at least those offences have been and are subject to the full rigour of the criminal justice system and case law. The courts, the CPS and the police are also bound, for example by Article 10, to protect free speech. That is very different from compelling social media companies, their staff or automated algorithms to make such complex assessments of the Section 5 threshold of illegality. Through no fault of their own, those companies are simply not qualified to make such determinations, and it is obvious that legitimate speech could end up being restricted. Dangerously, it also marks a significant departure from the UK’s rule of law in deciding what is legal or illegal speech. It has the potential to limit UK users’ ability to engage in important aspects of public life, and to prevent victims of abuse from sharing their stories, as I have described.
I turn finally to the last amendment, Amendment 275—I will keep this short, for time’s sake. I will not go into detail, but I hope that the Minister will take a look at it, see that there is a loophole, and discuss it with the department. In skeleton form, the Free Speech Union has discovered that the British Board of Film Classification runs a mobile classification network—an agreement with mobile network providers under which it advises them on what content should be filtered because it is considered suitable for adults only. This arrangement is private and not governed by statute, which means that even the weak free speech safeguards in this Bill can be sidestepped. It affects not only under-18s but anyone with factory settings on their phone. It led to a particularly bizarre outcome last year, when readers of the online magazine “The Conservative Woman” reported that the website was inaccessible. This small online magazine was apparently blacklisted by the BBFC because of comments below the line on its articles. The potential for such arbitrary censorship is a real concern, and the magazine cannot even appeal to the BBFC, so I ask the Minister to take this amendment back to the DCMS, which helped set up this mobile classification network, and find out what is going on.
That peculiar tale illustrates my concerns about what happens when free speech is not front and centre, even when you are concerned about safety and harm. I worry that when free speech is casually disregarded, censorship and bans can become the default, and a thoughtless option. That is why I urge the Minister before Third Reading to at least make sure that some of the issues and amendments in this group are responded to positively.
My Lords, my noble friend on the Front Bench said at various points when we were in Committee that the Bill struck an appropriate balance between protecting the rights of children and the rights of those wishing to exercise their freedom of expression. I have always found it very difficult indeed to discern that point of balance in the Bill as originally drafted, but I will say that if there were such a point, it has been swamped by the hundreds of amendments tabled to the Bill by my noble friend since Committee which push the Bill entirely in the opposite direction.
Among those amendments, I cannot find—it may be my fault, because I am just looking by myself; I have no help to find these things—a single one which seeks to redress the balance back in favour of freedom of expression. My Amendments 123, 128, 130, 141, 148 and 244 seek to do that to some extent, and I am grateful to the noble Baroness, Lady Fox of Buckley, for the support she has expressed for them.
On that point specifically, having worked inside one of these companies, I can say that they fear legal action under all sorts of laws, but not under the European Convention on Human Rights. As the Minister explained, that is for public bodies; if people are going to take a case on Article 10 grounds, they will be taking it against a public body. There are lots of other grounds on which to go after a private company, but ECHR compliance is not one of them.
My Lords, I genuinely appreciate this debate. The noble Lord, Lord Clement-Jones, made what I thought was a very important point: in going through the weeds of the Bill—and some people have been involved in it for many years, looking at the detail—it can be easy to forget the free speech point. It is important that it has been raised, but it also constantly needs to be raised. That is the point: as the noble Lord, Lord Allan of Hallam, admitted, this is a speech-restricting Bill in which we are working out the balance.
I apologise to the noble and learned Lord, Lord Hope of Craighead, for not acknowledging that he has constantly emphasised the distinction between free speech and free expression. He and I will not agree on this; it is that we do not have time for the argument now, rather than that I do not understand it. But he has been diligent in his persistence in trying at least to raise the issues, and that is important.
I was a bit surprised by the Minister’s response because, for the first time ever, since I have been here, there has been some enthusiasm across the House for one of my amendments—it really is unprecedented—Amendment 162 on the public order offences. I thought that the Minister might have noted that, because he has noted it every other time there has been a consensus across the House. I think he ought to look again at Amendment 162.
To indicate the muddle one gets into on public order offences and illegality, the police force in Cheshire, where I am from, has put out a film online today saying that misgendering is a crime. That is the police who have said that. It is not a crime, and the difficulty we are concerned with is that people will be asked to remove and censor material, on grounds of illegality or public order offences, that they should not be removing. That is my concern: censorship.
To conclude, I absolutely agree with the noble Lord, Lord Allan of Hallam, that of course free speech does not mean saying whatever you want wherever you want. That is not free speech, and I am a free speech absolutist. Even on subreddits—if people know what they are—users police each other’s speech. There are norms that are set in place. That is fine with me—that multitude.
My concern is that a state body such as Ofcom will set norms of acceptable speech that fall below free speech law, by demanding—on pain of breaching the law, with fines and so on—that these private companies impose their own terms of service. That makes them risk-averse and sets norms for levels of speech that are very dangerous. For example, when you go into work, you cannot just say anything, but there are people such as Maya Forstater, who said something at work, was disciplined and lost her job, and has just won more than £100,000, because she was expressing her views and opinions. The Equality Act came to her aid and she has now been shown to be right. You cannot do that if your words have disappeared and been censored.
I could talk about this for a long time, as noble Lords know. I hope that at least, as the Bill progresses, even when it becomes an Act, the Government could just stamp on its head, “Don’t forget free speech”—but before then, as we end this process, they could come back with some concessions to some of the amendments that have been raised here today. That would be more than just words. I beg leave to withdraw the amendment.
(1 year, 4 months ago)
Lords Chamber
My Lords, first, I welcome the amendment from the noble Lord, Lord Allan, and his motivation, because I am concerned that, throughout the Bill, the wrong targets are being caught up. I was grateful to hear his recognition that people who talk about their problems with self-harm could end up being targeted, which nobody would ever intend. These things need to be taken seriously.
In that sense, I was slightly concerned about the motivation of the noble Baroness, Lady Burt of Solihull, in the “reckless” amendment. The argument was that the recklessness standard is easier to prove. I am always worried about things that make it easier to prosecute someone, rather than there being a just reason for that prosecution. As we know, those involved in sending these images are often immature and very foolish young men. I am concerned about lowering the threshold at which we criminalise them—potentially destroying their lives, by the way, because if you have a criminal record it is not good—even though I in no way tolerate what they are doing and it is obviously important that we take that on.
There is a danger that this law will become a mechanism through which people try to resolve a whole range of social problems—which brings me on to responding to the speech just made by the noble Baroness, Lady Kennedy of The Shaws. I continue to be concerned about the question of trying to criminalise indirect threats. The point about somebody who sends a direct threat is that we can at least see the connection between that direct threat and the possibility of action. It is the same sort of thing that we have historically considered in relation to incitement. I understand that, where your physical being is threatened by words, something practical and physical can follow, and that is to be taken very seriously. The problem I have is with the indirect threat from somebody who says, for example, “That smile should be taken off your face. It can be arranged”, or other indirect but incredibly unpleasant comments. There is clearly no link between that and a specific action. It might use violent language but it is indirect: “It could be arranged”, or “I wish it would happen”.
Anyone on social media—and I am sure your Lordships all are—will know what I mean. I follow very carefully what people from different political parties say about each other. I do not know whether noble Lords have ever followed the kind of things that are said about the Government and their Ministers, but the threats are not indirect and individuals are often named. In that instance, it is nothing to do with women, but it is pretty violent and vile. By the way, I have also followed what is said about the Opposition Benches, and that can be pretty violent and vile too, including language that implies a wish that those people were the subject of quite intense violence—without going into detail. That happens, and I do not approve of it—obviously. I also do not think that pile-ons are pleasant to be on the receiving end of, and I understand how they happen. However, if we criminalise pile-ons on social media, we are openly imposing censorship.
What is worse in my mind is that we are allowing the conflation of words and actions, so that what people say or think is treated by the criminal law as the same as acting on it. We have seen a very dangerous trend recently, particularly in the endless arguments and disputes over identity politics, where people say that speech is violence. This has happened to a number of gender-critical feminists—in this instance women—who have gone in good faith to speak at universities, having been invited. They have been told that their speech was indistinguishable from violence, that it made students at the university feel under threat and unsafe, and that it was the equivalent of being attacked. But guess what? Once you remove that distinction, the response to that speech can be to use violence, because you cannot tell the difference between them. That has happened at a number of universities, where speakers and their supporters were physically assaulted by people who said that they were acting in self-defence against speech that was violent. I get nervous that this is a slippery slope, and we certainly should not go anywhere near it in legislation.
Finally, I agree that we should tackle the culture of people piling on and using this kind of language, but it is a cultural and social question. What we require is moral leadership and courage in the face of it—calling it out, arguing against it and so on. It is wrong to use the law to send messages; it is an abdication of moral leadership and a cop-out, quite apart from being dangerous in what it criminalises. I urge your Lordships to reject those amendments.
My Lords, I will speak briefly to Amendments 5C and 7A in this group. I welcome the Government’s moves to criminalise cyberflashing. It is something that many have campaigned for in both Houses and outside for many years. I will not repeat the issues so nobly introduced by the noble Baroness, Lady Burt, and I say yet again that I suspect that the noble Baroness, Lady Featherstone, is watching, frustrated that she is still not able to take part in these proceedings.
It is worth making the point that, if actions are deemed to be serious enough to require criminalisation and for people potentially to be prosecuted for them, I very much hope that my noble friend the Minister will be able to say in his remarks that this whole area of the law will be kept under review. There is no doubt that women and girls’ faith in the criminal justice system, both law enforcement and the Crown Prosecution Service, is already very low. If we trumpet the fact that this offence has been introduced, and then there are no prosecutions because the hurdles have not been reached, that is even worse than not introducing the offence in the first place. So I hope very much that this will be kept under review, and no doubt there will be opportunities to return to it in the future.
I do not want to get into the broader debate that we have just heard, because we could be here for a very long time, but I would just say to the noble Baronesses, Lady Kennedy and Lady Fox, that we will debate this in future days on Report and there will be specific protection and mention of women and girls on the face of the Bill—assuming, of course, that Amendment 152 is approved by this House. The guidance might not use the words that have been talked about, but the point is that that is the place to have the debate—led by the regulator with appropriate public consultation—about the gendered nature of abuse that the noble Baroness, Lady Kennedy, has so eloquently set out. I hope that will also be a big step forward in these matters.
I look forward to hearing from the Minister about how this area of law will be kept under review.
(1 year, 4 months ago)
Lords Chamber
My Lords, there is a danger of unanimity breaking out. The noble Lord, Lord Moylan, and I are not always on the same page as others, but this is just straightforward. I hope the Government listen to the fact that, even though we might be coming at this in different ways, there is concern on all sides.
I also note that this is a shift from what happened in Committee, when I tabled an amendment trying to pose the same dilemmas by talking about the size of organisations. Many a noble Lord said that size did not matter and that that approach did not work—but it was trying to get at the same thing. I do feel rather guilty that, to move the core philosophy forward, I have dumped the small and micro start-ups and SMEs that I also wanted to protect from overregulation—that is what has happened in this amendment—but now it seems an absolute no-brainer that we should find a way to exempt public interest organisations. This is where I would go slightly further. We should have a general exemption for public interest organisations, but with the ability for Ofcom to come down hard on them if they look as though they have moved from being low risk to being a threat.
As the noble Lord, Lord Moylan, noted, public interest exemptions happen throughout the world. Although I do not want to waste time reading things out, it is important to look at the wording of Amendment 29. As it says, we are talking about:
“historical, academic, artistic, educational, encyclopaedic, journalistic, or statistical content”.
We are talking about the kind of online communities that benefit the public interest. We are talking about charities, user-curated scientific publications and encyclopaedias. These are surely not what this Bill was designed to thwart. However, there is a serious danger that, if we place on them the full weight of the regulatory demands in the Bill, they will not survive. That is not what the Government intend, but it is what will happen.
Dealing with the Bill’s complexity will take much time and money for organisations that do not have it. I run a small free-speech organisation called the Academy of Ideas and declare my interest in it. I am also on the board of the Free Speech Union. When you have to spend so much time on regulatory issues, it costs money and you will go under. That is important. This could also waste Ofcom’s time, as the noble Lord, Lord Allan of Hallam, has explained. It would prevent Ofcom concentrating on the nasty bits that we want it to tackle and would leave it wasting its time instead.
I should mention a couple of other things. It is important to note that there is sometimes controversy over the definition of a public interest organisation. It is not beyond our ken to sort it out. I Googled it—it is still allowed—and came up with a Wikipedia page that still exists. That is always good. If one looks, the term “public interest” is used across a range of laws. The Government know what kind of organisations they are talking about. The term has not just been made up for the purpose of an exemption.
It is also worth noting that no one is suggesting that public interest projects and organisations should not be regulated at all; this is about an exemption from this particular regulation. They still have to deal with UK defamation, data protection, charity, counterterrorism and pornography laws, and the common law. Those organisations’ missions and founding articles will require that they do some good in the world. That is what they are all about. The Government should take this matter seriously.
Finally, on the rescue clauses, it is important to note—there is a reference to the Gambling Act—that they state that, if there is a problem, Ofcom should intervene. That was taken from what happens under the Gambling Act, which allows UK authorities to strip one or more gambling businesses of their licensing exemptions when they step out of line. No one is saying that such exemptions should never be looked at again, but these organisations obviously should not be in the scope of the Bill. I hope that when we get to the next stage, the Government will, on this matter at least, accept the amendment.
To follow on from that, we are talking about the obligation to bring exemptions to Parliament. Well, we are in Parliament and we are bringing exemptions. The noble Lord is recommending that we bring very specific exemptions while those that the noble Lord, Lord Moylan, and I have been recommending may be rather broad—but I thought we were bringing exemptions to Parliament. I am not being facetious. The point I am making is, “Why can’t we do it now?” We are here now, doing this. We are saying, as Parliament, “Look at these exemptions”. Can the Minister not look at them now instead of saying that we will look at them some other time?
I may as well intervene now as well, so that the Minister can get a good run at this. I too am concerned at the answer that has been given. I can see the headline now, “Online Safety Bill Age-Gates Wikipedia”. I cannot see how it does not, by virtue of some of the material that can be found on Wikipedia. We are trying to say that there are some services that are inherently in a child’s best interests—or that are in their best interests according to their evolving capacity, if we had been allowed to put children’s rights into the Bill. I am concerned that that is the outcome of the answer to the noble Lord, Lord Allan.
My Lords, interestingly, because I have not discussed this at all with the noble Lord, Lord Moylan, I have some similar concerns to his. I have always wanted this to be a children’s online safety Bill. My concerns generally have been about threats to adults’ free speech and privacy and the threat to the UK as the home of technological innovation. I have been happy to keep shtum on things about protecting children, but I got quite a shock when I saw the series of government amendments.
I thought what most people in the public think: the Bill will tackle things such as suicide sites and pornography. We have heard some of that very grim description, and I have been completely convinced by people saying, “It’s the systems”. I get all that. But here we have a series of amendments all about content—endless amounts of content and highly politicised, contentious content at that—and an ever-expanding list of harms that we now have to deal with. That makes me very nervous.
On the misinformation and disinformation point, the Minister is right. Whether for children or adults, those terms have been weaponised. They are often used to delegitimise perfectly legitimate if contrary or minority views. I say to the noble Baroness, Lady Kidron, that the studies that say that youth are the fastest-growing far-right group are often misinformation themselves. I was recently reading a report about this phenomenon, and things such as being gender critical or opposing the small boats arriving were considered to be evidence of far-right views. That was not to do with youth, but at least you can see that this is quite a difficult area. I am sure that many people even in here would fall within the far right as defined by groups such as HOPE not hate, whose definition is so broad.
My main concerns are around the Minister’s Amendment 172. There is a problem: because it is about protected characteristics—or apes the protected characteristics of the Equality Act—we might get into difficulty. Can we at least recognise that, even in relation to the protected characteristics as noted in the Equality Act, there are raging rows politically? I do not know how appropriate it is that the Minister has tabled an amendment dragging young people into this mire. Maya Forstater has just won a case in which she was sacked after being accused of being opposed to somebody’s protected characteristics. Because her philosophical beliefs are themselves a protected characteristic, she has won the case and a substantial amount of money.
I worry when I see this kind of list. It is not just inciting hatred—in any case, what that would mean is ambiguous. It refers to abuse based on race, religion, sex, sexual orientation, disability and so on. This is a minefield for the Government to have wandered into. Whether you like it or not, it will have a chilling effect on young people’s ability to debate and discuss. If you worry that some abuse might be aimed at religion, does that mean that you will not be able to discuss Charlie Hebdo? What if you wanted to show or share the Charlie Hebdo cartoons? Will that count? Some people would say that is abusive or an incitement. This is not where the Bill ought to be going. At the very least, it should not be going there at this late stage. Under race, it says that “nationality” is one of the indicators that we should be looking out for. Maybe it is because I live in Wales, but there is a fair amount of abuse aimed at the English. A lot of Scottish friends dole it out as well. Will this count for young people who do that? I do not get it.
My final question is in relation to proposed subsection (11). This is about protecting children, yet it lists a person who
“has the characteristic of gender reassignment if the person is proposing to undergo, is undergoing or has undergone a process (or part of a process) for the purpose of reassigning the person’s sex by changing physiological or other attributes of sex”.
Are the Government seriously accepting that children have not just proposed to reassign but have been reassigned? That is a breach of the law. That is not meant to be happening. Your Lordships will know how bad this is. Has the Department for Education seen this? As we speak, it is trying to untangle the freedom for people not to have to go along with people’s pronouns and so on.
This late in the day, on something as genuinely important as protecting children, I just want to know whether there is a serious danger that this has wandered into the most contentious areas of political life. I think it is very dangerous for a government amendment to affirm gender reassignment to and about children. It is genuinely irresponsible and runs counter to the guidance that the Government are bringing out at this very moment, which asks us to avoid exactly this. Please can the Minister clarify what is happening with Amendment 172?
My Lords, I am not entirely sure how to begin, but I will try to make the points I was going to make. First, I would like to respond to a couple of the things said by the noble Baroness, Lady Fox. With the greatest respect, I worry that the noble Baroness has not read the beginning of the proposed new clause in Amendment 172, subsection (2), which talks about “Content which is abusive”, as opposed to content just about race, religion or the other protected characteristics.
One of the basic principles of the Bill is that we want to protect our children in the digital world in the same way that we protect them in the physical world. We do not let our children go to the cinema to watch content as listed in the primary priority and priority content lists in my noble friend the Minister’s amendments. We should not let them do so in the digital world, yet the reality is that they do, day in and day out.
I thank my noble friend the Minister, not just for the amendments that he has tabled but for the countless hours that he and his team have devoted to discussing this with many of us. I have not put my name to the amendments either, because I have some concerns, but, given the way the debate has turned, I start by thanking him and expressing my broad support for having the harms in the Bill, the importance of which this debate has demonstrated. We do not want this legislation to take people by surprise. The important thing is that we are discussing some fundamental protections for the most vulnerable in our society, so I thank him for putting those harms in the Bill and for allowing us to have this debate. I fear that it will be a theme not just of today but of the next couple of days on Report.
I started with the positives; I would now like to bring some challenges as well. Amendments 171 and 172 set out priority content and primary priority content. It is clear that they do not cover the other elements of harm: contact harms, conduct harms and commercial harms. In fact, it is explicit that they do not cover the commercial harms, because proposed new subsection (4) in Amendment 237 explicitly says that no commercial harm can be added to the list of harms. Why do we have a perfect crystal ball that means we think that no future commercial harms could be done to our children through user-to-user and search services, such that we are going to expressly make it impossible to add those harms to the Bill? It seems to me that we have completely ignored the commercial piece.
I move on to Amendment 174, which I have put my name to. I am absolutely aghast that the Government really think that age-inappropriate sexualised content does not count as priority content. We are not necessarily talking here about a savvy 17 year-old. We are talking about four, five and six year-olds who are doomscrolling on various social media platforms. That is the real world. To suggest that somehow the digital world is different from the old-fashioned cinema, and a place where we do not want to protect younger children from age-inappropriate sexualised material, just seems plain wrong. I really ask my noble friend the Minister to reconsider that element.
I am also depressed about the discussion that we had about misinformation. As I said in Committee several times, I have two teenage girls. The reality is that we are asking today’s teenagers to try to work out what is truth and what is misinformation. My younger daughter will regularly say, “Is this just something silly on the internet?” She does not use the term “misinformation”; she says, “Is that just unreal, Mum?” She cannot tell what is real in her social media feeds because of the degree of misinformation. Failing to recognise that misinformation is a harm for young people who do not yet know how to validate sources, which was so much easier for us when we were growing up than it is for today’s generations, is a big glaring gap, even in the content element of the harms.
I support the principle behind these amendments, and I am pleased to see the content harms named. We will come back next week to the conduct and contact harms—the functionality—but I ask my noble friend the Minister to reconsider on both misinformation and inappropriate sexualised material, because we are making a huge mistake by failing to protect our children from them.
My Lords, Amendment 172 is exceptionally helpful in putting the priority harms for children on the face of the Bill. It is something that we have asked for; I know the pre-legislative scrutiny committee asked for it too, and it is good to see it there. I want to comment to make sure that we in this House, and people out there, all have a shared understanding of what this means.
My understanding is that “primary priority” is, in effect, a red light—platforms must not expose children to that content if they are under 18—while “priority” is rather an amber light: on further review, for some children it will be a red light and for other children it will be a green light, and they can see that content. I am commenting partly having had the experience of explaining all this to my domestic focus group of teenagers, who said, “Really? Are you going to get rid of all this stuff for us?” I said, “No, actually, it is quite different”. It is important to do that in our debate, because otherwise there is a risk that the Bill comes into disrepute. Take, for example, the depiction of harm to fictional characters. If one has seen the “Twilight” movies, the werewolves do not come off too well, and “Lord of the Rings” is like an orc kill fest.
As regards the point made by the noble Baroness, Lady Harding, about going to the cinema, we allow older teenagers to go to the cinema and see that kind of thing. Post the Online Safety Bill, they will still be able to access it. When we look at something like fictional characters, the Bill is there to deal with a real and acknowledged harm: people pushing quite vile material, in which characters have been taken out of fiction and a gory image has been created, twisted and pushed at a younger child. That is what we want online providers to do—to prevent an 11 year-old seeing that—not to stop a 16 year-old enjoying the slaughter of werewolves. We need to be clear that that is what we are doing with the priority harms; we are not going further than people think we are.
There are also some interesting challenges around humour and evolving trends. This area will be hard for platforms to deal with. I raised the issue of the Tide pod challenge in Committee. If noble Lords are not familiar, it is the idea that one eats the tablets, the detergent things, that one puts into washing machines. It happened some time ago. It was a real harm and that is reflected here in the “do not ingest” provisions. That makes sense but, again talking to my focus group, the Tide pod challenge has evolved and for older teenagers it is a joke about someone being stupid. It has become a meme. One could genuinely say that it is not the harmful thing that it was. Quite often one sees something on the internet that starts out harmful—because kids are eating Tide pods and getting sick—and then over time it becomes a humorous meme. At that point, it has ceased to be harmful. I read it as that filter always being applied. We are not saying, “Always remove every reference to Tide pods” but “At a time when there is evidence that it is causing harm, remove it”. If at a later stage it ceases to be harmful, it may well move into a category where platforms can permit it. It is a genuine concern.
To our freedom of expression colleagues, I say that we do not want mainstream platforms to be so repressive of ordinary banter by teenagers that they leave those regulated mainstream platforms because they cannot speak any more, even when the speech is not harmful, and go somewhere else that is unregulated—one of those platforms that took Ofcom’s letter, screwed it up and threw it in the bin. We do not want that to be an effect of the Bill. Implementation has to be very sensitive to common trends and, importantly, as I know the noble Baroness, Lady Kidron, agrees, has to treat 15, 16 and 17 year-olds very differently from 10, 11 or 12 year-olds. That will be hard.
The other area that jumped out was about encouraging harm through challenges and stunts. That immediately brought “Jackass” to mind, or the Welsh version, “Dirty Sanchez”, which I am sure is a show that everyone in the House watched avidly. It is available on TV. Talking about equality, one can go online and watch it. It is people doing ridiculous, dangerous things; it is enjoyed by teenagers; and it is legal and acceptable. My working assumption has to be that we are expecting platforms to distinguish a new dangerous stunt such as the choking game—such things really exist—from a ridiculous “Jackass” or “Dirty Sanchez” stunt, which has existed for years and is accessible elsewhere.
The point that I am making in the round is that it is great to have these priority harms in the Bill but it is going to be very difficult to implement them in a meaningful way whereby we are catching the genuinely harmful stuff but not overrestricting. But that is the task that we have set Ofcom and the platforms. The more that we can make it clear to people out there what we are expecting to happen, the better. We are not expecting a blanket ban on all ridiculous teenage humour or activity. We are expecting a nuanced response. That is really helpful as we go through the debate.
I just have a question for the noble Lord. He has given an excellent exposé of the other things that I was worried about but, even when he talks about listing the harms, I wonder how helpful it is. Like him, I read them out to a focus group. Is it helpful to write these things, for example emojis, down? Will that not encourage the platforms to over-panic? That is my concern.
On the noble Baroness’s point, that is why I intervened in the debate: so that we are all clear. For priority content, it is an amber light, not a red light. We are not saying, “Just remove all this stuff”; it would be a wrong response to the Bill to say, “It’s a fictional character being slaughtered so remove it”, because then we would have removed “Twilight”, “Watership Down” and whatever else. We are saying, “Think very carefully”. If it is one of those circumstances where this is causing harm—they exist; we cannot pretend that they do not—it should be removed. However, the default should not be to remove everything on this list; that is the point I am really trying to make.
I certainly will think about it, but the difficulty is the scale of the material and the speed with which we want these assessments to be made and that light to be lit, in order to make sure that people are properly protected.
My noble friend Lord Moylan asked about differing international terminology. In order for companies to operate in the United Kingdom, they must have an understanding of the United Kingdom, including the English-language terms used in our legislation. He made a point about the Equality Act 2010. While the Bill uses the same language, it does not extend the Equality Act to this part of the Bill; in particular, it does not create a new offence.
The noble Baroness, Lady Fox, also mentioned the Equality Act when she asked about the phraseology relating to gender reassignment. We included this wording to ensure that the language used in the Bill matches Section 7(1) of the Equality Act 2010 and that gender reassignment has the same meaning in the Bill as it does in that legislation. As has been said by other noble Lords—
I clarify that what I said was aimed at protecting children. Somebody corrected me and asked, “Do you know that this says ‘abusive’?”—of course I do. What I suggested was that this is an area that is very contentious when we talk about introducing it to children. I am thinking about safeguarding children in this instance, not just copying and pasting a bit of an Act.
As was pointed out by others in the debate, the key provision in Amendment 172 is subsection (2) of the proposed new clause, which relates to:
“Content which is abusive and which targets any of the following characteristics”.
It must both be abusive and target the listed characteristics. It does not preclude legitimate debate about those things, but if it were abusive on the basis of those characteristics—rather akin to the debate we had in the previous group and the points raised by the noble Baroness, Lady Kennedy of The Shaws, about people making oblique threats, rather than targeting a particular person, by saying, “People of your characteristic should be abused in the following way”—it would be captured.
I will keep this short, because I know that everyone wants to get on. It could be said that it is abusive to misgender someone; in the context of what is going on in sixth forms and schools, I suggest that this is a problem. It has been suggested that showing pictures of the Prophet Muhammad in an RE lesson—these are real-life events that happen offline—is abusive. I am suggesting that it is not as simple as saying the word “abusive” a lot. This is a highly contentious and politicised area; I want that contention to end, but I think that this provision will exacerbate it, not help.
(1 year, 4 months ago)
Lords ChamberMy Lords, I am happy to acknowledge and recognise what the Government did when they created user empowerment duties to replace legal but harmful. I think they were trying to counter the dangers of over-paternalism and illiberalism that oblige providers to protect adult users from content that allegedly would cause them harm.
At least the new provisions brought into the Bill have a completely different philosophy. They enhance users’ freedom as individuals and allow them, as a matter of freedom of choice, to apply voluntary content filters, on the principle that adults can make decisions for themselves.
In case anyone panics, I am not making a philosophical speech. I am reminding the Government that that is what they said to us—to everybody—“We are getting rid of legal but harmful because we believe in this principle”. I am worried that some of the amendments seem to be trying to backtrack from that different basis of the Bill—and that more liberal philosophy—to go back to the old legal but harmful. I say to the noble Lord, Lord Allan of Hallam, that the cat is distinctly not dead.
The purpose of Amendment 56 is to try to ensure that providers also cannot thwart the purpose of Clause 12 and make it more censorious and paternalistic. I am not convinced that the Government needed to compromise on this as I think Amendment 60 just muddies the waters and fudges the important principle that the Government themselves originally established.
Amendment 56 says that the default must be no filtering at all. Then users have to make an active decision to switch on the filtering. The default is that you should be exposed to a full flow of ideas and, if you do not want that, you have to make an active choice and say that you want a bowdlerised or sanitised version.
Amendment 56 takes it a bit further, in paragraph (b), and applies different levels of filtering in terms of content of democratic importance and journalistic content. In the Bill itself, the Government accept the exceptional nature of those categories of content, and this just allows users to do the same and say, “No; I might want to filter some things out but bear in mind the exceptional importance of democratic and journalistic content”. I worry that the government amendments signal to users that certain ideas are dangerous and must be hidden. That is my big concern. In other words, they might be legal but they are harmful: that is what I think these amendments try to counter.
One of the things that worries me about the Bill is the danger of echo chambers. I know we are concentrating on harms, but I think echo chambers are harmful. I started today quite early at Blue Orchid at 55 Broadway with a big crowd of sixth formers involved in debating matters. I complimented Keir Starmer on his speech on the importance of oracy and encouraging young people to speak. I stressed to all the year 12 and year 13 young people that the important thing was that they spoke out but also that they listened to contrary opinions and got out of their safe spaces and echo chambers. They were debating very difficult topics such as commercial surrogacy, cancel culture and the risks of contact sports. I am saying all that to them and then I am thinking, “We have now got a piece of legislation that says you can filter out all the stuff you do not want to hear and create your own safe space”. So I just get anxious that we do not inadvertently encourage in the young—I know this is for all adults—that antidemocratic tendency to not want to hear what you do not want to hear, even when it would be good to hear as many opinions as possible.
I also want to press the Minister on the problem of filtering material that targets race, religion, sex, sexual orientation, disability and gender reassignment. I keep trying to raise the problem that it could lead to diverse philosophical views around those subjects also being removed by overzealous filtering. You might think that you know what you are asking to be filtered out. If you say you want to filter out material that is anti-religion, you might not mean that you do not want any debates on religious tolerance. For example, there was that major controversy over the film “The Lady of Heaven”. I know the Minister was interested, as I was, in the dangers of censorship in relation to that. You would not want to find, because you had said, “Don’t target me for my religion”, that you could not access that debate.
I think there is a danger that we are handing a lot of power to filterers to make filtering decisions based on their values, when we are not clear about what those values are. Look at what has happened with the banks in the last few days: they have closed down people’s accounts because they disagree with those people’s values. Again, we say “Don’t target on race”, but I have been having lots of arguments with people recently who have accused the Government, through their Illegal Migration Bill, of being racist. I think we just need to know that we are not accepting an ideological filtering of what we see.
Amendment 63 is key because it requires providers’ terms of service to include provisions about how content to which Clause 12(2) applies is identified, precisely to try to counter these problems. It imposes a duty on providers to apply those provisions consistently, as the noble Lord, Lord Moylan, explained. The point that providers have to set out how they identify content that is allegedly hostile, for example, to religion, or racially abusive, is important because this is about empowering users. Users need to know whether this will be done by machine learning or by a human. Do they look for red flags and, if so, what are the red flags? How are these things decided? That means that providers have to state clearly, and be accountable for, their definition of any criteria that could justify them filtering out and disturbing the flow of democratic information. It is all about transparency and accountability in that sense.
Finally, in relation to Amendment 183, I am worried about the notion of filtering out content from unverified users, for a range of reasons. It indicates somehow that there is a direct link between being unverified or anonymous and being harmful or dodgy, which I think is illegitimate. It has already been explained that there will be a detrimental impact on certain organisations—we have talked about Reddit, but I like to remember Mumsnet. There are quite a lot of organisations with community-centred models, where the structure is that influencers broadcast to their followers and where there are pseudonymous users. Is the requirement to filter out those contributors likely to lead to those models collapsing? I need to be reassured on this because I am not convinced at all. As has been pointed out, there will be a two-tier internet, because those who are unable or unwilling to disclose their identity online or to be verified would be, or could be, shut out from public discussions. That is a very dangerous place to have ended up, even though I am sure it is not what the Government intend.
My Lords, I am grateful for the broad, if not universal, support for the amendments that we have brought forward following the points raised in Committee. I apologise for anticipating noble Lords’ arguments, but I am happy to expand on my remarks in light of what they have said.
My noble friend Lord Moylan raised the question of non-verified user duties and crowdsourced platforms. The Government recognise concerns about how the non-verified user duties will work with different functionalities and platforms, and we have engaged extensively on this issue. These duties are applicable only to category 1 platforms, those with the largest reach and influence over public discourse. It is therefore right that such platforms have additional duties to empower their adult users. We anticipate that these features will be used in circumstances where vulnerable adults wish to shield themselves from anonymous abuse. If users decide that these features are restricting their experience on a particular platform, they can simply choose not to use them. In addition, before these duties come into force, Ofcom will be required to consult affected providers regarding the codes of practice, at which point it will consider how these duties might interact with various functionalities.
My noble friend and the noble Lord, Lord Allan of Hallam, raised the potential for being bombarded with pop-ups because of the forced-choice approach that we have taken. These amendments have been carefully drafted to minimise unnecessary prompts or pop-ups. That is why we have specified that the requirement to proactively ask users how they want these tools to be applied is applicable only to registered users. This approach ensures that users will be prompted to make a decision only once, unless they choose to ignore it. After a decision has been made, the provider should save this preference and the user should not be prompted to make the choice again.
The noble Lord, Lord Clement-Jones, talked further about his amendments on the cost of user empowerment tools as a core safety duty in the Bill. Category 1 providers will not be able to put the user empowerment tools in Clause 12 behind a pay wall and still be compliant with their duties. That is because they will need to offer them to users at the first possible opportunity, which they will be unable to do if they are behind a pay wall. The wording of Clause 12(2) makes it clear that providers have a duty to include user empowerment features that an adult user may use or apply.