(1 year, 4 months ago)
Lords Chamber
My Lords, as the noble Lords, Lord Stevenson and Lord Clement-Jones, have already said, the Communications and Digital Select Committee did indeed recommend a new Joint Committee of both Houses to look specifically at the various aspects of Ofcom’s implementation of what will be the Online Safety Act and its ongoing regulation of digital matters. It is something I still have a lot of sympathy for. However, there has not been much appetite for such a Joint Committee at the other end of the Corridor. I do not necessarily think we should give up on that, and I will come back to it in a moment, but I am not keen on what is proposed in its place in Amendment 239: my fear is that, as laid out, it appears too burdensome and would probably introduce too much delay in implementation.
To return to the bigger question, I think that we as parliamentarians need to reflect on our oversight of regulators, to which we are delegating significant new powers and requiring them to adopt a much more principles-based approach to regulation to cope with the fast pace of change in the technological world. We have to reflect on whether our current set-up is adequate for the way in which that is changing. What I have in mind is very much a strategic level of oversight, rather than scrutinising operational decisions, although, notwithstanding what the noble Lord has said, something specific in terms of implementation of the Bill and other new legislation is an area I would certainly wish to explore further.
The other aspect of this is making sure that our regulators keep pace too, not just with technology, and apply the new powers we give them in a way which meets our original intentions, but with the new political dynamics. Earlier today in your Lordships’ Chamber, there was a Question about how banks are dealing with political issues, and that raises questions about how the FCA is regulating the banking community. We must not forget that the Bill is about regulating content, and that makes it ever more sensitive. We need to keep reminding ourselves about this; it is very new and very different.
As has been acknowledged, there will continue to be a role for the Communications and Digital Select Committee, which I have the great privilege of chairing, in overseeing Ofcom. My noble friend Lord Grade and Dame Melanie Dawes appeared before us only a week ago. There is a role for the SIT Committee in the Commons; there is probably also some kind of ongoing role for the DCMS Select Committee in the Commons, although I am not sure. In a way, the fractured nature of that oversight makes it all the more critical that we join up a bit more. So I will take it upon myself to give this more thought and speak to the respective chairs of those committees in the other place, but I think that at some point we will need to consider, in some other fora, the way in which we are overseeing the work of regulators.
At some point, I think we will need to address the specific recommendations in the pre-legislative committee’s report, which were very much in line with what my own committee thought was right for the future of digital regulatory oversight, but on this occasion, I will not be supporting the specifics of Amendment 239.
My Lords, very briefly, I was pleased to see this, in whatever form it takes, because as we finish off the Bill, one thing that has come up consistently is that some of us have raised problems of potential unintended consequences, such as whether age gating will lead to a huge invasion of the privacy of adults rather than just narrowly protecting children, or whether the powers given to Ofcom will turn it into the most important and powerful regulator in the country, if not in Europe. In a highly complex Bill, is it possible for us to keep our eye on it a bit more than just by whingeing on the sidelines?
The noble Baroness, Lady Stowell, makes a very important point about the issue in relation to the FCA and banking. Nobody intended that to be the outcome of PEPs, for example, and nobody intended when they suggested encouraging banks to have values such as ESG or EDI—equality, diversity and inclusion—that that would lead to ordinary citizens of this country being threatened with having their banking turned off. It is too late to then retrospectively say, “That wasn’t what we ever intended”.
(1 year, 4 months ago)
My Lords, I am completely opposed to Amendments 159 and 160, but the noble Lords, Lord Faulks and Lord Black, and the noble Viscount, Lord Colville, have explained the issues perfectly. I am fully in agreement with what they said. I spoke at length in Committee on that very topic. This is a debate we will undoubtedly come back to in the media Bill. I, for one, am extremely disappointed that the Labour Party has said that it will not repeal Section 40. I am sure that these issues will get an airing elsewhere. As this is a speech-limiting piece of legislation, as was admitted earlier this week, I do not want any more speech limiting. I certainly do not want it to be a media freedom-limiting piece of legislation on top of that.
I want to talk mainly about the other amendments, Amendments 158 and 161, but approach them from a completely different angle from the noble Lord, Lord Allan of Hallam. What is the thinking behind saying that the only people who can clip content from recognised news publishers are the news publishers? The Minister mentioned in passing that there might be a problem of editing them, but it has become common practice these days for members of the public to clip from recognised news publishers and make comments. Is that not going to be allowed? That was the bit that completely confused me. It is too prescriptive; I can see all sorts of people getting caught by that.
The point that the noble Lord, Lord Allan of Hallam, made about what constitutes a recognised news publisher is where the issue gets quite difficult. The point was made about the “wrong” organisations, but I want to know who decides what is right and wrong. We might all nod along when it comes to Infowars and RT, but there are lots of organisations that would potentially fail that test. My concern is that they would not be able to appeal when they are legitimate news organisations, even if not to everybody’s taste. Because I think that we already have too much speech limiting in the Bill, I do not want any more. This is important.
When it comes to talking about the “wrong” organisations, I noticed that the noble Lord, Lord McNally, referred to people who went to Rupert Murdoch’s parties. I declare my interests here: I have never been invited or been to a Rupert Murdoch party—although do feel free, I say, if he is watching—but I have read about them in newspapers. For some people in this Chamber, the “wrong” kind of news organisation is, for example, the Times or one with the wrong kind of owner. The idea that we will all agree or know which news publishers are the “wrong” kind is not clear, and I do not think that the test is going to sort it out.
Will the Minister explain what organisations can do if they fail the recognised news publisher test to appeal and say, “We are legitimate and should be allowed”? Why is there this idea that a member of the public cannot clip a recognised news publisher’s content without falling foul? Why would they not be given some exemption? I genuinely do not understand that.
My Lords, I shall speak very briefly. I feel a responsibility to speak, having spoken in Committee on a similar group of amendments when the noble Lords, Lord Lipsey and Lord McNally, were not available. I spoke against their amendments then and would do so again. I align myself with the comments of my noble friend Lord Black, the noble Lord, Lord Faulks, and the noble Viscount, Lord Colville. As the noble Baroness, Lady Fox, just said, they gave a comprehensive justification for that position. I have no intention of repeating it, or indeed repeating my arguments in Committee, but I think it is worth stating my position.
(1 year, 6 months ago)
My Lords, I start by commending my noble friend Lady Morgan on her clear introduction to this group of amendments. I also commend the noble Baroness, Lady Kidron, on her powerful speech.
From those who have spoken so far, we have a clear picture of the widespread nature of some of the abuse and offences that women experience when they go online. I note from what my noble friend Lady Morgan said that there is widespread support from a range of organisations outside the Committee for this group of amendments. She also made an important and powerful point about the potential chilling effect of this kind of activity on women, including women in public life, being able to exercise their right to freedom of expression.
I feel it is important for me to make it clear that—this is an obvious thing—I very much support tough legal and criminal sanctions against any perpetrator of violence or sexual abuse against women. I really do understand and support this, and hear the scale of the problem that is being outlined in this group of amendments.
Mine is a dissenting voice, in that I am not persuaded by the proposed solution to the problem that has been described. I will not take up a lot of the Committee’s time, but any noble Lords who were in the House when we were discussing a group of amendments on another piece of legislation earlier this year may remember that I spoke against making misogyny a hate crime. The reason why I did that then is similar, in that I feel somewhat nervous about introducing a code of conduct which is directly relevant to women. I do not like the idea of trying to address some of these serious problems by separating women from men. Although I know it is not the intention of a code such as this or any such measures, I feel that it perpetuates a sense of division between men and women. I just do not like the idea that we live in a society where we try to address problems by isolating or categorising ourselves into different groups of people, emphasising the sense of weakness and being victims of any kind of attack or offence from another group, and assuming that everybody who is in the other group will be a perpetrator of some kind of attack, criticism or violence against us.
My view is that, in a world where we see some of this serious activity happening, we should do more to support young men and boys to understand the proper expectations of them. When we get to the groups of amendments on pornography and what more we can do to prevent children’s access to it, I will be much more sympathetic. Forgive me if this sounds like motherhood and apple pie, but I want us to try to generate a society where basic standards of behaviour and social norms are shared between men and women, young and old. I lament how so much of this has broken down, and a lot of the problems we see in society are the fault of political and—dare I say it?—religious leaders not doing more to promote some of those social norms in the past. As I said, I do not want us to respond to the situation we are in by perpetuating more divisions.
I look forward to hearing what my noble friend the Minister has to say, but I am nervous about the solution proposed in the amendments.
My Lords, it gives me great pleasure to follow the noble Baroness, Lady Stowell of Beeston, not least because she became a dissenting voice, and I was dreading that I might be the only one.
First, I think it important that we establish that those of us who have spent decades fighting violence against women and girls are not complacent about it. The question is whether the physical violence we describe in the Bill is the same as the abuse being described in the amendments. I worry about conflating online incivility, abuse and vile things said with physical violence, as is sometimes done.
I note that Refuge, an organisation I have a great deal of respect for, suggested that the user empowerment duties, which place the burden on women users to filter their own online experience, were the same as asking women to take control of their own safety and protect themselves from violence offline. I thought that was unfair, because user empowerment duties, and deciding what you filter out, can be women using their agency.
(1 year, 7 months ago)
That is very helpful.
I am concerned that removing so-called illegal content for the purpose of complying with the regulatory system covers not only that which reaches conviction in a criminal court but possibly anything that a platform determines could be illegal, and therefore it undermines our own legal system. As I have said, that marks a significant departure from the rule of law. It seems that the state is asking or mandating private companies to make determinations about what constitutes illegality.
The obligations on a platform to determine what constitutes illegality could obviously become a real problem, particularly in relation to limitations on free expression. As we have already heard, the Public Order Act 1986 criminalises, for example, those who stir up hatred through the use of words, behaviour or written material. That is contentious in the law offline. By “contentious”, I mean that it is a matter of difficulty that requires the full rigour of the criminal justice system, understanding the whole history of established case law. That is all necessary to make a conviction under that law for offences of this nature.
Now we appear to be saying that, without any of that, social media companies should make the decision, which is a nerve-racking situation to be in. We have already heard the slippery phrase “reasonable grounds to infer”. If that was the basis on which you were sent to prison—if they did not have to prove that you were guilty but they had reasonable grounds to infer that you might be, without any evidence—I would be worried, yet reasonable grounds to infer that the content could be illegal is the basis on which we are asking for those decisions to be made. That is significantly below the ordinary burden of proof required to determine that an illegal act has been committed. Under this definition, I fear that platforms will be forced to over-remove and censor what will ultimately be entirely lawful speech.
Can the Minister consider what competency social media companies have to determine what is lawful? We have heard some of the dilemmas from somebody who was in that position—let alone the international complications, as was indicated. Will all these big tech companies have to employ lots of ex-policemen and criminal lawyers? How will it work? It seems to me that there is a real lack of qualifications in that sphere— that is not a criticism, because those people decided to work in big tech, not in criminal law, and yet we are asking them to pursue this. That is a concern.
I will also make reference to what I think are the controversies around government Amendments 136A and 136B to indicate the difficulties of these provisions. They concern illegal activity—such as “assisting unlawful immigration”, illegal entry, human trafficking and similar offences—but I am unsure as to how this would operate. While it is the case that certain means of entry to the UK are illegal, I suddenly envisage a situation where a perfectly legitimate political debate—for example, about the small boats controversy—would be taken down, and that people advocating for a position against the Government’s new Illegal Migration Bill could be accused of supporting illegality. What exactly will be made illegal in those amendments to the Online Safety Bill?
The noble Baroness, Lady Buscombe, made a fascinating speech about an interesting group of amendments. Because of the way the amendments are grouped, I feel that we have moved to a completely different debate, so I will not go into any detail on this subject. Anonymous trolling, Twitter storms and spreading false information are incredibly unpleasant. I am often the recipient of them—at least once a week—so I know personally that you feel frustrated that people tell lies and your reputation is sullied. However, I do not think that these amendments offer the basis on which that activity should be censored, and I will definitely argue against removing anonymity clauses—but that will be in another group. It is a real problem, but I do not think that the solution is contained in these amendments.
My Lords, my contribution will be less officious than my intervention earlier in this group. In the last couple of years since I returned to the House—as I describe it—having spent time at the Charity Commission, I have noticed a new practice emerging of noble Lords reading out other people’s speeches. Every time I had seen it happen before, I had not said anything, but today I thought, “I can’t sit here and not say anything again”. I apologise for my intervention.
I am grateful to my noble friend Lord Moylan for bringing forward his amendments and for introducing them in the incredibly clear way he did; they cover some very complex and diverse issues. I know that there are other amendments in the group which might be described as similar to his.
There are a couple of things I want to highlight. One interesting thing about the debate on this group is the absence of some of our legal friends—I apologise to my noble friend Lady Buscombe, who is of course a very distinguished lawyer. The point I am making is that we are so often enriched by a lot of legal advice and contributions on some of the more challenging legal issues that we grapple with, but we do not have that today, and this is a very difficult legal issue.
It is worth highlighting again, as has been touched on a little in some of the contributions, the concern, as I understand it, with how the Bill is drafted in relation to illegal content and the potential chilling effect of these clauses on social media platforms. As has already been said, there is a concern that it might lead them to take a safety-first approach in order to avoid breaking the law and incurring the sanctions and fines that come with the Bill, which Ofcom will have the power to apply. That is the point we are concerned with here. It is the way in which this is laid out, and people who are much better equipped than I am have already explained the difference between evidence versus reasonable grounds to infer.
What the noble Lord, Lord Allan, hit on in his contribution is also worth taking into account, and that is the role of Ofcom in this situation. One of the things I fear, as we move into an implementation phase and the consequences of the Bill start to impact on the social media firms, is the potential for the regulator to be weaponised in a battle on the cultural issues that people are becoming increasingly exercised about. I do not have an answer to this, but I think it is important to understand the danger of where we might get to in the expectations of the regulator if we create a situation where the social media platforms are acting in a way that means people are looking for recourse or a place to generate further an argument and a battle that will not be helpful at all.
I am not entirely sure, given my lack of legal expertise —this is why I would have been very grateful for some legal expertise on this group—whether what my noble friend is proposing in his amendments is the solution, but I think we need to be very clear that this is a genuine problem. I am not sure, as things stand in the Bill, that we should be comfortable that it is not going to create problems. We need to find a way to be satisfied that this has been dealt with properly.
(1 year, 7 months ago)
The noble Lord has concluded with my conclusion, which was to say that those services will be driven out, but not because they are irresponsible around horrible, dangerous messages. They do not read our messages because they are private. However, if we ever receive anything that makes us feel uncomfortable, they should be put under pressure to act. Many of them already do and are actually very responsible, but that is different from demanding that they scan our messages and we breach that privacy.
My Lords, that last exchange was incredibly helpful. I am grateful to the noble Lord, Lord Allan, for what he just said and the way in which he introduced this group. I want to make only a few brief remarks.
I have put my name to two amendments in this group: Amendment 202 in the name of the noble Lord, Lord Stevenson, which seeks to ensure that Ofcom will be subject to the same kind of requirements and controls as exist under the Regulation of Investigatory Powers Act before issuing a technology notice
“to a regulated service which offers private messaging with end-to-end encryption”;
and Amendment 285, also in the name of the noble Lord, Lord Stevenson, and that of the noble Lord, Lord Clement-Jones. This amendment would make sure that no social media platforms or private end-to-end messaging services have an obligation generally to monitor what is going on across their platforms. When I looked at this group and the various amendments in it, those were the two issues that I thought were critical. These two amendments seemed to approach them in the most simple and straightforward manner.
Like other noble Lords, my main concern is that I do not want search and social media platforms to have an obligation to become what we might describe as thought police. I do not want private messaging firms to start collecting and storing the content of our messages so that they have what we say ready to hand over in case they are required to do so. What the noble Lord, Lord Allan, just said is an important point to emphasise. Some of us heard from senior representatives from WhatsApp a few weeks ago. I was quite surprised to learn how much they are doing in this area to co-operate with the authorities; I felt very reassured to learn about that. I in no way want to discourage that because they are doing an awful lot of good stuff.
Basically, this is such a sensitive matter, as has been said, that it is important for the Government to be clear what their policy intentions are by being clear in the Bill. If they do not intend to require general monitoring, that needs to be made explicit. It is also important that, if Ofcom is to be given new investigatory powers or powers to insist on things through these technology notices, it is clear that its powers do not go beyond those that are already set out in law. As we have heard from noble Lords, there is widespread concern about this matter not just from the social media platforms and search engines themselves but from news organisations, journalists and those lobby groups that often speak out on liberty-type matters. These topics go across a wide range of interest groups, so I very much hope that my noble friend the Minister will be able to respond constructively and open-mindedly on them.
My Lords, I was not intending to intervene on this group because my noble friend Lord Stevenson will address these amendments in their entirety, but listening in to this public conversation about this group of amendments has stimulated a question that I want both to put on the record and to give the Minister time to reflect on.
If we get the issues of privacy and encrypted messaging wrong, it will push more people into using VPN—virtual private network—services. I went into the app store on my phone to search for VPN software. There is nothing wrong with such software—our parliamentary devices have it to do general monitoring and make sure that we do not use services such as TikTok—but it is used to circumvent much of the regulatory regime that we are seeking to put together through this Bill. When I search for VPNs in the app store, the first one that comes up that is not a sponsored, promoted advertisement has an advisory age limit of four years old. Several of them are the same; some are 17-plus but most are four-plus. Clearly, the app promotes itself very much on the basis that it offers privacy and anonymity, which are the key features of a VPN. However, a review of it says, “I wouldn’t recommend people use this because it turns out that this company sends all its users’ data to China so that it can do general monitoring”.
I am not sure how VPNs are being addressed by the Bill, even though they seem really pertinent to the issues of privacy and encryption. I would be interested to hear whether—and if we are, how—we are bringing the regulation and misuse of VPNs into scope for regulation by Ofcom.