Online Safety Bill Debate
Lord Curry of Kirkharle (Crossbench - Life peer), debate with the Department for Digital, Culture, Media & Sport
(1 year, 7 months ago)
Lords Chamber

My Lords, in view of the hour, I will be brief, and I have no interests to declare other than that I have grandchildren. I rise to speak to a number of amendments tabled in my name in this group: Amendments 216A to 216C, 218ZZA to 218ZD and 218BA to 218BC. I do not think I have ever achieved such a comprehensive view of the alphabet in a number of amendments.
These amendments carry a simple message: Ofcom must act decisively and quickly. I have tabled them out of a deep concern that the Bill does not specify timescales or obligations within which Ofcom is required to act. It leaves Ofcom, as the regulator, with huge flexibility and discretion as to when it must take action; some action, indeed, could go on for years.
Phrases such as
“OFCOM may vary a confirmation decision”
or it
“may apply to the court for an order”
are not strong enough, in my view. If unsuitable or harmful material is populating social media sites, the regulator must take action. There is no sense of urgency within the drafting of the Bill. If contravention is taking place, action needs to be taken very quickly. If Ofcom delays taking an action, the harmful influence will continue. If the providers of services know that the regulator will clamp down quickly and severely on those who contravene, they are more likely to comply in the first place.
I was very taken by the earlier comments of the noble Baroness, Lady Harding, about putting additional burdens on Ofcom. These amendments are not designed to put additional burdens on Ofcom; indeed, the noble Lord, Lord Knight, referred to the fact that, for six years, I chaired the Better Regulation Executive. It was my experience that regulators that had a reputation for acting quickly and decisively, and being tough, had a much more compliant base as a consequence.
Noble Lords will be pleased to hear that I do not intend to go through each individual amendment. They all have a single purpose: to require the regulator—in this case, Ofcom—to act when necessary, as quickly as possible within specified timescales; and to toughen up the Bill to reduce the risk of continuous harmful content being promoted on social media.
I hope that the Minister will take these comments in the spirit in which they are intended. They are designed to help Ofcom and to help reduce the continuous adverse influence that many of these companies will propagate if they do not think they will be regulated severely.
My Lords, I understand that, for legislation to have any meaning, it has to have some teeth and you have to be able to enforce it; otherwise, it is a waste of time, especially with something as important as the legislation that we are discussing here.
I am a bit troubled by a number of the themes in these amendments and I therefore want to ask some questions. I saw that the Government had tabled these amendments on senior manager liability, then I read amendments from both the noble Lord, Lord Bethell, and the Labour Party, the Opposition. It seemed to me that even more people would be held liable and responsible as a result. I suppose I have a dread that—even with the supply chain amendment—this means that lots of people are going to be sacked. It seems to me that this might spiral dangerously out of control and everybody could get caught up in a kind of blame game.
I appreciate that I might not have understood, so this is a genuine attempt to do so. I am concerned that these new amendments will force senior managers and, indeed, officers and staff to take an extremely risk-averse approach to content moderation. They now have not only to cover their own backs but to avoid jail. One of my concerns has always been that this will lead to the over-removal of legal speech, and more censorship, so that is a question I would like to ask.
I also want to know how noble Lords think this sits with the ambition for the UK to be a science and technology superpower. Understandably, some people have argued that these amendments make the UK a hostile environment for digital investment, and there is a balance to be struck there. Is there a risk that this will lead to the withdrawal of services from the UK? Will it make working for these companies unattractive to British staff? We have already heard that Jimmy Wales has vowed that the Wikimedia Foundation will not scrutinise posts in the way demanded by the Bill. Is he going to be thrown in prison, or will Wikipedia pull out? How do we get the balance right?
What exactly is the criminal offence that carries the threat of a prison sentence? I might have misunderstood, but a technology company manager could fail to prevent a child or young person encountering legal but none the less allegedly harmful speech, be considered in breach of these amendments and get sent to prison. We have to be very careful that we understand what this harmful speech is, as we discussed previously. The threshold for harm, which encompasses physical and psychological harm, is vast and could mean people going to prison without the precise criminal offence being clear. We talked previously about VPNs. If a tech-savvy 17-year-old uses a VPN and accesses some of this harmful material, will someone potentially be criminally liable for that young person getting around the law, find themselves accused of dereliction of duty and become a criminal?
My final question is on penalties. When I was looking at this Bill originally and heard about the eye-watering fines that some Silicon Valley companies might face, I thought, “That will destroy them”. Of course, to them it is the mere blink of an eye, and I do get that. This indicates to me, given the endless conversations we have had on whether size matters, that in this instance size does matter. The same kind of liabilities will be imposed not just on the big Silicon Valley monsters that can bear these fines, but on Mumsnet—or am I missing something? Mumsnet might not be the correct example, but could not smaller platforms face similar liabilities if a young person inadvertently encounters harmful material? It is not all malign people trying to do this; my unintended consequence argument is that I do not want to create criminals when a crime is not really being committed. It is a moral dilemma, and I do understand the issue of enforcement.
Online Safety Bill Debate
Lord Curry of Kirkharle (Crossbench - Life peer), debate with the Department for Digital, Culture, Media & Sport
(1 year, 5 months ago)
Lords Chamber

My Lords, I want to speak to Amendment 218JA in this group, in my name, to which the noble Baroness, Lady Morgan of Cotes, has added her name. This is really trying to understand what the Government’s intentions are in respect of access restriction orders.
Just to take a step back: with the Online Safety Bill we are creating, in effect, a licensing regime for in-scope services and saying that, if you want to operate in the United Kingdom and you are covered by the Bill—whether that is the pornography services that the noble Lord, Lord Bethell, referred to, or a user-to-user or search service—here are the conditions to which you must adhere. That includes paying a fee to Ofcom for your supervision and then following the many thousands of pages of guidance that I suspect we will end up producing and issuing to those companies. So what we are exploring here is what happens if a particular organisation decides not to take up the offer of a licence.
Again, to go back to the previous debate, success for the Bill would be that it has a sufficient deterrent effect that the problems we are seeking to fix are addressed. I do not think we are looking to block services or for them to fail—we are looking for them to succeed. So stage one is that Ofcom asks them nicely. It says, “You want to operate in the UK; here is what you need to do—it’s a reasonable set of requests we are making”, and the services say, “Fine”. If not, they choose to self-limit—and it is quite trivial for any online service to say, “I’m going to check incoming traffic, and if this person looks like they are coming from the UK, I’m not going to serve them”. Self-limiting in that way would be the preferable option for a service that chose not to accept the licence conditions.

But let us assume that the service has accepted the licence conditions and that Ofcom is monitoring it on a routine basis. If Ofcom thinks it is not meeting its requirements, whether that is to produce a risk assessment or to fulfil its duty of care, Ofcom will instruct it to do something. If it fails to follow that instruction, we are in the territory of the amendments we are considering here: either the service has refused to accept the licence conditions and to self-limit, or it has accepted them but failed to do what we expect of it. It has signed up but does not take the regime seriously and is not doing the things we expect it to do.
At that point, Ofcom has to consider what it can do. The first stage, quite rightly, in the group of clauses that we are looking at, is that Ofcom can bring in these business disruption measures. As the noble Lord, Lord Bethell, rightly pointed out, in many instances that will be effective. Any commercial service—not just pornography services but any online service that depends on advertising—that is told it can no longer take credit card payments from UK businesses to advertise on the service will, one hopes, either conclude, “That’s the end of my business in the UK—I may as well cut myself off”, or come into line. If it wants to operate, it will come into line, because that way it gets its payment services restored. But there will be others for which that is insufficient—perhaps that is not their business model—and they will carry on regardless. At that point, we may want to consider the access restrictions.
In a free society, none of us should take pleasure in the idea that we are going to restrict internet services or block them. That is not our first instinct; rather, it is potentially a necessary evil. At some point, there may be services that are so harmful and so oblivious to the regime we put in place that we need to block them. Here we are trying to explore what would happen in those circumstances.

The first kind of block is one that we are used to, and we do it today for copyright-infringing sites and a small number of other sites that break the law. We instruct service providers such as BT and TalkTalk to implement a network-level block. There are ways you can do that—various technical ways that we do not need to go into in this debate—whereby we can seek to make it so that an ordinary UK user, when they type in www.whatever, will not get to the website. But increasingly people are using technology that will work around that. Browsers, for example, may encrypt the traffic between your web browser and the online service such that TalkTalk or BT or the access provider has no visibility of where you are going and no capability of blocking it. BT has rightly raised that. There will be different views about where we should go with this, but the question of what the Government’s intentions are is absolutely legitimate, and that is what we want to try to tease out with this amendment.
Again, we should be really candid. Somebody who is determined to bypass all the access controls will do so. There is no world in which we can guarantee that somebody with a UK internet connection will never get to a particular website. What we are seeking to do is to make violating services unavailable for most of the people most of the time. We would be unhappy if it were only some of the people some of the time, but it is not going to be all of the people all of the time. So the question is: what constitutes a sufficient access restriction either to bring them to heel or to ensure that, over the longer term, the harm is not propagated because the services are generally unavailable? It would be really helpful if the Minister were able to tease that out.
Certainly, in my view, there are services such as Tor—the Onion Router—where there is no entity that you can ask to block anything, so if someone is using that, there is nothing you can reasonably do. At the other end of the spectrum, there are providers such as BT and TalkTalk, where it is relatively straightforward to tell them that they should block. Then there are people in between, such as browser owners that are putting in place these encrypted tunnels for very good reasons of privacy, but which also add value in other ways—helping to manage bandwidth better, and so on. Is it the Government’s intention that they are going to be served with access restriction orders? That is a valid question. We might have different views about the right solution, but it is really important for the sector to understand and be able to prepare if that is the Government’s intention. So we need to tease that out; that is the area in which we are looking for answers from the Government.
The second piece is to think about the long term. If our prediction—or our hope and expectation—is that most companies will come into line, that is fine; the internet will carry on as it does today but in a safer way. However, if we have misjudged the mood, and a significant number of services simply thumb their noses at Ofcom and say, “We are not going to play—block us if you dare”, that potentially has significant consequences for the internet as it will operate in the United Kingdom. It would be helpful to understand from the Minister whether the Government have any projections or predictions as to which way we are going to go. Are we talking about the vast majority of the internet continuing as it is today within the new regime, with the odd player that will be outside that, or is it the Government’s expectation that there may need to be blocking of significant numbers of services, essentially for the foreseeable future?
Other countries such as France and Germany have been dealing with this recently, as the noble Lord, Lord Bethell, is probably aware. They have sought to restrict access to pornography services, and there have been all sorts of consequent knock-on effects and challenges at a technical level. It would be helpful to understand whether our expectation is that we will see the same in the United Kingdom or that something else is going to happen. If the Government do not have that information today, or if they have not made those projections, it would be helpful to know their thinking on how this might play out. Who will be able to inform us as to what the future landscape is likely to look like as it evolves, and as Ofcom gains these powers and starts to instruct companies that they must obtain licences, and then seeks to take enforcement action against those that choose not to play the game?
My Lords, I support Amendment 217 in the name of the noble Lord, Lord Bethell, and very much support the comments that he has made. I will speak to Amendments 218C, 218E, 218H and 218K in my name within this group. I also support the intent of the other amendments in this group tabled by the noble Lord, Lord Bethell.
I appreciate the process helpfully outlined by the noble Lord, Lord Allan. However, when looking at Ofcom’s implementation of existing provisions on video-sharing platforms, the overwhelming impression is of a very drawn-out process, with Ofcom failing to hold providers to account. Despite being told by Ofcom that a simple tick-box declaration by the user confirming that they are over 18 is not sufficient age verification, some providers are still using only that system. Concerningly, Ofcom has not taken decisive action.
When children are at severe risk, it is not appropriate to wait. Why, for example, should we allow porn sites to continue to host 10 million child sexual abuse videos while Ofcom simply reports that it is continuing to partner with these platforms to get a road map of action together? As has been mentioned by the noble Lord, Lord Bethell, Visa and Mastercard did not think it was appropriate to wait in such circumstances—they just acted.
Similarly, when systems are not in place to protect children from accessing pornography, we cannot just sit by and allow all the egregious associated harms to continue. Just as in Formula 1, when a red flag is raised and the cars must stop and go into the pits until the dangerous debris is cleared, sometimes it is too dangerous to allow platforms to operate until the problems are fixed. It seems to me that platforms would act very swiftly to put effective systems and processes in place if they could not operate in the interim.
The Bill already contains this emergency handbrake; the question is when it should be used. My answer is that it should be used when the evidence of severe harm presents itself, and not only when the regulator has a moment of self-doubt that its “road maps”, which it is normally so optimistic about, will eventually fix the problem. Ofcom should not be allowed to sit on the evidence hoping, on a wing and a prayer, that things will fix themselves in the end.