Lords Chamber

My Lords, I have Amendments 185A and 268AA in this group. They are on different subjects, but I will deal with them in the same contribution.
Amendment 185A is a new clause that would introduce duties on online marketplaces to limit child access to listings of knives and to take proactive steps to identify and remove any listings of knives or products, such as ornamental zombie knives, that are suggestive of acts of violence or self-harm. I am sure the Minister will be familiar with the Ronan Kanda case that has given rise to our bringing this amendment forward. The case is particularly horrible; as I understand it, sentencing is still outstanding. Two young boys bought ninja blades and machetes online and ultimately killed a younger boy with them. It has been widely featured in news outlets and is particularly distressing. We have had some debate on this in another place.
As I understand it, the Government have announced a consultation on this which, among other things, looks at banning the sale of machetes and knives that appear to have no practical use other than being designed to look menacing or suitable for combat. We support the consultation and the steps set out in it, but the amendment provides a chance to probe the extent to which this Bill will apply to the dark web, where a lot of these products are available for purchase. The explanatory statement contains a reference to this, so I hope the Minister is briefed on the point. It would be very helpful to know exactly what the Government’s intention is on this, because we clearly need to look at these sites and try to regulate them much better than they are currently regulated online. I am especially concerned about the dark web.
The second amendment relates to racist abuse; I have brought the subject before the House before, but this is rather different. It is a bit of a carbon copy of Amendment 271, which noble Lords have already debated. It is there for probing purposes, designed to tease out exactly how the Government see public figures, particularly sports stars such as Marcus Rashford and Bukayo Saka, and how they think they are supposed to deal with the torrents of racist abuse that they receive. I know that there have been convictions for racist content online, but most of the abuse goes unpunished. It is not 100% clear that much of it will be identified and removed under the priority offence provisions. For instance, does posting banana emojis in response to a black footballer’s Instagram post constitute an offence, or is it just a horrible thing that people do? We need to understand better how the law will act in this field.
There has been a lot of debate about this issue; it is a very sensitive matter, and we need to get to the bottom of it. A year and a half ago, the Government responded to my amendment by bringing online racist abuse into the scope of what is dealt with as an offence, which we very much welcomed, but we need to understand better how these provisions will work. I look forward to the Minister setting that out in his response. I beg to move.
My Lords, I rise to speak primarily to the amendments in the name of my noble friend Lord Clement-Jones, but I will also touch on Amendment 268AA at the same time. The amendments that I am particularly interested in are Amendments 200 and 201 on regulatory co-operation. I strongly support the need for this, and I will illustrate that with some concrete examples of why this is essential to bring to life the kinds of challenges that need to be dealt with.
The first example relates to trying to deal with the sexual grooming of children online, where platforms are able to develop techniques to detect it. They can do so by analysing the behaviour of users and trying to detect whether older users are consistently trying to approach younger users, and the kind of content of the messages they may be sending to them where that is visible. These are clearly highly intrusive techniques. If a platform is subject to the general data protection regulation, or the UK version of that, it needs to be very mindful of privacy rights. We clearly have there several potentially interested bodies in the UK environment: the child protection agencies; Ofcom, which will in future be seeking to ensure that the platform has met its duty of care; and the Information Commissioner’s Office.
A platform, in a sense, can be neutral as to what it is instructed to do by the regulator. Certainly, my experience was that the platforms wanted to do those kinds of activities, but they are neutral in the sense that they will do what they are told is legal. There, you need clarity from the regulators together to say, “Yes, we have looked at this and you are not going to do something on the instruction of the child safety agency and then get criticised, and potentially fined, by the data protection regulator for doing the thing you have been instructed to do”—so we need those agencies to work together.
The second example is in the area of co-operation around anti-terrorism, another key issue. The platforms have created something called the Global Internet Forum to Counter Terrorism. Within that forum, they share tools and techniques—things such as databases of information about terrorist content and systems that you can use to detect it—and larger platforms are encouraged to share those tools and techniques with smaller platforms and competitors. Clearly, again, there is a very significant set of questions here, and if you are in a discussion around that, the lawyers will say, “Have the competition lawyers cleared this?” Something that is in the public interest—that all the platforms should be using similar kinds of technology to detect terrorist content—is therefore something where you need a view not just from the counter-terrorism people but also, in our case, from the Competition and Markets Authority. So, again, you need those regulators to work together.
The final example is one which I know is dear to the heart of the noble Baroness, Lady Morgan of Cotes: fraud, which we have dealt with. Here you might have patterns of behaviour involving information that comes from the telecoms companies and the internet service providers, both regulated by Ofcom, and from financial institutions, regulated by their own family of regulators—and they may want to share data with each other, which is again subject to the Information Commissioner’s Office. So, if we are going to give platforms instructions, which we rightly do in this legislation, and say, “Look, we want you to get tougher on online fraudsters; we want you to demonstrate a duty of care there”, the platforms will need those regulators—the financial regulators, Ofcom and the Information Commissioner’s Office—to sort those things out between them.
Having a forum such as the one proposed in Amendment 201, where these really difficult issues can be thrashed out and clear guidance can be given to online services, will be much more efficient than what sometimes happened in the past, when the left hand and the right hand of the regulatory world pulled you in different directions. I know that we have the Digital Regulation Cooperation Forum; if we can build on institutions such as that, it is essential and ideal that they have their input before guidance is issued, rather than having a platform comply with guidance from regulator A and then get dinged by regulator B for doing the thing it has been instructed to do.
That leads to the very sensible Amendment 200 on skilled persons. Again, Ofcom is going to be able to call in skilled persons. In an area such as data protection, that might be a data protection lawyer but, equally, it might be that somebody who works at the Information Commissioner’s Office is actually best placed to give advice. Amendment 200—the first of the two, which provides for skilled persons to come from the regulators—makes sense.
Finally, I will touch on the issues raised in Amendment 268AA—I listened carefully and understand that it is a probing amendment. It raises some quite fundamental questions of principle—I suspect that the noble Baroness, Lady Fox, might want to come in on these—and it has been dealt with in the context of Germany and its Network Enforcement Act: I know the noble Lord, Lord Parkinson of Whitley Bay, can say that in the original German. That Act went in the same direction, motivated by similar concerns around hate speech.