Online Safety Bill Debate
Baroness Fox of Buckley (Non-affiliated - Life peer)
Debate with the Department for Digital, Culture, Media & Sport
(1 year, 6 months ago)
Lords Chamber

My Lords, I am pleased that the noble Lord, Lord Knight of Weymouth, has given us an opportunity to talk about transparency reports with these amendments, which are potentially a helpful addition to the Bill. Transparency is one of the huge benefits that the legislation may bring. One of the concerns that the public have and that politicians have always had with online platforms is that they appear to be a black box—you cannot see what is going on in them.
In the entire edifice that we are constructing in the Online Safety Bill, there are huge opportunities to change that. The platforms will have to do risk assessments—there are measures in the Bill to make sure that information about these is put out—and they will have to take active steps to mitigate any risks they find. Again, we may get directions and guidance from Ofcom that will explain to the public exactly what is expected of them. The final piece of the jigsaw is the transparency reports that show the outcomes—how a platform has performed and what it has done to meet its obligations in dealing with content and behaviour on its services.
For the record, I previously worked for one of the platforms, and I would have said that I was on the pro-transparency wing of the transparency party inside the company. I believed that it was in the platform’s interest: if you do not tell people what you are doing, they will make things up about you, and what they make up will generally be worse than what you are actually doing. So there are huge advantages to the platforms from being transparent.
The noble Lord, Lord Knight, has picked up on some important points in his Amendment 160B, which talks about making sure that the transparency report is not counterproductive by giving the bad guys information that they could use to ill effect. That is a valid point; it is often debated inside the platforms. Sometimes, I argued furiously with my colleagues in the platforms about why we should disclose information. They would ask, “What about the bad guys?” Sometimes I challenged that, but other times it would have been a genuine and accurate concern. The noble Lord mentioned things such as child sexual abuse material, and we have to recognise that the bad guys are incredibly devious and creative, and if you show them anything that they can use against you to get around your systems, they will try to do that. That is a genuine and valid concern.
The sort of thing that you might put into a transparency report is, for example, whether you have banned particular organisations. I would be in favour of indicating to the public that an organisation is banned, but you can see that the potential impact of that is that all the people you are concerned about would create another organisation with a different name and then get back on to your platform. We need to be alive to those kinds of concerns.
It is also relevant to Amendment 165 and the terms of service that the more granular and detailed your terms of service are, the better they are for public information, but there are opportunities to get around them. Again, we would have that argument internally. I would say, “If we are prohibiting specific hate speech terms, tell people that, and then they won’t use them”. For me, that would be a success, as they are not using those hate speech terms anymore, but, of course, they may then find alternative hate speech terms that they can use instead. You are facing that battle all the time. That is a genuine concern that I hope we will be able to debate. I hope that Ofcom will be able to mitigate that risk by discussing with platforms what these transparency reports should look like. In a sense, we are doing a risk assessment of the transparency report process.
Amendment 229 on effectiveness is really interesting. My experience was that if you did not have a transparency report, you were under huge pressure to produce one and that once you produced one, nobody was interested. For fear of embarrassing anyone in the Committee, I would be curious to know how many noble Lords participating in this debate have read the transparency reports already produced by Meta Platforms, Google and others. If they have not read them, they should not be embarrassed, because my experience was that I would talk to regulators and politicians about something they had asked me to come in to talk about, such as hate speech or child sexual abuse material, and I learned to print off the transparency report. I would go in and say, “Well, you know what we are doing; it’s in our transparency report”. They would ask, “What transparency report?”, and I would have to show them. So, having produced a transparency report, every time we published it, we would expect there to be public interest, but little use was made of it. That is not a reason not to do them—as I said, I am very much in favour of doing them—but, on their own, they may not be effective, and Amendment 229 touches on that.
I was trying to think of a collective noun for transparency reports and, seeing as they shed light, I think it may be a “chandelier”. Where we may get the real benefit is if Ofcom can produce a chandelier of transparency reports, taking all the information it gets from the different platforms, processing it and selecting the most relevant information—the reports are often too long for people to work their way through—so that it can enable comparisons. That is really good and it is quite good for the industry that people know that platform A did this, platform B did that, and platform C did something else. They will take note of that, compare with each other and want to get into the best category. It is also critical that Ofcom puts this into user-friendly language, and Ofcom has quite a good record of producing intelligible reports. In the context of Amendment 229, a review process is good. One of the things that might come out of that, thinking ahead, would be Ofcom’s role in producing meta transparency reports, the chandelier that will shed light on what the whole sector is doing.
My Lords, for once I want to be really positive. I am actually very positive about this whole group of amendments because more transparency is essential in what we are discussing. I especially like Amendment 165 from the noble Lord, Lord Stevenson of Balmacara, because it is around terms of service for user-to-user services and ensures that information can be sought on the scope as well as the application. This is important because so much has been put on user-to-user services as well as on terms of service. You need to know what is going on.
I want particularly to compliment Amendment 229 that says that transparency reports should be
“of sufficient quality to enable service users and researchers to make informed judgements”,
et cetera. That is a very elegant way in which to say that they should not be gobbledegook. If we are going to have them, they should be clear and of a quality that we can read. Obviously, we do not want them to be unreadable and full of jargon and legalistic language. I am hoping that that is the requirement.
My Lords, so few of us are involved in this discussion that we are now able to write each other’s speeches. I thank the noble Lord, Lord Allan of Hallam, for articulating some of my concerns, probably more elegantly than I will myself. I will focus on two amendments in this group; in fact, there are lots of interesting things, but I will focus on both the amendments from the noble Lord, Lord Bassam of Brighton.
On the issue of proactive steps to remove listings of knives for young people, I am so sympathetic to this because in a different area of my life I am pretty preoccupied with the problem of knife crime among young people. It really bothers me and I worry about how we tackle it. My concern of course is that the police should be working harder to solve that problem and that we cannot anticipate that the Bill will solve all social problems. There is a danger of removing the focus from law enforcement in a real-world problem, as though removing how you buy the knife is the issue. I am not convinced that that helps us.
I wanted to reflect on the kind of dilemmas I am having around this in relation to the story of Mizzy that is doing the rounds. He is the 18 year-old who has been posting his prank videos on TikTok and has caused quite a stir. People have seen him wandering into strangers’ homes uninvited, asking random people in the street if they want to die, running off with an elderly lady’s dog and making fun of Orthodox Jews—generally speaking, this 18 year-old is obnoxious. His TikTok videos have gone viral; everybody is discussing them.
This cruelty for kicks genre of filming yourself, showing your face full to the camera and so on, is certainly abhorrent but, as with the discussion about knife crime, I have noticed that some people outside this House are attempting to blame the technology for the problem, saying that the videos should have been removed earlier and that it is TikTok’s fault that we have this anti-social behaviour, whereas I think it is a much deeper, broader social problem to do with the erosion of adult authority and the reluctance of grown-ups to intervene clearly when people are behaving badly—that is my thesis. It is undoubtedly a police matter. The police seem to have taken ages to locate Mizzy. They eventually got him and charged him with very low offences, so he was on TV being interviewed the other evening, laughing at how weak the law was. Under the laws he was laughing at, he could freely walk into somebody’s house or be obnoxious and get away with it. He said, “We can do what we want”. That mockery throws up problems, but I do not necessarily think that the Bill is the way to solve it.
That leads me to my concerns about Amendment 268AA, because Mizzy was quoted in the Independent newspaper as saying:
“I’m a Black male doing these things and that’s why there’s such an uproar”.
I then went on a social media thread in which any criticism of Mizzy’s behaviour was described as racist harassment. That shows the complexity of what is being called for in Amendment 268AA, which wants platforms to take additional steps
“to combat incidents of online racially aggravated harassment”.
My worry is that we end up with not only Mizzy’s TikTok videos being removed but his critics being removed for racially harassing him, so we have to be very careful here.
Amendment 268AA goes further, because it wants tech companies to push for prosecution. I really think it is a dangerous step to encourage private companies to get tangled up in deciding what is criminal and so on. The noble Lord, Lord Allan, has exactly described my concerns, so I will not repeat them. Maybe I can probe this probing amendment. It also broadens the issue to all forms of harassment.
By the way, the amendment’s explanatory statement mentions the appalling racist abuse aimed at footballers and public figures, but one of the fascinating things was that when we number-crunched and went granular, we found that the majority of that racist abuse seemed to have been generated by bots, which takes us to the position of the noble Lord, Lord Knight, earlier: who would you prosecute in that instance? Bots not even based in the UK were generating what was assumed to be an outbreak of racist abuse among football fans in the UK, but the numbers did not equate to that. There were some people being racist and vile and some things that were generated in these bot farms.
To go back to the amendment, it goes on to broaden the issue out to
“other forms of harassment and threatening or abusive behaviour”.
Again, this is much more complicated in today’s climate, because those kinds of accusation can be deployed for bad faith reasons, particularly against public figures.
We have an example close to this House. I hope that Members have been following and will show solidarity over what has been happening to the noble Baroness, Lady Falkner of Margravine, who is chair of the Equality and Human Rights Commission and tasked with upholding the equality law but is at the centre of a vicious internal row after her officials filed a dossier of complaints about her. They have alleged that she is guilty of harassment. A KC is being brought in, there are 40 complaints and the whole thing is costing a fortune for both taxpayers and the noble Baroness herself.
It coincided with the noble Baroness, Lady Falkner, advising Ministers to update the definition of sex in the Equality Act 2010 to make clear that it refers to biological sex and producing official advice clarifying that trans women can be lawfully excluded from female-only spaces. We know how toxic that whole debate is.
Many of us feel that a lot of the accusations against the noble Baroness are ideologically and politically motivated vexatious complaints. I am distressed to read newspaper reports that say that she has been close to tears and has asked why anyone would go into public service. All this is for the crime of being a regulator upholding and clarifying the law. I hope it does not happen to the person who ends up regulating Ofcom—ending up close to tears as he stands accused of harassment, abusive behaviour and so on.
The point is that she is the one being accused of harassment. I have seen the vile abuse that she has received online. It is completely defamatory, vicious abuse and yet somehow it ends up being that, because she does not provide psychological safety at work and because of her views, she is accused of harassment and is the one in the firing line. I do not want us to introduce that kind of complexity—this is what I have been worried about throughout—into what is banned, removed or sent to the police as examples of harassment or hate crime.
I know that is not the intention of these amendments; it is the unintended consequences that I dread.
My Lords, I will speak chiefly to Amendment 262 in my name, although in speaking after the noble Baroness, Lady Fox, who suggested that the grown-ups should control anti-social behaviour by young people online, I note that there is a great deal of anti-social behaviour online from people of all ages. This is relevant to my Amendment 262.
It is a very simple amendment and would require the Secretary of State to consult with young people by means of an advisory board consisting of people aged 25 and under when reviewing the effectiveness and proportionality of this legislation. This amendment is a practical delivery of some of the discussion we had earlier in this Committee when we were talking about including the Convention on the Rights of the Child in the Bill. There is a commonly repeated phrase, “Nothing about us without us”. It was popularised by disability activists in the 1990s, although in doing a little research for this I found that it originates in Latin in Poland in the 15th century. So it is an idea that has been around for a long while and is seen as a democratic standard. It is perhaps a variation of the old “No taxation without representation”.
This suggestion of an advisory board for the Secretary of State is because we know from the discussion earlier on the children's rights amendments that, globally, one in three people online is a child under the age of 18. This comes to the point of the construction of your Lordships' House. Most of us are a very long way removed in experiences and age—some of us further than others. The people in this Committee thinking about a 12 year-old online now are parents, grandparents and great-grandparents. I venture to say that it is very likely that the Secretary of State is at least a generation older than many of the people who will be affected by the Bill's provisions.
This reflects something that I also did on the Health and Care Bill. To introduce an advisory panel of young people reporting directly to the Secretary of State would ensure a direct voice for legislation that particularly affects young people. We know that under-18s across the UK do not have any role in elections to the other place, although 16 and 17 year-olds have a role in other elections in Wales and Scotland now. This is really a simple, clear, democratic step. I suspect the Minister might be inclined to say, “We are going to talk to charities and adults who represent children”. I suggest that what we really need here is a direct voice being fed in.
I want to reflect on a recent comment piece in the Guardian that made a very interesting argument: that there cannot be, now or in the future, any such thing as a digital native. Think of the experience of someone 15 or 20 years ago; yes, they already had the internet but it was a very different beast to what we have now. If we refer back to some of the earlier groups, we were starting to ask what an internet with widespread so-called generative artificial intelligence would look like. That is an internet which is very different from even the one that a 20 year-old is experiencing now.
It is absolutely crucial that we have that direct voice coming in from young people with experience of what it is like. They are an expert on what it is like to be a 12 year-old, a 15 year-old or a 20 year-old now, in a way that no one else can possibly be, so that is my amendment.
I shall speak very briefly at this hour, just to clarify as much as anything. It seems important to me that there is a distinction between small platforms and large platforms, but my view has never been that if you are small, you have no potential harms, any more than if you are large, you are harmful. The exception should be the rule. We have to be careful of arbitrary categorisation of “small”. We have to decide who is going to be treated as though they are a large category 1 platform. I keep saying but stress again: do not assume that everybody agrees what significant risk of harm or hateful content is. It is such highly disputed political territory outside the online world and this House that we must recognise that it is not so straightforward.
I am very sympathetic, by the way, to the speeches made about eating disorders and other issues. I see that very clearly, but other categories of speech are disputed and argued over—I have given loads of examples. We end up where it is assumed that the manifestos of mass shooters appear on these sites, but if you read any of those manifestos of mass shooters, they will often be quoting from mainstream journalists in mainstream newspapers, the Bible and a whole range of things. Just because they are on 4Chan, or wherever, is not necessarily the problem; it is much more complicated.
I ask the Minister, and the proposers of the amendment, to some extent: would it not be straightforwardly the case that if there is a worry about a particular small platform, it might be treated differently—
I just want to react to the manifestos of mass shooters. While source material such as the Bible is not in scope, I think the manifesto of a shooter is clear incitement to terrorism, and any platform that is comfortable carrying that is problematic in my view, and I hope it would be in the noble Baroness's view as well.
I was suggesting that we have a bigger problem than it appearing on a small site. It quotes from mainstream media, but it ends up being broadly disseminated and not because it is on a small site. I am not advocating that we all go round carrying the manifestos of mass shooters and legitimising them. I was more making the point that it can be complicated. Would not the solution be that you can make appeals that a small site is treated differently? That is the way we deal with harmful material in general and the way we have dealt with, for example, RT as press without compromising on press freedom. That is the kind of point I am trying to make.
I understand lots of concerns but I do not want us to get into a situation where we destroy the potential of all smaller platforms—many of them doing huge amounts of social good, part of civil society and all the rest of it—by treating them as though they are large platforms. They just will not have the resources to survive, that is all my point is.
My Lords, I am going to be extremely brief given the extremely compelling way that these amendments have been introduced by the noble Baroness, Lady Morgan, and the noble Lord, Lord Griffiths, and contributed to by the noble Baroness, Lady Bull. I thank her for her comments about my noble friend Lady Parminter. I am sure she would have wanted to be here and would have made a very valuable contribution as she did the other day on exactly this subject.
As the noble Baroness, Lady Fox, has illustrated, we have a very different view of risk across this Committee and we are back, in a sense, into that whole area of risk. I just wanted to say that I think we are again being brought back to the very wise words of the Joint Committee. It may sound like special pleading. We keep coming back to this, and the noble Lord, Lord Stevenson, and I are the last people standing on a Thursday afternoon.
We took a lot of evidence in this particular area. We took the trouble to go to Brussels and had a very useful discussion with the Centre on Regulation in Europe and Dr Sally Broughton Micova. We heard a lot about interconnectedness between some of these smaller services and the impact in terms of amplification across other social media sites.
We heard in the UK from some of the larger services about their concerns about the activities of smaller services. You might say “They would say that, wouldn’t they?” but they were pretty convincing. We heard from HOPE not Hate, the Antisemitism Policy Trust and Stonewall, stressing the role of alternative services.
Of course, we know that these amendments today—some of them sponsored by the Mental Health Foundation, as the noble Lord, Lord Griffiths, said, and Samaritans—have a very important provenance. They recognise that these are big problems. I hope that the Minister will think strongly about this. The injunction from the noble Lord, Lord Allan, to consider how all this is going to work in practice is very important. I very much hope that when we come to consider how this works in practical terms that the Minister will think very seriously about the way in which risk is to the fore— the more nuanced approach that we suggested—and the whole way that profiling by Ofcom will apply. I think that is going to be extremely important as well. I do not think we have yet got to the right place in the Bill which deals with these risky sites. I very much hope that the Minister will consider this in the quite long period between now and when we next get together.