Online Safety Bill Debate
Lords Chamber
Lord Bishop of Guildford
My Lords, one of our clergy in the diocese of Guildford has been campaigning for more than a decade, as have others in this Committee, on children’s access to online pornography. With her, I support the amendments in the names of the noble Baronesses, Lady Kidron and Lady Harding.
Her concerns eventually made their way to the floor of the General Synod of the Church of England in a powerful debate in July last year. The synod voted overwhelmingly in favour of a motion, which said that we
“acknowledge that our children and young people are suffering grave harm from free access to online pornography”
and urged us to
“have in place age verification systems to prevent children from having access to those sites”.
It asked Her Majesty’s Government to use their best endeavours to secure the passage and coming into force of legislation requiring age-verification systems preventing access by people under the age of 18. It also recommended more social and educational programmes to increase awareness of the harms of pornography, including self-generated sexually explicit images.
Introducing the motion, my chaplain, Reverend Jo Winn-Smith, said that age verification
“ought to be a no-brainer … Exposure to sexualised material is more likely to lead to young people engaging in more sexualised behaviour and to feel social pressure to have sex”,
as well as normalising sexual violence against girls and women. A speech from the chaplain-general of the Prison Service towards the end of the debate highlighted just where such behaviours and pressures could lead in extreme circumstances.
One major theme that emerged during the debate is highlighted by the amendments this afternoon: that access to online pornography goes far beyond materials that fall into what the Bill defines as Part 5 services. Another is highlighted in a further group of amendments: age assurance needs to be both mandatory and effective beyond reasonable doubt.
Members also commented on how this whole area has taken such an age to get on to the statute book, given David Cameron’s proposals way back in 2013 and the further legislation proposed in 2018 that was never enacted. Talk of secondary legislation to define harmful content in that regard is alarming, as a further amendment indicates, given the foot-dragging that has now gone on for more than a decade. That is a whole generation of children and young people.
In an imaginative speech in the synod debate, the most reverend Primate the Archbishop of York, Archbishop Stephen, reminded us that the internet is not a platform; it is a public space, where all the rights and norms you would expect in public should apply. In the 1970s, he continued, we famously put fluoride in the water supply, because we knew it would be great for dental health; now is the opportunity to put some fluoride into the internet. I add only this: let us not water down the fluoride to a point where it becomes feeble and ineffective.
My Lords, I will speak in support of the amendments in this group in the names of the intrepid noble Baroness, Lady Kidron, the noble Baroness, Lady Harding, and my noble friend Lord Storey—we are kindred spirits.
As my noble friend said, the expectations of parents are clear: they expect the Bill to protect their children from all harm online, wherever it is encountered. The vast majority of parents do not distinguish between the different content types. To restrict regulation to user-to-user services, as in Part 3, would leave a great many websites and content providers, which are accessed by children, standing outside the scope of the Bill. This is a flagship piece of legislation; there cannot be any loopholes leaving any part of the internet unregulated. If there is a website, app, online game, educational platform or blog—indeed, any content that contains harmful material—it must be in the scope of the Bill.
The noble Baroness, Lady Kidron, seeks to amend the Bill to ensure that it aligns with the Information Commissioner’s age-appropriate design code—it is a welcome amendment. As the Bill is currently drafted, the threshold for risk assessment is too high. It is important that the greatest number of children and young people are protected from harmful content online. The amendments achieve that to a greater degree than the protection already in the Bill.
While the proposal to align with the age-appropriate design code is welcome, I have one reservation. Until recently, it appears that the ICO was reluctant to take action against pornography platforms that process children’s data, perhaps on the assumption that pornographic websites are unlikely to be accessed by children. Over the years, I have shared with this House statistics on how children are accessing pornography and the harm it causes. The Children’s Commissioner has also recently highlighted these concerns. Pornography is being accessed by our children, and we must ensure that the provisions of the Bill are the most robust they can be so that children are protected online.
I am concerned with ensuring two things: first, that any platform that contains harmful material falls under the scope of the Bill and is regulated to ensure that children are kept safe; and, secondly, that, as far as possible, what is harmful offline is regulated in the same way online. The amendments in the name of my noble friend Lord Storey raise the important question of online-offline equality. Amendments 33A and 217A seek to regulate online video games to ensure they meet the same BBFC ratings as would be expected offline, and I agree with that approach. Later in Committee, I will raise this issue in relation to pornographic content and how online content should be subject to the same BBFC guidance as content offline. I agree with what my noble friend proposes: namely, that this should extend to video game content as well. Video games can be violent and sexualised in nature, and controls should be in place to ensure that children are protected. The BBFC guidelines used offline appear to be the best way to regulate online as well.
Children must be kept safe wherever they are online. This Bill must have the widest scope possible to keep children safe, but ensuring online/offline alignment is crucial. The best way to keep children safe is to legislate for regulation that is as far reaching as possible but consistently applied across the online/offline world. These are the reasons why I support the amendments in this group.
Online Safety Bill Debate
Lords Chamber
Lord Bishop of Guildford
My Lords, I will speak to Amendments 128, 130 and 132, as well as Amendments 143 to 153 in this grouping. They were tabled in the name of my right reverend colleague the Bishop of Derby, who is sorry that she cannot be here today.
The Church of England is the biggest provider of youth services in our communities and educates around 1 million of our nation’s children. My colleague’s commitment to the principles behind these amendments also springs from her experience as vice-chair of the Children’s Society. The amendments in this grouping are intended to strengthen legislation on online grooming for the purpose of child criminal exploitation, addressing existing gaps and ensuring that children are properly protected. They are also intended to make it easier for evidence of children being groomed online for criminal exploitation to be reported by online platforms to the police and the National Crime Agency.
Research from 2017 shows that one in four young people reported seeing illicit drugs advertised for sale on social media—a percentage that is likely to be considerably higher six years on. According to the Youth Endowment Fund in 2022, 20% of young people reported having seen online content promoting gang membership in the preceding 12 months, with 24% reporting content involving the carrying, use or promotion of weapons.
In relation to drugs, the later research noted that these platforms provide opportunities for dealers to build trust with potential customers, with young people reporting that they are more likely to see a groomer advertising drugs as a friend than as a dealer. This reduces the scruples or trepidation they might feel about buying drugs in the first place and leaves them vulnerable to exploitation. Meanwhile, it is also clear that social media is changing the operation of the county lines model. There is no longer any need to transport children from cities into the countryside to sell drugs, given that children who live in less populated areas can be groomed online as easily as in person. A range of digital platforms is therefore being used to target potential recruits among children and young people, with digital technologies also being deployed—for example, to monitor their whereabouts on a drugs run.
More research is being carried out by the Children’s Society, whose practitioners reported a notable increase in the number of perpetrators grooming children through social media and gaming sites during the first and second waves of the pandemic. Young people were being contacted with promotional material about lifestyles they could lead and the advantages of working within a gang, and were then asked to do jobs in exchange for money or status within this new group. It is true that some such offences could be prosecuted under the Modern Slavery Act 2015, but there remains a huge disparity between the scale of exploitation and the number of those being charged under the Act. Without a definition of child exploitation for criminal purposes, large numbers of children are being groomed online and paying the price for crimes committed by some of their most dangerous and unscrupulous elders.
It is vital that we protect our children from online content which facilitates that criminal exploitation, in the same way that we are looking to protect them from sexual exploitation. Platforms must be required to monitor for illegal content related to child criminal exploitation on their sites and to have mechanisms in place for users to flag it with those platforms so that it can be removed. This can be achieved by bringing modern slavery and trafficking, of which child criminal exploitation is a form, within the scope of illegal content in the Bill, which is what these amendments seek to do. It is also vital that the law sets out clear expectations on platforms to report evidence of child criminal exploitation to the National Crime Agency, in the same way as they are expected to report content involving child sexual exploitation and abuse, to enable child victims to be identified and to receive support. Such evidence may enable action against the perpetrators without the need for a disclosure from child victims. I therefore fully support and endorse the amendments standing in the name of the right reverend Prelate.
My Lords, this is again a very helpful set of amendments. I want to share some experience which shows that legality tests are really hard. From the outside, there is often an assumption that it is easy to tell what speech is legal and what is illegal, but in practice that is very rarely the case. There is almost never a bright line, except for a small class of child sexual abuse material, which is always illegal: as soon as you see the material, you know it is illegal and you can act on it. In pretty much every other case, you have to look at what is in front of you.
I will take a very specific example. Something we had to deal with was images of Abdullah Öcalan, the leader of the PKK in Turkey. If somebody shared a picture of Abdullah Öcalan, were they committing a very serious offence, which is the promotion of terrorism? Were they indicating support for the peace process that was taking place in Turkey? Were they showing that they support his socialist and feminist ideals? Were they supporting the YPG, a group in Syria to which we were sending arms, that venerates him? This is one example of many I could give where the content in front of you does not tell you very clearly whether or not the speech is illegal or speech that should be permitted. Indeed, we would take speech like that down and I would get complaints, including from Members of Parliament, saying, “Why have you removed that speech? I’m entitled to talk about Abdullah Öcalan”, and we would enter into an argument with them.
We would often ask lawyers in different countries whether they could tell us whether a piece of speech was legal or illegal. The answer would come back as probably illegal, likely illegal, maybe illegal and, occasionally, definitely not illegal, but it was nearly always on a spectrum. The amendments we are proposing today try to establish where the Government intend people to draw that line when they get that advice. Let us assume the company wants to do the right thing, follow the instructions of the Bill and remove illegal content. At what level does it say the test has been sufficiently met, given that in the vast majority of cases, apart from that small class of illegal content, it will be given only a likelihood or a probability? As the noble Lord, Lord Moylan, pointed out, we are trying to insert this notion of sufficient evidence through Amendments 273, 275, 277, 280 and 281 in the names of my noble friend Lord Clement-Jones and the noble Viscount, Lord Colville, who is unable to be in his place today. I think the noble Baroness, Lady Kidron, may also have signed them. We are trying to flesh out the point at which that illegality standard should kick in.
Just to illustrate how this often works when the law gets involved, there is a law in Germany, known for short as NetzDG; if there are any German speakers who can pronounce the compound noun that is its full title, there will be a prize. It is a long compound word that means “network enforcement Act”. It has been in place for a few years, and it tells companies to do something similar: to remove content that is illegal in Germany. There would be cases where we would get a report from somebody saying, “This is illegal”, and we would take action; then it would go into the German system, and three months later we would finally be told whether it was actually illegal, in a 12-page judgment that a German court had worked out. In the meantime, all we could do was work on our best guess while that process was going on. I think we need to be very clear that illegality is hard.
Cross-jurisdictional issues present us with another set of challenges. If both the speaker and the audience are in the United Kingdom, it is fairly clear. But in many cases, when we are talking about online platforms, one or other, or even both, of the speaker and the audience may be outside the United Kingdom. Again, when does the speech become illegal? It may be entirely legal speech between two people in the United States. I think—and I would appreciate clarification from the Minister—that the working assumption is that if the speech was reported by someone in the UK rather than in the United States, the platform would be required to restrict access to it from the UK, even though the speech is entirely legal in the jurisdiction in which it took place. Because the person in the UK encountered it, there would be a duty to restrict it. It has also been clarified that there is certainly not a duty to take the speech down, because it is entirely legal speech outside the UK. These cross-jurisdictional issues are interesting; I hope the Minister can clarify that.
The amendments also try to think about how this would work in practice. Amendment 287 says that guidance should be drawn up in consultation with UK lawyers, to avoid a situation where platforms are guessing too much at what UK law requires; they should at least have sought UK legal advice. That advice will then feed into the guidance given to their human reviewers and their algorithms. That is the way, in practice, in which people will carry out the review. There is a really interesting practical question—which, again, comes up under NetzDG—about the extent to which platforms should be investing in legal review of content that is clearly against their terms of service.
There will be two kinds of platform. Some platforms see themselves as champions of freedom of expression and say they will remove only material that is illegal in the UK; everything else can stay up. I think that is a minority of platforms, and they tend to be on the fringes; as soon as a platform gets a mainstream audience, it has to go further. Most platforms will have terms of service that go way beyond UK law. In that case, they will be removing hate speech under their own terms and will be confident that UK-illegal hate speech falls within that. They will likewise remove terrorist content, confident that they do not need a second test of legality in order to remove it. There is a practical question about the extent to which platforms should be required to do a second test if something is already against their terms.
There will be, broadly speaking again, four buckets of content. There will be content that is clearly against a platform’s terms, which it will want to get rid of immediately. It will not want to test it again for legality; it will just get rid of it.
There will be a second bucket of content that is not apparently against a platform’s terms but is clearly illegal in the UK. That is a very small subset of content. In Germany, it includes Holocaust denial; in the United Kingdom, this Parliament has looked at Holocaust denial and chosen not to criminalise it, so that will not be there, but an equivalent for us would be migration advice. Migration advice will not be against the terms of service of most platforms, but the Government’s intention in the Illegal Migration Bill is to make it illegal and require its removal, and the consequent effect will be that it will have to be removed under the terms of this Bill. So there will be that small set of content that is illegal in the UK but not against terms of service.
There will be a third bucket of content that is not apparently against either the terms or the law, and that actually accounts for most of the complaints a platform receives. I will choose my language delicately: complaint systems are easy to use, and people complain to make a point, treating the report button as a kind of dislike button. One of the most common patterns is that, whenever there is a football match, supporters of the two opposing teams report the content on each other’s pages as illegal. They will do that every time, and you get used to it; that is why you learn to discount mass-volume complaints. But again, we should be clear that a great many complaints are merely vexatious.
The final bucket is content that is genuinely unclear, for which legal review will be needed. Our amendments are intended to deal with those cases. A platform will go out and get advice. It is trying to understand at what point something like migration advice tips over into the illegal, as opposed to being advice about going on holiday, and it is trying to understand that based on what it can immediately see. Once it has sought that advice, it will feed it back into the guidance given to reviewers and the algorithms, to try to remove content more effectively, comply with the Bill as a whole and stay out of trouble with Ofcom.
Some areas are harder than others. The noble Lord, Lord Moylan, has already highlighted one: public order offences, which are extremely hard. If somebody says something offensive or holds an offensive political view—I suspect the noble Baroness, Lady Fox, may have something to say on this—people may well make contact and claim that it is in breach of public order law. On the face of it, they may have a reasonably arguable case, but again, as a platform, you are left to make a decision.