Online Safety Bill Debate
John Nicolson (Scottish National Party - Ochil and South Perthshire)
(2 years, 7 months ago)
Commons Chamber

Thank you, Madam Deputy Speaker, but I was under the impression that I was to wind up for my party, rather than speaking at this juncture.
If the hon. Gentleman would prefer to save his slot until later—
Then we shall come to that arrangement. I call Dame Margaret Hodge.
Everyone wants to be safe online and everyone wants to keep their children safe online but, from grooming to religious radicalisation and from disinformation to cruel attacks on the vulnerable, the online world is far from safe. That is why we all agree that we need better controls while we preserve all that is good about the online world, including free speech.
This Bill is an example of how legislation can benefit from a collegiate, cross-party approach. I know because I have served on the Select Committee and the Joint Committee, both of which produced reports on the Bill. The Bill is ambitious and much of it is good, but there are some holes in the legislation and we must make important improvements before it is passed.
Does the hon. Gentleman, with whom I served on the Joint Committee on the draft Bill, agree, having listened to the evidence of the whistleblower Frances Haugen about how disinformation was used in the US Capitol insurrection, that it is completely inadequate that there is only one clause on the subject in the Bill?
Yes, and I shall return to that point later in my speech.
The Secretary of State’s powers in the Bill need to be addressed. From interested charities to the chief executive of Ofcom, there is consensus that the powers of the Secretary of State in the legislation are too wide. Child safety campaigners, human rights groups, women and girls’ charities, sports groups and democracy reform campaigners all agree that the Secretary of State’s powers threaten the independence of the regulator. That is why both the Joint Committee and the Select Committee have, unanimously and across party lines, recommended reducing the proposed powers.
We should be clear about what exactly the proposed powers will do. Under clause 40, the Secretary of State will be able to modify the draft codes of practice, thus allowing the UK Government a huge amount of power over the independent communications regulator, Ofcom. The Government have attempted to play down the powers, saying that they would be used only in “exceptional circumstances”, but the word “exceptional” is nebulous. How frequent is exceptional? All we are told is that the exceptional circumstances could reflect changing Government “public policy”. That is far too vague, so perhaps the Secretary of State will clarify the difference between public policy and Government policy and give us some further definition of “exceptional”.
While of course I am sure Members feel certain that the current Secretary of State would exercise her powers in a calm and level-headed way, imagine if somebody intemperate held her post or—heaven forfend—a woke, left-wing snowflake from the Labour Benches did. The Secretary of State should listen to her own MPs and reduce her powers in the Bill.
Let me turn to misinformation and disinformation. The Bill aims not only to reduce abuse online but to reduce harm more generally. That cannot be done without including in the Bill stronger provisions on disinformation. As a gay man, I have been on the receiving end of abuse for my sexuality, and I have seen the devastating effect that misinformation and disinformation have had on my community. Disinformation has always been weaponised to spread hate; however, the pervasive reach of social media makes disinformation even more dangerous.
The latest battle ground for LGBT rights has seen an onslaught against trans people. Lies about them and their demand for enhanced civil rights have swirled uncontrollably. Indeed, a correspondent of mine recently lamented “trans funding” in the north-east of Scotland—misreading, misunderstanding and believing it to involve the compulsory regendering of retiring oil workers in receipt of transitional funding from the Scottish Government. That is absurd, of course, but it says something about the frenzied atmosphere stirred up by online transphobes.
The brutal Russian invasion of Ukraine, with lies spewed by the Russian Government and their media apologists, has, like the covid pandemic, illustrated some of the other real-world harms arising from disinformation. It is now a weapon of war, with serious national security implications, yet the UK Government still do not seem to be taking it seriously enough. Full Fact, the independent fact-checking service, said that there is currently no credible plan to tackle disinformation. The Government may well argue that disinformation will fall under the false communications provision in clause 151, but in practice it sets what will likely be an unmeetable bar for services. As such, most disinformation will be dealt with as harmful content.
We welcome the Government’s inclusion of functionality in the risk assessments, which will look not just at content but how it spreads. Evidence from the two Committees shows that the dissemination of harm is as important as the content itself, but the Government should be more explicit in favouring content-neutral modes for reducing disinformation, as this will have less of an impact on freedom of speech. That was recommended by the Facebook whistleblowers Sophie Zhang and Frances Haugen.
Will my hon. Friend give way?
No, I will make some progress, if I may.
A vital tool in countering disinformation is education, and Estonia—an early and frequent victim of Russian disinformation—is a remarkable case study. That is why the Government’s decision to drop Ofcom’s clause 104 media literacy duties is perplexing. Media literacy should be a shared responsibility for schools, Government, and wider society. Spreading and enhancing media literacy should be up to not just Ofcom, but the larger platforms too. Ofcom should also be allowed to break platform terms and conditions for the purposes of investigation; for example, it would currently be unable to create fake profiles to analyse various companies’ behaviour, such as their response to abuse. Allowing it to do so would empower the regulator.
Various issues arise when trying to legislate for harm that is not currently illegal. This is challenging for us as legislators since we do not know exactly what priority harms will be covered by secondary legislation, but we would like assurances from the Government that Zach’s law, as it has come to be known, will become a standalone offence. Vicious cowards who send seizure-inducing flashing images to people with epilepsy to trigger seizures must face criminal consequences. The Minister told me in a previous debate that this wicked behaviour will now be covered by the harmful communications offence under clause 150, but until a specific law is on the statute book, he will, I imagine, understand families’ desire for certainty.
Finally, I turn to cross-platform abuse. There has been a terrifying increase in online child abuse over the past three years. Grooming offences have increased by 70% in that period. The Select Committee and the Joint Committee received a host of recommendations which, disappointingly, seem to have been somewhat ignored by the Government. On both Committees, we have been anxious to reduce “digital breadcrumbing”, which is where paedophiles post images of children which may look benign and will not, therefore, be picked up by scanners. However, the aim is to induce children, or to encourage other paedophiles, to leave the regulated site and move to unregulated sites where they can be abused with impunity. I urge the Secretary of State to heed the advice of the National Society for the Prevention of Cruelty to Children. Unless the measures it recommends are enacted, children will be at ever greater risk of harm.
The House will have noted that those on the SNP Benches have engaged with the Government throughout this process. Indeed, I am the only Member to have sat on both the Joint Committee and the Select Committee as this Bill has been considered and our reports written. It has been a privilege to hear from an incredible range of witnesses, some of whom have displayed enormous bravery in giving their testimony.
We want to see this legislation succeed. That there is a need for it is recognised across the House—but across the House, including on the Tory Benches, there is also recognition that the legislation can and must be improved. It is our intention to help to improve the legislation without seeking party advantage. I hope the Secretary of State will engage in the same constructive manner.
John Nicolson (Scottish National Party - Ochil and South Perthshire)
(2 years, 5 months ago)
Public Bill Committees

I rise to agree with all the amendments in this group that have been tabled by the Opposition. I want to highlight a couple of additional groups who are particularly at risk in relation to fraudulent advertising. One of those is pensioners and people approaching pension age. Because of the pension freedoms that are in place, we have a lot of people making uninformed decisions about how best to deal with their pensions, and sometimes they are able to withdraw a significant amount of money in one go. For an awful lot of people, withdrawing that money and paying the tax on it leads to a major financial loss—never mind the next step that they may take, which is to provide the money to fraudsters.
For pensioners in particular, requiring adverts to be clearly different from other search results would make a positive difference. The other thing that we have to remember is that pensioners generally did not grow up online, and some of them struggle more to navigate the internet than some of us who are a bit younger.
I speak with some experience of this issue, because I had a constituent who was a pensioner and who was scammed out of £20,000—her life savings. Does my hon. Friend realise that it is sometimes possible to pressurise the banks into returning the money? In that particular case, I got the money back for my constituent by applying a great deal of pressure to the bank, and it is worth knowing that the banks are susceptible to a bit of publicity. That is perhaps worth bearing in mind, because it is a useful power that we have as Members of Parliament.
I thank my hon. Friend for his public service announcement. His constituent is incredibly lucky that my hon. Friend managed to act in that way and get the money back to her, because there are so many stories of people not managing to get their money back and losing their entire life savings as a result of scams. It is the case that not all those scams take place online—people can find scams in many other places—but we have the opportunity with the Bill to take action on scams that are found on the internet.
The other group I want to mention, and for whom highlighting advertising could make a positive difference, is people with learning disabilities. People with learning disabilities who use the internet may not understand the difference between adverts and search results, as the hon. Member for Worsley and Eccles South mentioned. They are a group who I would suggest are particularly susceptible to fraudulent advertising.
We are speaking a lot about search engines, but a lot of fraudulent advertising takes place on Facebook and so on. Such sites generally have an older population than the majority of internet users, and the ability to tackle fraudulent advertising there is incredibly useful. We know that the sites can do it, because there are rules in place now around political advertising on Facebook, for example. We know that it is possible for them to take action; it is just that they have not yet taken proper action.
I am happy to support the amendments, but I am also glad that the Minister has put these measures in the Bill, because they will make a difference to so many of our constituents.
It is, as ever, a pleasure to serve under your chairship, Ms Rees. Amendment 65 would add organisations campaigning for the removal of animal abuse content to the list of bodies that Ofcom must consult. As we all know, Ofcom must produce codes of practice that offer guidance on how regulated services can comply with its duties. Later in the Bill, clause 45 makes clear that if a company complies with the code of practice, it will be deemed to have complied with the Bill in general. In addition, the duties for regulated services come into force at the same time as the codes of practice. That all makes what the codes say extremely important.
The absence of protections relating to animal abuse content is a real omission from the Bill. Colleagues will have seen the written evidence from Action for Primates, which neatly summarised the key issues on which Labour is hoping to see agreement from the Government. Given this omission, it is clear that the current draft of the Bill is not fit for tackling animal abuse, cruelty and violence, which is all too common online.
There are no explicit references to content that can be disturbing and distressing to those who view it—both children and adults. We now know that most animal cruelty content is produced specifically for sharing on social media, often for profit through the monetisation schemes offered by platforms such as YouTube. Examples include animals being beaten, set on fire, crushed or partially drowned; the mutilation and live burial of infant monkeys; a kitten intentionally being set on by a dog and another being stepped on and crushed to death; live and conscious octopuses being eaten; and animals being pitted against each other in staged fights.
Animals being deliberately placed into frightening or dangerous situations from which they cannot escape or are harmed before being “rescued” on camera is becoming increasingly popular on social media, too. For example, kittens and puppies are “rescued” from the clutches of a python. Such fake rescues not only cause immense suffering to animals, but are fraudulent because viewers are asked to donate towards the rescue and care of the animals. This cannot be allowed to continue.
Indeed, as part of its Cancel Out Cruelty campaign, the Royal Society for the Prevention of Cruelty to Animals conducted research, which found that in 2020 there were nearly 500 reports of animal cruelty on social media. That was more than twice the figure reported for 2019. The majority of these incidents appeared on Facebook. David Allen, head of prevention and education at the RSPCA, has spoken publicly about the issue, saying:
“Sadly, we have seen an increase in recent years in the number of incidents of animal cruelty being posted and shared on social media such as Facebook, Instagram, TikTok and Snapchat.”
I totally agree with the points that the hon. Lady is making. Does she agree that the way in which the Bill is structured means that illegal acts that are not designated as “priority illegal” will likely be put at the very end of companies’ to-do list and that they will focus considerably more effort on what they will call “priority illegal” content?
I completely agree with and welcome the hon. Gentleman’s contribution. It is a very valid point and one that we will explore further. It shows the necessity of this harm being classed as a priority harm in order that we protect animals, as well as people.
David Allen continued:
“We’re very concerned that the use of social media has changed the landscape of abuse with videos of animal cruelty being shared for likes and kudos with this sort of content normalising—and even making light of—animal cruelty. What’s even more worrying is the level of cruelty that can be seen in these videos, particularly as so many young people are being exposed to graphic footage of animals being beaten or killed which they otherwise would never have seen.”
Although the Bill has a clear focus on protecting children, we must remember that the prevalence of cruelty to animals online has the potential to have a hugely negative impact on children who may be inadvertently seeing that content through everyday social media channels.
Online Safety Bill (Ninth sitting) Debate
John Nicolson (Scottish National Party - Ochil and South Perthshire)
(2 years, 5 months ago)
Public Bill Committees

We are now sitting in public and the proceedings are being broadcast. Please switch electronic devices to silent. Tea and coffee are not allowed during sittings.
Clause 40
Secretary of State’s powers of direction
I beg to move amendment 84, in clause 40, page 38, line 5, leave out subsection (a).
This amendment would remove the ability of the Secretary of State to modify Ofcom codes of practice ‘for reasons of public policy’.
With this it will be convenient to discuss the following:
Clause stand part.
Clause 41 stand part.
New clause 12—Secretary of State’s powers to suggest modifications to a code of practice—
“(1) The Secretary of State may on receipt of a code write within one month of that day to OFCOM with reasoned, evidence-based suggestions for modifying the code.
(2) OFCOM shall have due regard to the Secretary of State’s letter and must reply to the Secretary of State within one month of receipt.
(3) The Secretary of State may only write to OFCOM twice under this section for each code.
(4) The Secretary of State and OFCOM shall publish their letters as soon as reasonably possible after transmission, having made any reasonable redactions for public safety and national security.
(5) If the draft of a code of practice contains modifications made following changes arising from correspondence under this section, the affirmative procedure applies.”
This new clause gives the Secretary of State powers to suggest modifications to a code of practice, as opposed to the powers of direction proposed in clause 40.
Amendment 84 is very simple: it removes one sentence—
“for reasons of public policy”.
Of all the correspondence that I have had on the Bill—there has been quite a lot—this is the clause that has most aggrieved the experts. A coalition of groups with a broad range of interests, including child safety, human rights, women and girls, sport and democracy, all agree that the Secretary of State is granted excessive powers in the Bill, and that it threatens the independence of the independent regulator. Businesses are also wary of this power, in part due to the uncertainty that it causes.
The reduction of Ministers’ powers under the Bill was advised by the Joint Committee on the draft Bill and by the Digital, Culture, Media and Sport Committee. I am sure that the two hon. Members on the Government Benches who sat on those Committees and added their names to their reports—the hon. Members for Watford and for Wolverhampton North East—will vote for the amendment. How could they possibly have put their names to the Select Committee report and the Joint Committee report and then just a few weeks later decide that they no longer support the very proposals that they had advanced?
Could the Minister inform us which special interest groups specifically have backed the Secretary of State’s public policy powers under the Bill? I am fascinated to know. Surely, all of us believe in public policy that is informed by expert evidence. If the Secretary of State cannot produce any experts at all who believe that the powers that she enjoys are appropriate or an advantage, or improve legislation, then we should not be proceeding in the way that we are. Now that I know that our proceedings are being broadcast live, I also renew my call to anyone watching who is in favour of these powers as they are to say so, because so far we have found no one who holds that position.
We should be clear about exactly what these powers do. Under clause 40, the Secretary of State can modify the draft codes of practice, thus allowing the Government a huge amount of power over the independent communications regulator. The Government have attempted to play down these powers by stating that they would be used only in exceptional circumstances. However, the legislation does not define what “exceptional circumstances” means, and it is far too nebulous a term for us to proceed under the current circumstances. Rather, a direction can reflect public policy. Will the Minister also clarify the difference between “public policy” and “government policy”, which was the wording in the draft Bill?
The regulator must not be politicised in this way. Regardless of the political complexion of the Government, when they have too much influence over what people can say online, the implications for freedom of speech are grave, especially when the content that they are regulating is not illegal. I ask the Minister to consider how he would feel if, rather than being a Conservative, the Culture Secretary came from among my friends on the Labour Benches. I would argue that that would be a significant improvement, but I imagine that the Minister would not. I see from his facial expression that that is the case.
There are ways to future-proof and enhance the transparency of Ofcom in the Bill that do not require the overreach of these powers. When we are allowing the Executive powers over the communications regulator, the protections must be absolute and iron-clad. As it stands, the Bill leaves leeway for abuse of these powers. No matter how slim a chance the Minister feels that there is of that, as parliamentarians we must not allow it. That is why I urge the Government to consider amendment 84.
As somebody who is new to these proceedings, I think it would be nice if, just for once, the Government listened to arguments and were prepared to accept them, rather than us going through this Gilbert and Sullivan pantomime where we advance arguments, we vote and we always lose. The Minister often says he agrees with us, but he still rejects whatever we say.
I can provide my hon. Friend with that reassurance on the exceptional circumstances point. The Joint Committee report was delivered in December, approximately six months ago. It was a very long report—I think it had more than 100 recommendations. Of course, members of the Committee are perfectly entitled, in relation to one or two of those recommendations, to have further discussions, listen further and adjust their views if they individually see fit.
Let me just finish this point and then I will give way. The shadow SNP spokesman, the hon. Member for Ochil and South Perthshire, asked about the Government listening and responding, and we accepted 66 of the Joint Committee’s recommendations—a Committee that he served on. We made very important changes to do with commercial pornography, for example, and fraudulent advertising. We accepted 66 recommendations, so it is fair to say we have listened a lot during the passage of this Bill. On the amendments that have been moved in Committee, often we have agreed with the amendments but the Bill has already dealt with the matter. I wanted to respond to those two points before giving way.
I am intrigued, as I am sure viewers will be. What is the new information that has come forward since December that has resulted in the Minister believing that he must stick with this? He has cited new information and new evidence, and I am dying to know what it is.
I am afraid it was not me that cited new information. It was my hon. Friend the Member for Watford who said he had had further discussions with Ministers. I am delighted to hear that he found those discussions enlightening, as I am sure they—I want to say they always are, but let us say they often are.
I will not push the amendment to a vote, but it is important to continue this conversation, and I encourage the Minister to consider the matter as the Bill proceeds. I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
I beg to move amendment 86, in clause 50, page 47, line 3, after “material” insert—
“or special interest news material”.
With this it will be convenient to discuss the following:
Amendment 87, in clause 50, page 47, line 28, leave out the first “is” and insert—
“and special interest news material are”.
Amendment 88, in clause 50, page 47, line 42, at end insert—
““special interest news material” means material consisting of news or information about a particular pastime, hobby, trade, business, industry or profession.”
In its current form, the Online Safety Bill states that platforms do not have any duties relating to content from recognised media outlets and news publishers, and the outlets’ websites are also exempt from the scope of the Bill. However, the way the Bill is drafted means that hundreds of independently regulated specialist publishers’ titles will be excluded from the protections afforded to recognised media outlets and news publishers. This will have a long-lasting and damaging effect on an indispensable element of the UK’s media ecosystem.
Specialist publishers provide unparalleled insights into areas that broader news organisations will likely not analyse, and it would surely be foolish to dismiss and damage specialist publications in a world where disinformation is becoming ever more prevalent. The former Secretary of State, the right hon. Member for Maldon (Mr Whittingdale), also raised this issue on Second Reading, where he stated that specialist publishers
“deserve the same level of protection.”—[Official Report, 19 April 2022; Vol. 712, c. 109.]
Part of the rationale for having the news publishers exemption in the Bill is that it means that the press will not be double-regulated. Special interest material is already regulated, so it should benefit from the same exemptions.
For the sake of clarity, and for the benefit of the Committee and those who are watching, could the hon. Gentleman say a bit more about what he means by specialist publications and perhaps give one or two examples to better illustrate his point?
I would be delighted to do so. I am talking about specific and occasionally niche publications. Let us take an example. Gardeners’ World is not exactly a hotbed of online harm, and nor is it a purveyor of disinformation. It explains freely which weeds to pull up and which not to, without seeking to confuse people in any way. Under the Bill, however, such publications will be needlessly subjected to rules, creating a regulatory headache for the sector. This is a minor amendment that will help many businesses, and I would be interested to hear from the Minister why the Government will not listen to the industry on this issue.
I thank the hon. Member for Ochil and South Perthshire for his amendment and his speech. I have a couple of points to make in reply. The first is that the exemption is about freedom of the press and freedom of speech. Clearly, that is most pertinent and relevant in the context of news, information and current affairs, which is the principal topic of the exemption. Were we to expand it to cover specialist magazines—he mentioned Gardeners’ World—I do not think that free speech would have the same currency when it comes to gardening as it would when people are discussing news, current affairs or public figures. The free speech argument that applies to newspapers, and to other people commenting on current affairs or public figures, does not apply in the same way to gardening and the like.
That brings me on to a second point. Only a few minutes ago, the hon. Member for Batley and Spen drew the Committee’s attention to the risks inherent in the clause that a bad actor could seek to exploit. It was reasonable of her to do so. Clearly, however, the more widely we draft the clause—if we include specialist publications such as Gardeners’ World, whose circulation will no doubt soar on the back of this debate—the greater the risk of bad actors exploiting the exemption.
My third point is about undue burdens being placed on publications. To the extent that such entities count as social media platforms—in-scope services—the most onerous duties under the Bill apply only to category 1 companies, or the very biggest firms such as Facebook and so on. The “legal but harmful” duties and many of the risk assessment duties would not apply to many organisations. In fact, I think I am right to say that if the only functionality on their websites is user comments, they would in any case be outside the scope of the Bill. I have to confess that I am not intimately familiar with the functionality of the Gardeners’ World website, but there is a good chance that if all it does is to provide the opportunity to post comments and similar things, it would be outside the scope of the Bill anyway, because it does not have the requisite functionality.
I understand the point made by the hon. Member for Ochil and South Perthshire, but we will, respectfully, resist the amendment for the many reasons I have given.
No, I will let that particular weed die in the bed. I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
Question proposed, That the clause stand part of the Bill.
Briefly, as with earlier clauses, the Labour party recognises the challenge in finding the balance between freedom of expression and keeping people safe online. Our debate on the amendment has illustrated powerfully that the exemptions as they stand in the Bill are hugely flawed.
First, the exemption is open to abuse. Almost any organisation could develop a standards code and complaints process to define itself as a news publisher and benefit from the exemption. Under those rules, as outlined eloquently by my hon. Friend the Member for Batley and Spen, Russia Today already qualifies, and various extremist publishers could easily join it. Organisations will be able to spread seriously harmful content with impunity—I referred to many in my earlier contributions, and I have paid for that online.
Secondly, the exemption is unjustified, as we heard loud and clear during the oral evidence sessions. I recall that Kyle from FairVote made that point particularly clearly. There are already rigorous safeguards in the Bill to protect freedom of expression. The fact that content is posted by a news provider should not itself be sufficient reason to treat such content differently from that which is posted by private citizens.
Furthermore, quality publications with high standards stand to miss out on the exemption. The Minister must also see the lack of parity in the broadcast media space. In order for broadcast media to benefit from the exemption, they must be regulated by Ofcom, and yet there is no parallel stipulation for non-broadcast media to be regulated in order to benefit. How is that fair? For broadcast media, the requirement to be regulated by Ofcom is simple, but for non-broadcast media, the series of requirements is not rational, excludes many independent publishers and leaves room for ambiguity.
John Nicolson (Scottish National Party - Ochil and South Perthshire)
(2 years, 5 months ago)
Public Bill Committees

I beg to move amendment 142, in schedule 7, page 183, line 11, leave out from “under” to the end of line and insert
“any of the following provisions of the Suicide Act 1961—
(a) section 2;
(b) section 3A (inserted by section (Communication offence for encouraging or assisting self-harm) of this Act).”
With this it will be convenient to discuss new clause 36—Communication offence for encouraging or assisting self-harm—
‘(1) In the Suicide Act 1961, after section 3 insert—
“3A Communication offence for encouraging or assisting self-harm
(1) A person (“A”) commits an offence if—
(a) A sends a message,
(b) the message encourages or could be used to assist another person (“B”) to inflict serious physical harm upon themselves, and
(c) A’s act was intended to encourage or assist the infliction of serious physical harm.
(2) The person referred to in subsection (1)(b) need not be a specific person (or class of persons) known to, or identified by, A.
(3) A may commit an offence under this section whether or not any person causes serious physical harm to themselves, or attempts to do so.
(4) A person guilty of an offence under this section is liable—
(a) on summary conviction, to imprisonment for a term not exceeding 12 months, or a fine, or both;
(b) on indictment, to imprisonment for a term not exceeding 5 years, or a fine, or both.
(5) “Serious physical harm” means serious injury amounting to grievous bodily harm within the meaning of the Offences Against the Person Act 1861.
(6) No proceedings shall be instituted for an offence under this section except by or with the consent of the Director of Public Prosecutions.
(7) If A arranges for a person (“A2”) to do an Act and A2 does that Act, A is also to be treated as having done that Act for the purposes of subsection (1).
(8) In proceedings for an offence to which this section applies, it shall be a defence for A to prove that—
(a) B had expressed intention to inflict serious physical harm upon themselves prior to them receiving the message from A;
(b) B’s intention to inflict serious physical harm upon themselves was not initiated by A; and
(c) the message was wholly motivated by compassion towards B or to promote the interests of B’s health or wellbeing.”’
New clause 36 seeks to criminalise the encouragement or assistance of self-harm. Before I move on to the details of the new clause, I would like to share the experience of a Samaritans supporter, who said:
“I know that every attempt my brother considered at ending his life, from his early 20s to when he died in April, aged 40, was based on extensive online research. It was all too easy for him to find step-by-step instructions so he could evaluate the effectiveness and potential impact of various approaches and, most recently, given that he had no medical background, it was purely his ability to work out the quantities of various drugs and likely impact of taking them in combination that equipped him to end his life.”
It is so easy when discussing the minutiae of the Bill to forget its real-world impact. I have worked with Samaritans on the new clause, and I use that quote with permission. It is the leading charity in trying to create a suicide-safer internet. It is axiomatic to say that suicide and self-harm have a devastating impact on people’s lives. The Bill must ensure that the online space does not aid the spreading of content that would promote this behaviour in any way.
There has rightly been much talk about how children are affected by self-harm content online. However, it should be stressed they do not exclusively suffer because of that content. Between 2011 and 2015, 151 patients who died by suicide were known to have visited websites that encouraged suicide or shared information about methods of harm, and 82% of those patients were aged over 25. It is likely that, as the Bill stands, suicide-promoting content will be covered in category 1 services, as it will be designated as harmful. Unless this amendment is passed, that content will not be covered on smaller sites, which is crucial. As Samaritans has identified, it is precisely in these smaller fora and websites that harm proliferates. The 151 patients who took their own life after visiting harmful websites may have been part of a handful of people using those sites, which would not fall under the definition of category 1, as I am sure the Minister will confirm.
The hon. Gentleman makes a very important point, which comes to the nub of a lot of the issues we face with the Bill: the issue of volume versus risk. Does he agree that one life lost to suicide is one life too many? We must do everything that we can in the Bill to prevent every single life being lost through suicide, which is the aim of his amendment.
I do, of course, agree. As anyone who has lost a family member to suicide knows, it has a lifelong effect on the family. It is yet another amendment where I feel we should depart from the pantomime of so much parliamentary procedure, where both sides fundamentally agree on things but Ministers go through the torturous process of trying to tell us that every single amendment that any outside body or any Opposition Member, whether from the SNP or the Labour party, comes up with has been considered by the ministerial team and is already incorporated or covered by the Bill. They would not be human if that were the case. Would it not be refreshing if there were a slight change in tactic, and just occasionally the Minister said, “Do you know what? That is a very good point. I think I will incorporate it into the Bill”?
None of us on the Opposition Benches seeks to make political capital out of any of the things we propose. All of us, on both sides of the House, are here with the best of intentions, to try to ensure that we get the best possible Bill. We all want to be able to vote for the Bill at the end of the day. Indeed, as I said, I have worked with two friends on the Conservative Benches—with the hon. Member for Watford on the Joint Committee on the draft Bill and with the hon. Member for Wolverhampton North East on the Select Committee on Digital, Culture, Media and Sport—and, as we know, they have both voted for various proposals. It is perhaps part of the frustration of the party system here that people are forced to go through the hoops and pretend that they do not really agree with things that they actually do agree with.
Let us try to move on with this, in a way that we have not done hitherto, and see if we can agree on amendments. We will withdraw amendments if we are genuinely convinced that they have already been considered by the Government. On the Government side, let them try to accept some of our amendments—just begin to accept some—if, as with this one, they think they have some merit.
I was talking about Samaritans, and exactly what it wants to do with the Bill. It is concerned about harmful content after the Bill is passed. This feeds into potentially the most important aspect of the Bill: it does not mandate risk assessments based exclusively on risk. By adding in the qualifications of size and scope, the Bill wilfully lets some of the most harmful content slip through its fingers—wilfully, but I am sure not deliberately. Categorisation will be covered by a later amendment, tabled by my hon. Friend the Member for Aberdeen North, so I shall not dwell on it now.
In July 2021, the Law Commission for England and Wales recommended the creation of a new narrow offence of the “encouragement or assistance” of serious self-harm with “malicious intent”. The commission identified that there is
“currently no offence that adequately addresses the encouragement of serious self-harm.”
The recommendation followed acknowledgement that
“self-harm content online is a worrying phenomenon”
and should have a
“robust fault element that targets deliberate encouragement of serious self-harm”.
Currently, there are no provisions in the Bill to create a new offence of assisting or encouraging self-harm.
In conclusion, I urge the Minister to listen not just to us but to the expert charities, including Samaritans, and to people who have lived experience of self-harm and suicide, who are calling for regulation of these dangerous sites.
Good afternoon, Sir Roger; it is a pleasure, as ever, to serve under your chairship. I rise to speak to new clause 36, which has been grouped with amendment 142 and is tabled in the names of the hon. Members for Ochil and South Perthshire and for Aberdeen North.
I, too, pay tribute to Samaritans for all the work it has done in supporting the Bill and these amendments to it. As colleagues will be aware, new clause 36 follows a recommendation from the Law Commission dating back to July 2021. The commission recommended the creation of a new, narrow offence of the “encouragement or assistance” of serious self-harm with “malicious intent”. It identified that there is
“currently no offence that adequately addresses the encouragement of serious self-harm.”
The recommendation followed acknowledgement that
“self-harm content online is a worrying phenomenon”
and should have a
“robust fault element that targets deliberate encouragement of serious self-harm”.
Currently, there are no provisions in the Bill to create a new offence of assisting or encouraging self-harm, despite the fact that other recommendations from the Law Commission report have been brought into the Bill, such as creating a new offence of cyber-flashing and prioritising tackling illegal suicide content.
We all know that harmful suicide and self-harm content is material that has the potential to cause or exacerbate self-harm and suicidal behaviours. Content relating to suicide and self-harm falls into both categories in the Bill—illegal content and legal but harmful content. Encouraging or assisting suicide is also currently a criminal offence in England and Wales under the Suicide Act 1961, as amended by the Coroners and Justice Act 2009.
Content encouraging or assisting someone to take their own life is illegal and has been included as priority illegal content in the Bill, meaning that platforms will be required to proactively and reactively prevent individuals from encountering it, and search engines will need to structure their services to minimise the risk to individuals encountering the content. Other content, including content that positions suicide as a suitable way of overcoming adversity or describes suicidal methods, is legal but harmful.
The Labour party’s Front-Bench team recognises that not all content falls neatly into the legal but harmful category. What can be helpful for one user can be extremely distressing to others. Someone may find it extremely helpful to share their personal experience of suicide, for example, and that may also be helpful to other users. However, the same material could heighten suicidal feelings and levels of distress in someone else. We recognise the complexities of the Bill and the difficulties in developing a way around this, but we should delineate harmful and helpful content relating to suicide and self-harm, and that should not detract from tackling legal but clearly harmful content.
In its current form, the Bill will continue to allow legal but clearly harmful suicide and self-harm content to be accessed by over-18s. Category 1 platforms, which have the highest reach and functionality, will be required to carry out risk assessments of, and set out in their terms and conditions their approach to, legal but harmful content in relation to over-18s. As the hon. Member for Ochil and South Perthshire outlined, however, the Bill’s impact assessment states that “less than 0.001%” of in-scope platforms
“are estimated to meet the Category 1 and 2A thresholds”,
and estimates that only 20 platforms will be required to fulfil category 1 obligations. There is no requirement on the smaller platforms, including those that actively encourage suicide, to do anything at all to protect over-18s. That simply is not good enough. That is why the Labour party supports new clause 36, and we urge the Minister to do the right thing by joining us.
I am grateful for the element of gender balance that the hon. Member has introduced, and she is right to highlight the suicide risk. Inciting suicide is already a criminal offence under section 2 of the Suicide Act 1961 and we have named it a priority offence. Indeed, it is the first priority offence listed under schedule 7—it appears a third of the way down page 183—for exactly the reason she cited, and a proactive duty is imposed on companies by paragraph 1 of schedule 7.
On amendment 142 and the attendant new clause 36, the Government agree with the sentiment behind them—namely, the creation of a new offence of encouraging or assisting serious self-harm. We agree with the substance of the proposal from the hon. Member for Ochil and South Perthshire. As he acknowledged, the matter is under final consideration by the Law Commission and our colleagues in the Ministry of Justice. The offence initially proposed by the Law Commission was wider in scope than that proposed under new clause 36. The commission’s proposed offence covered the offline world, as well as the online one. For example, the new clause as drafted would not cover assisting a person to self-harm by providing them with a bladed article because that is not an online communication. The offence that the Law Commission is looking at is broader in scope.
The Government have agreed in principle to create an offence based on the Law Commission recommendation in separate legislation, and once that is done the scope of the new offence will be wider than that proposed in the new clause. Rather than adding the new clause and the proposed limited new offence to this Bill, I ask that we implement the offence recommended by the Law Commission, the wider scope of which covers the offline world as well as the online world, in separate legislation. I would be happy to make representations to my colleagues in Government, particularly in the MOJ, to seek clarification about the relevant timing, because it is reasonable to expect it to be implemented sooner rather than later. Rather than rushing to introduce that offence with limited scope under the Bill, I ask that we do it properly as per the Law Commission recommendation.
Once the Law Commission recommendation is enacted in separate legislation, to which the Government have already agreed in principle, it will immediately flow through automatically to be incorporated into clause 52(4)(d), which relates to illegal content, and under clause 176, the Secretary of State may, subject to parliamentary approval, designate the new offence as a priority offence under schedule 7 via a statutory instrument. The purpose of amendment 142 can therefore be achieved through an SI.
The Government entirely agree with the intention behind the proposed new clause 36, but I think the way to do this is to implement the full Law Commission offence as soon as we can and then, if appropriate, add it to schedule 7 by SI. The Government agree with the spirit of the hon. Gentleman’s proposal, but I believe that the Government already have a plan to do a more complete job to create the new offence.
I have nothing to add and, having consulted my hon. Friend the Member for Aberdeen North, on the basis of the Minister’s assurances, I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
I beg to move amendment 116, in schedule 7, page 183, line 11, at end insert—
“1A An offence under section 13 of the Criminal Justice Act (Northern Ireland) 1966 (c. 20 (N.I.)) (assisting suicide etc).”
This amendment adds the specified offence to Schedule 7, with the effect that content amounting to that offence counts as priority illegal content.
I beg to move amendment 90, in schedule 7, page 185, line 39, at end insert—
“Human trafficking
22A An offence under section 2 of the Modern Slavery Act 2015.”
This amendment would designate human trafficking as a priority offence.
Our amendment seeks to deal explicitly with what Meta and other companies refer to as “domestic servitude”, which we know better as human trafficking. This abhorrent practice has sadly been part of our society for hundreds if not thousands of years, and today, human traffickers are aided by various apps and platforms. The same platforms that connect us with old friends and family across the globe have been hijacked by the very worst people in our world, who are using them to create networks of criminal enterprise, none more cruel than human trafficking.
Investigations by the BBC and The Wall Street Journal have uncovered how traffickers use Instagram, Facebook and WhatsApp to advertise, sell, and co-ordinate the trafficking of young women. One would think that this issue would be of the utmost importance to Meta—Facebook, as it was at the time—yet, as the BBC reported,
“the social media giant only took ‘limited action’ until ‘Apple Inc. threatened to remove Facebook’s products from the App Store, unless it cracked down on the practice’.”
Those of us who have sat on the DCMS Committee and the Joint Committee on the draft Bill—I and my friends across the aisle, the hon. Members for Wolverhampton North East and for Watford—know exactly what it is like to have Facebook’s high heid yins before you. They will do absolutely nothing to respond to legitimate pressure. They understand only one thing: the force of law and of financial penalty. Only when its profits were in danger did Meta take the issue seriously.
The omission of human trafficking from schedule 7 is especially worrying because if it is not directly addressed as priority illegal content, we can be certain that it will not be prioritised by the platforms. We know that from their previous behaviour.
Can my hon. Friend see any reason—I am baffled by this—why the Government would leave out human trafficking? Can he imagine any justification that the Minister could possibly have for suggesting that it is not a priority offence, given the Conservative party’s stated aims and, to be fair, previous action in respect of, for example, the Modern Slavery Act 2015?
It is an interesting question. Alas, I long ago stopped trying to put myself into the minds of Conservative Ministers—a scary place for any of us to be.
We understand that it is difficult to try to regulate in respect of human trafficking on platforms. It requires work across borders and platforms, with moderators speaking different languages. We established that Facebook does not have moderators who speak different languages. On the Joint Committee on the draft Bill, we discovered that Facebook does not moderate content in English to any adequate degree. Just look at the other languages around the world—do we think Facebook has moderators who work in Turkish, Finnish, Swedish, Icelandic or a plethora of other languages? It certainly does not. The only language that Facebook tries to moderate—deeply inadequately, as we know—is English. We know how bad the moderation is in English, so can the Committee imagine what it is like in some of the world’s other languages? The most terrifying things are allowed to happen without moderation.
Regulating in respect of human trafficking on platforms is not cheap or easy, but it is utterly essential. The social media companies make enormous amounts of money, so let us shed no tears for them and the costs that will be entailed. If human trafficking is not designated a priority harm, I fear it will fall by the wayside, so I must ask the Minister: is human trafficking covered by another provision on priority illegal content? Like my hon. Friend the Member for Aberdeen North, I cannot see where in the Bill that lies. If the answer is yes, why are the human rights groups not satisfied with the explanation? What reassurance can the Minister give to the experts in the field? Why not add a direct reference to the Modern Slavery Act, as in the amendment?
If the answer to my question is no, I imagine the Minister will inform us that the Bill requires platforms to consider all illegal content. In what world is human trafficking that is facilitated online not a priority? Platforms must be forced to be proactive on this issue; if not, I fear that human trafficking, like so much that is non-priority illegal content, will not receive the attention it deserves.
Schedule 7 sets out the list of criminal content that in-scope firms will be required to remove as a priority. Labour was pleased to see new additions to the most recent iteration, including criminal content relating to online drug and weapons dealing, people smuggling, revenge porn, fraud, promoting suicide and inciting or controlling prostitution for gain. The Government’s consultation response suggests that the systems and processes that services may use to minimise illegal or harmful content could include user tools, content moderation and recommendation procedures.
More widely, although we appreciate that the establishment of priority offences online is the route the Government have chosen to go down with the Bill, we believe it remains weak in addressing harms to adults and wider societal harms, and it has seemingly missed a number of known harms to both adults and children—a serious omission. Three years on from the White Paper, the Government know where the gaps are, yet they have failed to address them. That is why we are pleased to support the amendment tabled by the hon. Members for Ochil and South Perthshire and for Aberdeen North.
Human trafficking offences are a serious omission from schedule 7 that must urgently be rectified. As we all know from whistleblower Frances Haugen’s revelations, Facebook stands accused, among a vast array of social problems, of profiting from the trade and sale of human beings—often for domestic servitude—by human traffickers. We also know that, according to internal documents, the company has been aware of the problems since at least 2018. As the hon. Member for Ochil and South Perthshire said, we know that a year later, on the heels of a BBC report that documented the practice, the problem was said to be so severe that Apple itself threatened to pull Facebook and Instagram from its app store. It was only then that Facebook rushed to remove content related to human trafficking and made emergency internal policy changes to avoid commercial consequences described as “potentially severe” by the company. However, an internal company report detailed that the company did not take action prior to public disclosure and threats from Apple—profit over people.
In a complaint to the US Securities and Exchange Commission first reported by The Wall Street Journal, whistleblower Haugen wrote:
“Investors would have been very interested to learn the truth about Facebook almost losing access to the Apple App Store because of its failure to stop human trafficking on its products.”
I cannot believe that the Government have failed to commit to doing more to tackle such abhorrent practices, which are happening every day. I therefore urge the Minister to do the right thing and support amendment 90.
I am happy to give that further consideration, but please do not interpret that as a firm commitment. I repeat that the Modern Slavery Act is brought into the scope of this Bill via clause 52(4)(d).
I have nothing further to add. I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
Schedule 7, as amended, agreed to.
Clause 53
“Content that is harmful to children” etc
I have had no indication that anybody wishes to move Carla Lockhart’s amendment 98—she is not a member of the Committee.
Question proposed, That the clause stand part of the Bill.
The hon. Member is asking me a somewhat technical question, and I hesitate to answer without taking full advice, but I think the answer is yes. The reason that loot boxes are not considered gambling in our view is that they do not have a monetary value, so the exclusion in clause 53(5)(b)(i) does not apply. On a quick off-the-cuff reading, it does not strike me immediately that the exclusions in (5)(b)(ii) or (iii) would apply to loot boxes either, so I believe—and officials who know more about this than I do are nodding—that the hon. Lady is right to say that it would be possible for loot boxes to become primary priority content or priority content by way of a statutory instrument. Yes, my belief is that that would be possible.
Question put and agreed to.
Clause 53 accordingly ordered to stand part of the Bill.
Clause 54
“Content that is harmful to children” etc
I beg to move amendment 83, in clause 54, page 50, line 39, at end insert—
“(2A) Priority content designated under subsection (2) must include content that contains health-related misinformation and disinformation, where such content is harmful to adults.”
This amendment would amend Clause 54 so that the Secretary of State’s designation of “priority content that is harmful to adults” must include a description of harmful health related misinformation or disinformation (as well as other priority content that might be designated in regulations by the Secretary of State).
The Bill requires category 1 service providers to set out how they will tackle harmful content on their platforms. In order for this to work, certain legal but harmful content must be designated in secondary legislation as
“priority content that is harmful to adults.”
As yet, however, it is not known what will be designated as priority content or when. There have been indications from the Government that health-related misinformation and disinformation will likely be included, but there is no certainty. The amendment would ensure that harmful health-related misinformation and disinformation would be designated as priority content that is harmful to adults.
I am grateful to the shadow Minister for confirming her support for free speech. Perhaps I could take this opportunity to apologise to you, Sir Roger, and to Hansard for turning round. I will try to behave better in future.
I find myself not entirely reassured, so I think we should press the amendment to a vote.
Question put, That the amendment be made.
(2 years, 5 months ago)
Public Bill Committees

The reason the new clause is drafted in that way is that I wanted to recognise the work of the Joint Committee and to take on board its recommendations. If it had been entirely my drafting, the House of Lords would certainly not have been involved, given that I am not the biggest fan of the House of Lords, as its Members are not elected. However, the decision was made to submit the new clause as drafted.
The Minister has said that the Government have not come to a settled view yet, which I take to mean that he is not saying no. He is not standing up and saying, “No, we will definitely not have a Standing Committee.” I am not suggesting he is saying yes, but given that he is not saying no, I am happy to withdraw the new clause. If the Minister is keen to come forward at a future stage with suggestions for changes to Standing Orders, which I understand have to be introduced by the Leader of the House or the Cabinet Office, such suggestions would be gladly heard on this side of the House. I beg to ask leave to withdraw the motion.
Clause, by leave, withdrawn.
New Clause 38
Adults’ risk assessment duties
“(1) This section sets out duties which apply in relation to internet services within section 67(2).
(2) A duty to take appropriate steps to keep an adults’ risk assessment up to date, including when OFCOM makes any significant change to a risk profile that relates to services of the kind in question.
(3) Before making any significant change to any aspect of a service’s design or operation, a duty to carry out a further suitable and sufficient adults’ risk assessment relating to the impacts of that proposed change.
(4) A duty to make and keep a written record, in an easily understandable form, of every risk assessment under subsections (2) and (3).
(5) An “adults’ risk assessment” of a service of a particular kind means an assessment of the following matters, taking into account the risk profile that relates to services of that kind—
(a) the user base;
(b) the level of risk of adults who are users of the service encountering, by means of the service, each kind of priority content that is harmful to adults (with each kind separately assessed), taking into account (in particular) algorithms used by the service, and how easily, quickly and widely content may be disseminated by means of the service;
(c) the level of risk of harm to adults presented by different kinds of priority content that is harmful to adults;
(d) the level of risk of harm to adults presented by priority content that is harmful to adults which particularly affects individuals with a certain characteristic or members of a certain group;
(e) the level of risk of functionalities of the service facilitating the presence or dissemination of priority content that is harmful to adults, identifying and assessing those functionalities that present higher levels of risk;
(f) the different ways in which the service is used, and the impact of such use on the level of risk of harm that might be suffered by adults;
(g) the nature, and severity, of the harm that might be suffered by adults from the matters identified in accordance with paragraphs (b) to (f);
(h) how the design and operation of the service (including the business model, governance, use of proactive technology, measures to promote users’ media literacy and safe use of the service, and other systems and processes) may reduce or increase the risks identified.
(6) In this section references to risk profiles are to the risk profiles for the time being published under section 83 which relate to the risk of harm to adults presented by priority content that is harmful to adults.
(7) The provisions of Schedule 3 apply to any assessment carried out under this section in the same way they apply to any assessment relating to a Part 3 service.”—(John Nicolson.)
This new clause applies adults’ risk assessment duties to pornographic sites.
Brought up, and read the First time.
I beg to move, That the clause be read a Second time.
With this it will be convenient to discuss the following:
New clause 39—Safety duties protecting adults—
“(1) This section sets out duties which apply in relation to internet services within section 67(2).
(2) A duty to summarise in the terms of service the findings of the most recent adults’ risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to adults).
(3) A duty to include provisions in the terms of service specifying, in relation to each kind of priority content that is harmful to adults that is to be treated in a way described in subsection (4), which of those kinds of treatment is to be applied.
(4) These are the kinds of treatment of content referred to in subsection (3)—
(a) taking down the content;
(b) restricting users’ access to the content.
(5) A duty to explain in the terms of service the provider’s response to the risks relating to priority content that is harmful to adults (as identified in the most recent adults’ risk assessment of the service), by reference to—
(a) any provisions of the terms of service included in compliance with the duty set out in subsection (3), and
(b) any other provisions of the terms of service designed to mitigate or manage those risks.
(6) If provisions are included in the terms of service in compliance with the duty set out in subsection (3), a duty to ensure that those provisions—
(a) are clear and accessible, and
(b) are applied consistently in relation to content which the provider reasonably considers is priority content that is harmful to adults or a particular kind of priority content that is harmful to adults.
(7) If the provider of a service becomes aware of any non-designated content that is harmful to adults present on the service, a duty to notify OFCOM of—
(a) the kinds of such content identified, and
(b) the incidence of those kinds of content on the service.
(8) In this section—
“adults’ risk assessment” has the meaning given by section 12;
“non-designated content that is harmful to adults” means content that is harmful to adults other than priority content that is harmful to adults.”
This new clause applies safety duties protecting adults to regulated provider pornographic content.
New clause 40—Duties to prevent users from encountering illegal content—
“(1) This section sets out duties which apply in relation to internet services within section 67(2).
(2) A duty to operate an internet service using proportionate systems and processes designed to—
(a) prevent individuals from encountering priority illegal content that amounts to an offence in either Schedule 6 or paragraphs 17 and 18 of Schedule 7 by means of the service;
(b) minimise the length of time for which the priority illegal content referred to in paragraph (a) is present;
(c) where the provider is alerted by a person to the presence of the illegal content referred to in paragraph (a), or becomes aware of it in any other way, swiftly take down such content.
(3) A duty to operate systems and processes that—
(a) verify the identity and age of all persons depicted in the content;
(b) obtain and keep on record written consent from all persons depicted in the content;
(c) only permit content uploads from verified content providers and must have a robust process for verifying the age and identity of the content provider;
(d) all uploaded content must be reviewed before publication to ensure that the content is not illegal and does not otherwise violate its terms of service;
(e) uploaded content must not be marketed by content search terms that give the impression that the content contains child exploitation materials or the depiction of non-consensual activities;
(f) the service must offer the ability for any person depicted in the content to appeal to remove the content in question.”
This new clause applies duties to prevent users from encountering illegal content to regulated providers of pornographic content.
Big porn, or the global online pornography industry, is a proven driver of big harms. It causes the spread of image-based sexual abuse and child sexual abuse material. It normalises sexual violence and harmful sexual attitudes and behaviours, and it offers children easy access to violent, sexist and racist sexual content, which is proven to cause them a whole range of harms. In part, the Government recognised how harmful pornography can be to children by building one small aspect of pornography regulation into the Bill.
The Bill is our best chance to regulate the online pornography industry, which it currently does not mention. Over two decades, the porn industry has shown that it cannot be trusted to regulate itself. Vanessa Morse, the head of the Centre to End All Sexual Exploitation, said:
“If we fail to see the porn industry as it really is, efforts to regulate will flounder.”
If the Minister has not yet read CEASE’s “Expose Big Porn” report, I recommend that he does so. The report details some of the harrowing harms that are proliferated by porn companies. Importantly, these harms are being done with almost zero scrutiny. We all know who the head of Meta or the chief executive officer of Google is, but can the Minister tell me who is in charge of MindGeek? This company dominates the market, yet it is almost completely anonymous—or at least the high heid yins of the company are.
New clause 38 seeks to identify pornography websites as providers of category 1 services, introduce a relevant code of practice and designate a specific regulator, in order to ensure compliance. Big porn must be made to stop hosting illegal extreme porn and the legal but harmful content prohibited by its own terms of service. If anyone thought that social media platforms were indifferent to the harms taking place on their sites, they pale in comparison with porn sites, which will do the absolute minimum that they can. To show the extent of the horrible searches allowed, one video found by CEASE was titled “Oriental slave girl tortured”. I will not read out some of the other titles in the report, but there are search terms that promote non-consensual activity, violence, incest and racial slurs. For example, “Ebony slave girl” is a permitted term. This is just one of the many examples of damaging content on porn sites, which are perpetuating horrific sexual practices that, sadly, are too often being viewed by children.
Over 80% of the UK public would support strict new porn laws. I really think there is an appetite among the public to introduce such laws. The UK Government must not pass up this opportunity to regulate big porn, which is long overdue.
Online Safety Bill Debate
(2 years, 4 months ago)
Commons Chamber

I rise to speak to the amendments in my name and those of other right hon. and hon. Members. I welcome the Minister to his place after his much-deserved promotion; as other hon. Members have said, it is great to have somebody who is both passionate and informed as a Minister. I also pay tribute to the hon. Member for Croydon South (Chris Philp), who is sitting on the Back Benches: he worked incredibly hard on the Bill, displayed a mastery of detail throughout the process and was extremely courteous in his dealings with us. I hope that he will be speedily reshuffled back to the Front Bench, which would be much deserved—but obviously not that he should replace the Minister, who I hope will remain in his current position or indeed be elevated from it.
But enough of all this souking, as we say north of the border. As one can see from the number of amendments tabled, the Bill is not only an enormous piece of legislation but a very complex one. Its aims are admirable—there is no reason why this country should not be the safest place in the world to be online—but a glance through the amendments shows how many holes hon. Members think it still has.
The Government have taken some suggestions on board. I welcome the fact that they have finally legislated outright to stop the wicked people who attempt to trigger epileptic seizures by sending flashing gifs; I did not believe that such cruelty was possible until I was briefed about it in preparation for debates on the Bill. I pay particular tribute to wee Zach, whose name is often attached to what has been called Zach’s law.
The amendments to the Bill show that there has been a great deal of cross-party consensus on some issues, on which it has been a pleasure to work with friends in the Labour party. The first issue is addressed, in various ways, by amendments 44 to 46, 13, 14, 21 and 22, which all try to reduce the Secretary of State’s powers under the Bill. In all the correspondence that I have had about the Bill, and I have had a lot, that is the area that has most aggrieved the experts. A coalition of groups with a broad range of interests, including child safety, human rights, women and girls, sport and democracy, all agree that the Secretary of State is granted too many powers under the Bill, which threatens the independence of the regulator. Businesses are also wary of the powers, in part because they cause uncertainty.
The reduction of ministerial powers under the Bill was advised by the Joint Committee on the Draft Online Safety Bill and by the Select Committee on Digital, Culture, Media and Sport, on both of which I served. In Committee, I asked the then Minister whether any stakeholder had come forward in favour of these powers. None had.
Even DCMS Ministers do not agree with the powers. The new Minister was Chair of the Joint Committee, and his Committee’s report said:
“The powers for the Secretary of State to a) modify Codes of Practice to reflect Government policy and b) give guidance to Ofcom give too much power to interfere in Ofcom’s independence and should be removed.”
The Government have made certain concessions with respect to the powers, but they do not go far enough. As the Minister said, the powers should be removed.
We should be clear about exactly what the powers do. Under clause 40, the Secretary of State can
“modify a draft of a code of practice”.
That allows the Government a huge amount of power over the so-called independent communications regulator. I am glad that the Government have listened to the suggestions that my colleagues and I made on Second Reading and in Committee, and have committed to using the power only in “exceptional circumstances” and to further defining “public policy” motives. But “exceptional circumstances” is still too opaque and nebulous a phrase. What exactly does it mean? We do not know. It is not defined—probably intentionally.
The regulator must not be politicised in this way. Several similar pieces of legislation are going through their respective Parliaments or are already in force. In Germany, Australia, Canada, Ireland and the EU, with the Digital Services Act, different Governments have grappled with the issue of making digital regulation future-proof and flexible. None of them has added political powers. The Bill is sadly unique in making such provision.
When a Government have too much influence over what people can say online, the implications for freedom of speech are particularly troubling, especially when the content that they are regulating is not illegal. There are ways to future-proof and enhance the transparency of Ofcom in the Bill that do not require the overreach that these powers give. When we allow the Executive powers over the communications regulator, the protections must be absolute and iron-clad, but as the Bill stands, it gives leeway for abuse of those powers. No matter how slim the Minister feels the chance of that may be, as parliamentarians we must not allow it.
Amendment 187 on human trafficking is an example of a relatively minor change to the Bill that could make a huge difference to people online. Our amendment seeks to deal explicitly with what Meta and other companies refer to as domestic servitude, which is very newsworthy, today of all days, and which we know better as human trafficking. Sadly, this abhorrent practice has been part of our society for hundreds if not thousands of years. Today, human traffickers are aided by various apps and platforms. The same platforms that connect us with old friends and family across the globe have been hijacked by the very worst people in our world, who are using them to create networks of criminal enterprise, none more cruel than human trafficking.
Investigations by the BBC and The Wall Street Journal have uncovered how traffickers use Instagram, Facebook and WhatsApp to advertise, sell and co-ordinate the trafficking of young women. One would have thought that the issue would be of the utmost importance to Meta—Facebook, as it was at the time—yet, as the BBC reported, The Wall Street Journal found that
“the social media giant only took ‘limited action’ until ‘Apple Inc. threatened to remove Facebook’s products from the App Store, unless it cracked down on the practice’.”
I and my friends across the aisle who sat on the DCMS Committee and the Joint Committee on the draft Bill know exactly what it is like to have Facebook’s high heid yins before us. They will do absolutely nothing to respond to legitimate pressure. They understand only one thing: the force of law and of financial penalty. Only when its profits were in danger did Meta take the issue seriously.
The omission of human trafficking from schedule 7 is especially worrying, because if human trafficking is not directly addressed as priority illegal content, we can be certain that it will not be prioritised by the platforms. We know from their previous behaviour that the platforms never do anything that will cost them money unless they are forced to do so. We understand that it is difficult to regulate in respect of human trafficking on platforms: it requires work across borders and platforms, with moderators speaking different languages. It is not cheap or easy, but it is utterly essential. The social media companies make enormous amounts of money, so let us shed no tears for them and for the costs that will be entailed. If human trafficking is not designated as a priority harm, I fear that it will fall by the wayside.
In Committee, the then Minister said that the relevant legislation was covered by other parts of the Bill and that it was not necessary to incorporate offences under the Modern Slavery Act 2015 into priority illegal content. He referred to the complexity of offences such as modern slavery, and said that the priority offences covering illegal immigration and prostitution might already capture such conduct. That is simply not good enough. Human traffickers use platforms as part of their arsenal at every stage of the process, from luring in victims to co-ordinating their movements and threatening their families. The largest platforms have ample capacity to tackle these problems and must be forced to be proactive. The consequences of inaction will be grave.
It is a pleasure to follow the hon. Member for Ochil and South Perthshire (John Nicolson).
Let me begin by repeating my earlier congratulations to my hon. Friend the Member for Folkestone and Hythe (Damian Collins) on assuming his place on the Front Bench. Let me also take this opportunity to extend my thanks to those who served on the Bill Committee with me for some 50 sitting hours—it was, generally speaking, a great pleasure—and, having stepped down from the Front Bench, to thank the civil servants who have worked so hard on the Bill, in some cases over many years.
I follow that point. I will channel, with some effort, the hon. Member for Birmingham, Yardley (Jess Phillips), who I suspect would say that these things are already up for debate and discussed in other contexts—the ability to distinguish between art and pornography is something that we have wrestled with in other media. Actually, in relation to the Bill, I think that one of our guiding principles ought to be that we do not reinvent the wheel where we do not have to, and that we seek to apply to the online world the principles and approaches that we would expect in all other environments. That is probably the answer to my hon. Friend’s point.
I think it is very important that we recognise the need for platforms to do all they can to ensure that the wrong type of material does not reach vulnerable users, even if that material is a brief part of a fairly long piece. Those, of course, are exactly the principles that we apply to the classification of films and television. It may well be that a small portion of a programme constitutes material that is unsuitable for a child, but we would still seek to put it the wrong side of the 9 o’clock watershed or use whatever methods we think the regulator ought to adopt to ensure that children do not see it.
Good points are being made. The practicalities are important; it may be that because of a lack of available time and effort in this place, we have to resolve those elsewhere.
I wish to speak to new clause 33, my proposed new schedule 1 and amendments 201 to 203. I notice that the Secretary of State is off again. I place on record my thanks to Naomi Miles of CEASE—the Centre to End All Sexual Exploitation—and Ceri Finnegan of Barnardos for their support.
The UK Government have taken some steps to strengthen protections on pornography and I welcome the fact that young teenagers will no longer be able to access pornography online. However, huge quantities of extreme and harmful pornography remain online, and we need to address the damage that it does. New clause 33 would seek to create parity between online and offline content—consistent legal standards for pornography. It includes a comprehensive definition of pornography and puts a duty on websites not to host content that would fail to attain the British Board of Film Classification’s R18 standard.
The point of the Bill, as the Minister has repeatedly said, is to make the online world a safer place, by doing what we all agree must be done—making what is illegal offline, illegal online. That is why so many Members think that the lack of regulation around pornography is a major omission in the Bill.
The new clause stipulates age and consent checks for anyone featured in pornographic content. It addresses the proliferation of pornographic content that is both illegal and harmful, protecting women, children and minorities on both sides of the camera.
The Bill presents an opportunity to end the proliferation of illegal and harmful content on the internet. Representations of sexual violence, animal abuse, incest, rape, coercion, abuse and exploitation—particularly directed towards women and children—are rife. Such content can normalise dangerous and abusive acts and attitudes, leading to real-world harm. As my hon. Friend the Member for Pontypridd (Alex Davies-Jones) said in her eloquent speech earlier, we are seeing an epidemic of violence against women and girls online. When bile and hatred are so prolific online, they bleed into the offline space. There are real-world harms that flow from that.
The Minister has said how much of a priority tackling violence against women and girls is for him. Knowing that, and knowing him, he will understand that pornography is always harmful to children, and certain kinds of pornographic content are also potentially harmful to adults. Under the Video Recordings Act 1984, the BBFC has responsibility for classifying pornographic content to ensure that it is not illegal, and that it does not promote an interest in abusive relationships, such as incest. Nor can it promote acts likely to cause serious physical harm, such as breath restriction or strangulation. In the United Kingdom, it is against the law to supply pornographic material that does not meet this established BBFC classification standard, but there is no equivalent standard in the online world because the internet evolved without equivalent regulatory oversight.
I know too that the Minister is determined to tackle some of the abusive and dangerous pornographic content online. The Bill does include a definition of pornography, in clause 66(2), but that definition is inadequate; it is too brief and narrow in scope. In my amendment, I propose a tighter and more comprehensive definition, based on that in part 3 of the Digital Economy Act 2017, which was debated in this place and passed into law. The amendment will remove ambiguity and prevent confusion, ensuring that all websites know where they stand with regard to the law.
The new duty on pornographic websites aligns with the UK Government’s 2020 legislation regulating UK-established video-sharing platforms and video-on-demand services, both of which appeal to the BBFC’s R18 classification standards. The same “high standard of rules in place to protect audiences”, as the 2020 legislation put it, and “certain content standards” should apply equally to online pornography and offline pornography, UK-established video-sharing platforms and video-on-demand services.
Let me give some examples sent to me by Barnardo’s, the children’s charity, which, with CEASE, has done incredibly important work in this area. The names have been changed in these examples, for obvious reasons.
“There are also children who view pornography to try to understand their own sexual abuse. Unfortunately, what these children find is content that normalises the most abhorrent and illegal behaviours, such as 15-year-old Elizabeth, who has been sexually abused by a much older relative for a number of years. The content she found on pornography sites depicted older relatives having sex with young girls and the girls enjoying it. It wasn’t until she disclosed her abuse that she realised that it was not normal.
Carrie is a 16-year-old who was being sexually abused by her stepfather. She thought this was not unusual due to the significant amount of content she had seen on pornography sites showing sexual relationships within stepfamilies.”
That is deeply disturbing evidence from Barnardo’s.
Although in theory the Bill will prevent under-18s from accessing such content, the Minister knows that under-18s will be able to bypass regulation through technology like VPNs, as the DCMS Committee and the Bill Committee—I served on both—were told by experts in various evidence sessions. The amendment does not create a new law; it merely moves existing laws into the online space. There is good cause to regulate and sometimes prohibit certain damaging offline content; I believe it is now our duty to provide consistency with legislation in the online world.
I want to talk about several things, but particularly new clause 7. I am really pleased that the new clause has come back on Report, as we discussed it in the Bill Committee but unfortunately did not get enough support for it there—as was the case with everything we proposed—so I thank the right hon. Member for Kingston upon Hull North (Dame Diana Johnson) for tabling it. I also thank my hon. Friend the Member for Inverclyde (Ronnie Cowan) for his lobbying and for providing us with lots of background information. I agree that it is incredibly important that new clause 7 is agreed, particularly the provisions on consent and making sure that participants are of an appropriate age to be taking part. We have heard so many stories of so many people whose videos are online—whose bodies are online—and there is nothing they can do about it because of the lack of regulation. My hon. Friend the Member for Ochil and South Perthshire (John Nicolson) has covered new clause 33 in an awful lot of detail—very good detail—so I will not comment on that.
The right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) mentioned how we need to get the balance right, and specifically talked about the role of the regulator. In many ways, this Bill has failed to get the balance right in its attempts to protect children online. Many people who have been involved in writing this Bill, talking about this Bill, scrutinising this Bill and taking part in every piece of work that we have done around it do not understand how children use the internet. Some people do, absolutely, but far too many of the people who have had any involvement in this Bill do not. They do not understand the massive benefits to children of using the internet, the immense amount of fun they can have playing Fortnite, Fall Guys, Minecraft, or whatever it is they happen to be playing online and how important that is to them in today’s crazy world with all of the social media pressures. Children need to decompress. This is a great place for children to have fun—to have a wonderful time—but they need to be protected, just as we would protect them going out to play in the park, just the same as we would protect them in all other areas of life. We have a legal age for smoking, for example. We need to make sure that the protections are in place, and the protections that are in place need to be stronger than the ones that are currently in the Bill.
I did not have a chance earlier—or I do not think I did—to support the clause about violence against women and girls. As I said in Committee, I absolutely support that being in the Bill. The Government may say, “Oh we don’t need to have this in the Bill because it runs through everything,” but having that written in the Bill would make it clear to internet service providers—to all those people providing services online and having user-generated content on their sites—how important this is and how much of a scourge it is. Young women who spend their time on social media are more likely to have lower outcomes in life as a result of problematic social media use and the pain and suffering it causes. We should be putting such a measure in the Bill, and I will continue to argue for that.
We have talked a lot about pornographic content in this section. There is not enough futureproofing in the Bill. My hon. Friend the Member for Ochil and South Perthshire and I tabled amendment 158 because we are concerned about that lack of futureproofing. The amendment edits the definition of “content”. The current definition of “content” says basically anything online, and it includes a list of stuff. We have suggested that it should say “including but not limited to”, on the basis that we do not know what the internet will look like in two years’ time, let alone what it will look like in 20 years’ time. If this Bill is to stand the test of time, it needs to be clear that that list is not exhaustive. It needs to be clear that, when we are getting into virtual reality metaverses where people are meeting each other, that counts as well. It needs to be clear that the sex dungeon that exists in the child’s game Roblox is an issue—that that content is an issue no matter whether it fits the definition of “content” or whether it fits the fact that it is written communication, images or whatever. It does not need to fit any of that. If it is anything harmful that children can find on the internet, it should be included in that definition of “content”, no matter whether it fits any of those specific categories. We just do not know what the internet is going to look like.
I have one other specific thing in relation to the issues of content and pornography. One of the biggest concerns that we heard is the massive increase in the amount of self-generated child sexual abuse images. A significant number of new images of child sexual abuse are self-generated. Everybody has a camera phone these days. Kids have camera phones these days. They have much more potential to get themselves into really uncomfortable and difficult situations than when most of us were younger. There is so much potential for that to be manipulated unless we get this right.
Online Safety Bill Debate
(2 years ago)
Commons Chamber

I thank my right hon. Friend for her question, which I have previously addressed. The problem is the precedent it would set. Any special Committee set up by a Bill would be appointed by the Whips, so we might as well forget about the Select Committee system. This is not a huge concern for the Digital, Culture, Media and Sport Committee, because the advent of any such special Committee would probably be beyond the next general election, and I am not thinking to that timeframe. I am concerned about the integrity of Parliament. The problem is that if we do that in this Bill, the next Government will come along and do it with another Bill and then another Bill. Before we know it, we will have a Select Committee system that is Whips-appointed and narrow in definition, and that cuts across something we all vote for.
There are means by which we can have legislative scrutiny—that is the point I am making in my speech. I would very much welcome a Committee being set up after a year, temporarily, to carry out post-legislative scrutiny. My Committee has a Sub-Committee on disinformation and fake news, which could also look at this Bill going forward. So I do not accept my right hon. Friend’s point, but I appreciate completely the concerns about our needing proper scrutiny in this area. We must also not forget that any changes to Ofcom’s parameters can be put in a statutory instrument, which can be prayed against by the Opposition, and thus we would have the scrutiny of the whole House in debate, which is preferable to having a Whips-appointed Committee.
I have gone into quite a bit of my speech there, so I am grateful for that intervention in many respects. I am not going to touch on every aspect of this issue, but I urge right hon. and hon. Members in all parts of the House to think about the fact that although this is far from perfect legislation and it is a shame that we have not found a way to work through the legal but harmful material issue, we have to understand the parameters we are working in, in the real world, with these companies. We need to see that there is a patchwork of legislation, and the biggest way in which we can effectively let the social media companies know they have skin in the game in society—a liberal society that created them—is through competition legislation, across other countries and other jurisdictions. I am talking about our friends in the European Union and in the United States. We are working together closely now to come up with a suite of competition legislation. That is how we will be able to cover off some of this going forward. I will be supporting this Bill tonight and I urge everyone to do so, because, frankly, after five years I have had enough.
I rise to speak to the amendments in my name and those of my right hon. and hon. Friends, which of course I support.
It is welcome to see the Online Safety Bill back in the House. As we have debated this Bill and nursed it, as in my case, through both the Bill Committee and the Joint Committee, we have shone a light into some dark corners and heard some deeply harrowing stories. Who can forget the testimony given to us by Molly Russell’s dad, Ian? As we have heard, in the Public Gallery we have bereaved families who have experienced the most profound losses due to the extreme online harms to which their loved ones have been exposed; representatives of those families are watching the proceedings today. The hon. Member for Pontypridd (Alex Davies-Jones) mentioned that Ian is here, but let me mention the names of the children. Amanda and Stuart Stephens are here, and they are the parents of Olly; Andy and Judy Thomas are here, and they are the parents of Frankie; and Lorin LaFave, the mother of Breck, is here, as is Ruth Moss, the mother of Sophie. All have lost children in connection with online harms, and I extend to each our most sincere condolences, as I am sure does every Member of the House. We have thought of them time and time again during the passage of this legislation; we have thought about their pain. All of us hope that this Bill will make very real changes, and we keep in our hearts the memories of those children and other young people who have suffered.
In our debates and Committee hearings, we have done our best to harry the social media companies and some of their secretive bosses. They have often been hiding away on the west coast of the US, to emerge blinking into the gloomy Committee light when they have to answer some questions about their nefarious activities and their obvious lack of concern for the way in which children and others are impacted.
We have debated issues of concern and sometimes disagreement in a way that shows the occasional benefits of cross-House co-operation. I have been pleased to work with friends and colleagues in other parties at every stage of the Bill, not least on Zach’s law, which we have mentioned. The result is a basis of good, much-needed legislation, and we must now get it on to the statute book.
It is unfortunate that the Bill has been so long delayed, which has caused great stress to some people who have been deeply affected by the issues raised—so much so that they have sometimes doubted our good faith. These delays are not immaterial. Children and young teenagers have grown older in an online world full of self-harm content—soon to be illegal harms, we hope. It is a world full of easy-to-access pornography with no meaningful age verification and algorithms that push harmful content to vulnerable people.
I have been pleased to note that calls from Members on the SNP Benches and from across the House to ensure that specific protection is granted to women and girls online have been heeded. New communications offences covering cyber-flashing, intimate image abuse and similar behaviour are to be incorporated. The requirements for Ofcom to consult the Victims’ Commissioner and the Domestic Abuse Commissioner are very welcome. Reporting tools should also be more responsive.
New clause 28 is an important new clause that SNP Members have been proud to sponsor. It calls for an advocacy body to represent the interests of children. That is vital, because the online world that children experience is ever evolving. It is not the online world that we in this Chamber tend to experience, nor is it the one experienced by most members of the media covering the debate today. We need, and young people deserve, a dedicated and appropriately funded body to look out for them online—a strong, informed voice able to stand up to the representations of big tech in the name of young people. This will, we hope, ensure that regulators get it right when acting on behalf of children online.
I am aware that there is broad support for such a body, including from those on the Labour Benches.

We on the SNP Benches oppose the removal of the aspect of the Bill related to legal but harmful material. I understand the free speech arguments, and I have heard Ministers argue that the Government have proposed alternative approaches, which, they say, will give users control over the content that they see online. But adults are often vulnerable, too. Removing measures from the Bill that can protect adults, especially those in a mental health spiral or with additional learning needs, is a dereliction of our duty. An on/off toggle for harmful content is a poor substitute for what was originally proposed.
The legal but harmful discussion was and is a thorny one. It was important to get the language of the Bill right, so that people could be protected from harm online without impinging on freedom of expression, which we all hold dear. However, by sending aspects of the Bill back to Committee, with the intention of removing the legal but harmful provisions, I fear that the Government are simply running from a difficult debate, or worse, succumbing to those who have never really supported this Bill—some who rather approve of the wild west, free-for-all internet. It is much better to rise to the challenge of resolving the conflicts, such as they are, between free speech and legal but harmful. I accept that the Government’s proposals around greater clarity and enforcement of terms and conditions and of transparency in reporting to Ofcom offer some mitigation, but not, in my view, enough.
The hon. Gentleman will remember that, when we served on the Joint Committee that scrutinised the draft Bill, we were concerned that the term “legal but harmful” was problematic and that there was a lack of clarity. We thought it would be better to have more clarity and enforcement based on priority illegal offences and on the terms of service. Does he still believe that, or has he changed his mind?
It is a fine debate. Like so much in legislation, there is not an absolute right and an absolute wrong. We heard contradictory evidence. It is important to measure the advantages and the disadvantages. I will listen to the rest of the debate very carefully, as I have done throughout.
As a journalist in a previous life, I have long been a proponent of transparency and open democracy—something that occasionally gets me into trouble. We on the SNP Benches have argued from the outset that the powers proposed for the Secretary of State are far too expansive and wide-reaching. That is no disrespect to the Minister or the new Secretary of State, but they will know that there have been quite a few Culture Secretaries in recent years, some more temperate than others.
In wishing to see a diminution of the proposed powers, we find ourselves in good company, not least with Ofcom. I note that there have been some positive shifts in the proposals around the powers of the Secretary of State, allowing greater parliamentary oversight. I hope that these indicate a welcome acknowledgement that our arguments have fallen on fertile Government soil—although, of course, it could be that the Conservative Secretary of State realises that she may soon be the shadow Secretary of State and that it will be a Labour Secretary of State exercising the proposed powers. I hope she will forgive me for that moment’s cynicism.
I thank the hon. Gentleman for his intervention. He is absolutely right: inciting a child to harm their body, whatever that harm is, should be criminalised, and I support the sentiment of new clause 16, which seeks to do that. Sadly, lots of children, particularly girls, go online and type in “I don’t like my body”. Maybe they are drawn to eating disorder sites, as my right hon. Friend the Member for Chelmsford (Vicky Ford) has mentioned, but often they are drawn into sites that glorify transition, often with adult men that they do not even know in other countries posting pictures of double mastectomies on teenage girls.
The hon. Lady must realise that this is fantasy land. It is incredibly difficult to get gender reassignment surgery. The “they’re just confused” stuff is exactly what was said to me as a young gay man. She must realise that this really simplifies a complicated issue and patronises people going through difficult choices.
I really wish it was fantasy land, but I am in contact with parents each and every day who tell me stories of their children being drawn into this. Yes, in this country it is thankfully very difficult to get a double mastectomy when you are under 18, but it is incredibly easy to buy testosterone illegally online and to inject it, egged on by adults in other countries. Once a girl has injected testosterone during puberty, she will have a deep voice, facial hair and male-pattern baldness for life, and she will be infertile. That is a permanent change, it is self-harm and it should be criminalised under this Bill, whether through this clause or through the Government’s new plans. The hon. Member for Kirkcaldy and Cowdenbeath (Neale Hanvey) is absolutely right: this is happening every day and it should be classed as self-harm.
Going back to my comments about the effect on children of viewing pornography, I absolutely support the idea of putting children’s experience at the heart of the Bill but it needs to be about children’s welfare and not about what children want. One impact of the internet has been to blur the boundary between adults and children. As adults, we need to be able to say, “This is the evidence of what is harmful to children, and this is what children should not be seeing.” Of course children will say that they want free access to all content, just like they want unlimited sweets and unlimited chocolate, but as adults we need to be able to say what is harmful for children and to protect them from seeing it.
This brings me to Government new clause 11, which deals with making sure that child sexual abuse material is taken offline. There is a clear link between the epidemic of pornography and the epidemic of child sexual abuse material. The way the algorithms on porn sites work is to draw users deeper and deeper into more and more extreme content—other Members have mentioned this in relation to other areas of the internet—so someone might go on to what they think is a mainstream pornography site and be drawn into more and more explicit, extreme and violent criminal pornography. At the end of this, normal people are drawn into watching children being abused, often in real time and often in other countries. There is a clear link between the epidemic of porn and the child sexual abuse material that is so prevalent online.
Last week in the Home Affairs Committee we heard from Professor Alexis Jay, who led the independent inquiry into child sexual abuse. Her report is harrowing, and it was written over seven years. Sadly, its conclusion is that seven years later, there are now even more opportunities for people to abuse children because of the internet, so making sure that providers have a duty to remove any child sexual abuse material that they find is crucial. Many Members have referred to the Internet Watch Foundation. One incredibly terrifying statistic is that in 2021, the IWF removed 252,194 web pages containing child sexual abuse material and an unknown number of images. New clause 11 is really important, because it would put the onus on the tech platforms to remove those images when they are found.
It is right to put the onus on the tech companies. All the way through the writing of this Bill, at all the consultation meetings we have been to, we have heard the tech companies say, “It’s too hard; it’s not possible because of privacy, data, security and cost.” I am sure that is what the mine owners said in the 19th century when they were told by the Government to stop sending children down the mines. It is not good enough. These are the richest, most powerful companies in the world. They are more powerful than an awful lot of countries, yet they have no democratic accountability. If they can employ real-time facial recognition at airports, they can find a way to remove child abuse images from the internet.
This leads me on to new clause 17, tabled by the right hon. Member for Barking (Dame Margaret Hodge), which would introduce individual director liability for non-compliance. I completely support that sentiment and I agree that this is likely to be the only way we will inject some urgency into the process of compliance. Why should directors who are profiting from the platforms not be responsible if children suffer harm as a result of using their products? That is certainly the case in many other industries. The right hon. Lady used the example of the building trade. Of course there will always be accidents, but if individual directors face the prospect of personal liability, they will act to address the systemic issues, the problems with the processes and the malevolent algorithms that deliberately draw users towards harm.
Online Safety Bill (Programme) (No. 4) Debate
(2 years ago)
Commons Chamber

My hon. Friend the Member for Aberdeen North (Kirsty Blackman) would have been speaking in this debate but she is indisposed, so I am delighted to offer some of her bons mots to the House. The effect of this motion is to revive the Third Reading debate that was previously programmed to take place immediately after the Report stage ended. It is fair to say that there has been a bit of chaos in the UK Government in recent times, with a disastrous yet thankfully short prime ministerial period when it looked as if the Online Safety Bill might be scrapped altogether. We on the SNP Benches are glad to see the Bill return to finish its Report stage. Although we are not entirely happy with the contents of the Bill—as Members can see by the number of amendments we had rejected in Committee and the number of amendments we tabled on Report today—we strongly believe that this version is better than the version the Government are proposing to create by recommitting the Bill later today. If this programme motion were to fall, the Government might not be able to recommit the Bill.
During both the pre-legislative and legislative stages of the Bill, as well as in the Digital, Culture, Media and Sport Committee, we have heard from survivors who have been permanently scarred as a result of so-called legal but harmful content. We have heard from families whose loved ones have died as a result of accessing this content, as Members around the House well know. It is surely imperative that action is taken; otherwise, we will see more young people at risk. Having protections in place for children is a good step forward, but it is not sufficient. Therefore we will be voting against this programme motion, which creates the conditions for recommitting a Bill that—as I well know, having sat through it—has already had 50 hours of Committee scrutiny and countless hours in the pre-legislative Joint Committee.