Online Safety Bill (Tenth sitting) Debate

John Nicolson

I do, of course, agree. As anyone who has had a family member die by suicide knows, the effect on the family is lifelong. It is yet another amendment where I feel we should depart from the pantomime of so much parliamentary procedure, where both sides fundamentally agree on things but Ministers go through the tortuous process of trying to tell us that every single amendment that any outside body or any Opposition Member, whether from the SNP or the Labour party, comes up with has been considered by the ministerial team and is already incorporated or covered by the Bill. They would not be human if that were the case. Would it not be refreshing if there were a slight change in tactic, and just occasionally the Minister said, “Do you know what? That is a very good point. I think I will incorporate it into the Bill”?

None of us on the Opposition Benches seeks to make political capital out of any of the things we propose. All of us, on both sides of the House, are here with the best of intentions, to try to ensure that we get the best possible Bill. We all want to be able to vote for the Bill at the end of the day. Indeed, as I said, I have worked with two friends on the Conservative Benches—with the hon. Member for Watford on the Joint Committee on the draft Bill and with the hon. Member for Wolverhampton North East on the Select Committee on Digital, Culture, Media and Sport—and, as we know, they have both voted for various proposals. It is perhaps part of the frustration of the party system here that people are forced to go through the hoops and pretend that they do not really agree with things that they actually do agree with.

Let us try to move on with this, in a way that we have not done hitherto, and see if we can agree on amendments. We will withdraw amendments if we are genuinely convinced that they have already been considered by the Government. On the Government side, let them try to accept some of our amendments—just begin to accept some—if, as with this one, they think they have some merit.

I was talking about Samaritans, and exactly what it wants to do with the Bill. It is concerned about harmful content after the Bill is passed. This feeds into potentially the most important aspect of the Bill: it does not mandate risk assessments based exclusively on risk. By adding in the qualifications of size and scope, the Bill wilfully lets some of the most harmful content slip through its fingers—wilfully, but I am sure not deliberately. Categorisation will be covered by a later amendment, tabled by my hon. Friend the Member for Aberdeen North, so I shall not dwell on it now.

In July 2021, the Law Commission for England and Wales recommended the creation of a new narrow offence of the “encouragement or assistance” of serious self-harm with “malicious intent”. The commission identified that there is

“currently no offence that adequately addresses the encouragement of serious self-harm.”

The recommendation followed acknowledgement that

“self-harm content online is a worrying phenomenon”

and should have a

“robust fault element that targets deliberate encouragement of serious self-harm”.

Currently, there are no provisions in the Bill to create a new offence of assisting or encouraging self-harm.

In conclusion, I urge the Minister to listen not just to us but to the expert charities, including Samaritans, and to people with lived experience of self-harm and suicide, who are calling for regulation of these dangerous sites.

Alex Davies-Jones (Pontypridd) (Lab)

Good afternoon, Sir Roger; it is a pleasure, as ever, to serve under your chairship. I rise to speak to new clause 36, which has been grouped with amendment 142 and is tabled in the names of the hon. Members for Ochil and South Perthshire and for Aberdeen North.

I, too, pay tribute to Samaritans for all the work it has done in supporting the Bill and these amendments to it. As colleagues will be aware, new clause 36 follows a recommendation from the Law Commission dating back to July 2021. The commission recommended the creation of a new, narrow offence of the “encouragement or assistance” of serious self-harm with “malicious intent”. It identified that there is

“currently no offence that adequately addresses the encouragement of serious self-harm.”

The recommendation followed acknowledgement that

“self-harm content online is a worrying phenomenon”

and should have a

“robust fault element that targets deliberate encouragement of serious self-harm”.

Currently, there are no provisions in the Bill to create a new offence of assisting or encouraging self-harm, despite the fact that other recommendations from the Law Commission report have been brought into the Bill, such as creating a new offence of cyber-flashing and prioritising tackling illegal suicide content.

We all know that harmful suicide and self-harm content is material that has the potential to cause or exacerbate self-harm and suicidal behaviours. Content relating to suicide and self-harm falls into both categories in the Bill—illegal content and legal but harmful content. Encouraging or assisting suicide is also currently a criminal offence in England and Wales under the Suicide Act 1961, as amended by the Coroners and Justice Act 2009.

Content encouraging or assisting someone to take their own life is illegal and has been included as priority illegal content in the Bill, meaning that platforms will be required to proactively and reactively prevent individuals from encountering it, and search engines will need to structure their services to minimise the risk of individuals encountering the content. Other content, including content that positions suicide as a suitable way of overcoming adversity or describes suicidal methods, is legal but harmful.

The Labour party’s Front-Bench team recognises that not all content falls neatly into the legal but harmful category. What can be helpful for one user can be extremely distressing to others. Someone may find it extremely helpful to share their personal experience of suicide, for example, and that may also be helpful to other users. However, the same material could heighten suicidal feelings and levels of distress in someone else. We recognise the complexity here and the difficulty of delineating between harmful and helpful content relating to suicide and self-harm, but that difficulty should not detract from tackling legal but clearly harmful content.

In its current form, the Bill will continue to allow legal but clearly harmful suicide and self-harm content to be accessed by over-18s. Category 1 platforms, which have the highest reach and functionality, will be required to carry out risk assessments of, and set out in their terms and conditions their approach to, legal but harmful content in relation to over-18s. As the hon. Member for Ochil and South Perthshire outlined, however, the Bill’s impact assessment states that “less than 0.001%” of in-scope platforms

“are estimated to meet the Category 1 and 2A thresholds”,

and estimates that only 20 platforms will be required to fulfil category 1 obligations. There is no requirement on the smaller platforms, including those that actively encourage suicide, to do anything at all to protect over-18s. That simply is not good enough. That is why the Labour party supports new clause 36, and we urge the Minister to do the right thing by joining us.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

It is, as always, a great pleasure to serve under your chairmanship, Sir Roger. The hon. Member for Ochil and South Perthshire made an observation in passing about the Government’s willingness to listen and respond to parliamentarians about the Bill. We listened carefully to the extensive prelegislative scrutiny that the Bill received, including from the Joint Committee on which he served. As a result, we have adopted 66 of the changes that that Committee recommended, including on significant things such as commercial pornography and fraudulent advertising.

If Members have been listening to me carefully, they will know that the Government are doing further work or are carefully listening in a few areas. We may have more to say on those topics as the Bill progresses; it is always important to get the drafting of the provisions exactly right. I hope that that has indicated to the hon. Gentleman our willingness to listen, which I think we have already demonstrated well.

On new clause 36, it is important to mention that there is already a criminal offence of inciting suicide. It is a schedule 7 priority offence, so the Bill already requires companies to tackle content that amounts to the existing offence of inciting suicide. That is important. We would expect the promotion of material that encourages children to self-harm to be listed as a primary priority harm relating to children, where, again, there is a proactive duty to protect them. We have not yet published that primary priority harm list, but it would be reasonable to expect that material encouraging children to self-harm would be on it. Again, although we have not yet published the list of content that will be on the adult priority harm list—obviously, I cannot pre-empt the publication of that list—one might certainly wish for content that encourages adults to self-harm to appear on it too.

The hon. Gentleman made the point that duties relating to adults would apply only to category 1 companies. Of course, the ones that apply to children would apply to all companies where there was significant risk, but he is right that were that priority harm added to the adult legal but harmful list, it would apply only to category 1 companies.

--- Later in debate ---
Chris Philp

These amendments pick up a question asked by the hon. Member for Aberdeen North much earlier in our proceedings. In schedule 7 we set out the priority offences that exist in English and Welsh law. We have consulted the devolved Administrations in Scotland and Northern Ireland extensively, and I believe we have agreed with them a number of offences in Scottish and Northern Irish law that are broadly equivalent to the English and Welsh offences already in schedule 7. Basically, Government amendments 116 to 126 add those devolved offences to the schedule.

In future, if new Scottish or Northern Irish offences are created, the Secretary of State will be able to consult Scottish or Northern Irish Ministers and, by regulations, amend schedule 7 to add the new offences that may be appropriate if conceived by the devolved Parliament or Assembly in due course. That, I think, answers the question asked by the hon. Lady earlier in our proceedings. As I say, we consulted the devolved Administrations extensively and I hope that the Committee will assent readily to the amendments.

Alex Davies-Jones

The amendments aim to ensure that criminal offences in other parts of the UK are covered by the provisions of the Bill, as the Minister outlined. An offence in one part of the UK will be considered an offence elsewhere for the purposes of the Bill.

With reference to some of the later paragraphs, I am keen for the Minister to explain briefly how this will work in the case of Scotland. We believe that the revenge porn offence in Scotland is more broadly drawn than the English version, so the level of protection for women in England and Wales will be increased. Can the Minister confirm that?

The Bill will not apply the Scottish offence to English offenders, but it means that content that falls foul of the law in Scotland, but not in England or Wales, will still be relevant regulated content for service providers, irrespective of the part of the UK in which the service users are located. That makes sense from the perspective of service providers, but I would be grateful for clarity from the Minister on this point.

--- Later in debate ---
John Nicolson

It is an interesting question. Alas, I long ago stopped trying to put myself into the minds of Conservative Ministers—a scary place for any of us to be.

We understand that it is difficult to try to regulate in respect of human trafficking on platforms. It requires work across borders and platforms, with moderators speaking different languages. On the Joint Committee on the draft Bill, we established that Facebook does not moderate content even in English to any adequate degree, let alone employ moderators across other languages. Just look at the other languages around the world—do we think Facebook has moderators who work in Turkish, Finnish, Swedish, Icelandic or a plethora of other languages? It certainly does not. The only language that Facebook tries to moderate—deeply inadequately, as we know—is English. We know how bad the moderation is in English, so can the Committee imagine what it is like in some of the world’s other languages? The most terrifying things are allowed to happen without moderation.

Regulating in respect of human trafficking on platforms is not cheap or easy, but it is utterly essential. The social media companies make enormous amounts of money, so let us shed no tears for them and the costs that will be entailed. If human trafficking is not designated a priority harm, I fear it will fall by the wayside, so I must ask the Minister: is human trafficking covered by another provision on priority illegal content? Like my hon. Friend the Member for Aberdeen North, I cannot see where in the Bill that lies. If the answer is yes, why are the human rights groups not satisfied with the explanation? What reassurance can the Minister give to the experts in the field? Why not add a direct reference to the Modern Slavery Act, as in the amendment?

If the answer to my question is no, I imagine the Minister will inform us that the Bill requires platforms to consider all illegal content. In what world is human trafficking that is facilitated online not a priority? Platforms must be forced to be proactive on this issue; if not, I fear that human trafficking, like so much that is non-priority illegal content, will not receive the attention it deserves.

Alex Davies-Jones

Schedule 7 sets out the list of criminal content that in-scope firms will be required to remove as a priority. Labour was pleased to see new additions to the most recent iteration, including criminal content relating to online drug and weapons dealing, people smuggling, revenge porn, fraud, promoting suicide and inciting or controlling prostitution for gain. The Government’s consultation response suggests that the systems and processes that services may use to minimise illegal or harmful content could include user tools, content moderation and recommendation procedures.

More widely, although we appreciate that the establishment of priority offences online is the route the Government have chosen to go down with the Bill, we believe the Bill remains weak in addressing harms to adults and wider societal harms, and has seemingly missed a number of known harms to both adults and children—a serious omission. Three years on from the White Paper, the Government know where the gaps are, yet they have failed to address them. That is why we are pleased to support the amendment tabled by the hon. Members for Ochil and South Perthshire and for Aberdeen North.

Human trafficking offences are a serious omission from schedule 7 that must urgently be rectified. As we all know from whistleblower Frances Haugen’s revelations, Facebook stands accused, among a vast array of social harms, of profiting from the trade and sale of human beings—often for domestic servitude—by human traffickers. We also know that, according to internal documents, the company has been aware of the problem since at least 2018. As the hon. Member for Ochil and South Perthshire said, we know that a year later, on the heels of a BBC report that documented the practice, the problem was said to be so severe that Apple threatened to pull Facebook and Instagram from its app store. It was only then that Facebook rushed to remove content related to human trafficking and made emergency internal policy changes to avoid commercial consequences described by the company as “potentially severe”. However, an internal company report detailed that Facebook did not take action prior to public disclosure and threats from Apple—profit over people.

In a complaint to the US Securities and Exchange Commission first reported by The Wall Street Journal, whistleblower Haugen wrote:

“Investors would have been very interested to learn the truth about Facebook almost losing access to the Apple App Store because of its failure to stop human trafficking on its products.”

I cannot believe that the Government have failed to commit to doing more to tackle such abhorrent practices, which are happening every day. I therefore urge the Minister to do the right thing and support amendment 90.

Chris Philp

The first thing to make clear to the Committee and anyone listening is that, of course, offences under the Modern Slavery Act 2015 are brought into the scope of the illegal content duties of this Bill through clause 52(4)(d), because such offences involve an individual victim.

Turning to the priority offences set out in schedule 7—I saw this when I was a Home Office Minister—modern slavery is generally associated with various other offences that are more directly visible and identifiable. Modern slavery itself can be quite hard to identify. That is why our approach is, first, to incorporate modern slavery as a regular offence via clause 52(4)(d) and, secondly, to specify as priority offences those things that are often identifiable symptoms of it and that are feasibly identified. Those include many of the offences listed in schedule 7, such as causing, inciting or controlling prostitution for gain, covered in paragraph 16 on sexual exploitation, which is often the manifestation of modern slavery; money laundering, which is often involved where modern slavery takes place; and assisting illegal immigration, covered in paragraph 15 as per section 25 of the Immigration Act 1971, because modern slavery often involves moving somebody across a border.

Modern slavery comes into scope directly via clause 52(4)(d), and because the practicably identifiable consequences of modern slavery are listed as priority offences, I think we do have this important area covered.

--- Later in debate ---
The Chair

I have had no indication that anybody wishes to move Carla Lockhart’s amendment 98—she is not a member of the Committee.

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones

It is absolutely right that the Government have included a commitment to children in the form of defining primary priority content that is harmful to them. We all know of the dangerous harms that exist online for children, and while the Opposition support the overarching aims of the Bill, we feel the current definitions do not go far enough—that is a running theme with this Bill.

The Bill does not adequately address the risks caused by the design of services themselves—their functionalities and features—or those created by malign contact with other users, which we know to be an immense problem. Research has found that online grooming offences have soared by 60% in the last three years, and four in five victims are girls. We also know that games increasingly have addictive gambling-style features. Games without user-to-user functionality, such as Subway Surfers, which aggressively promotes in-app purchases, are currently out of scope of the Bill.

Lastly, research by Parent Zone found that 91% of children say that loot boxes are available in the games they play and 40% have paid to open one. That is not good enough. I urge the Minister to consider his approach to tackling harmful content in all its forms and the impact that it can have. When considering how children will be kept safe under the new regime, we should heed the concerns flagged by some of the civil society organisations that work with them. Organisations such as the Royal College of Psychiatrists, The Mix, YoungMinds and the Mental Health Foundation have all been instrumental in their calls for the Government to do more. While welcoming the intention to protect children, they note that it is not clear at present how some categories of harm, including material that damages people’s body image, will be regulated—or whether it will be regulated at all.

While the Bill does take steps to tackle some of the most egregious, universally damaging material that children currently see, it does not recognise the harm that can be done through the algorithmic serving of material that, through accretion, will cause harm to children with particular mental health vulnerabilities. For example, beauty or fitness-related content could be psychologically dangerous to a child recovering from an eating disorder. Research from the Mental Health Foundation shows how damaging regular exposure to conventionally perfect images of bodies, often digitally edited and unattainable, is to children and young people.

This is something that matters to children: 84% of those questioned in a recent survey by the charity The Mix said that the algorithmic serving of content was a key issue the Bill should address, yet in its current form the Bill does not give children full control over the content they see. Charities also tell us about the need to ensure that children are exposed to useful content. We suggest that the Government consider a requirement for providers to push material on social media literacy to users, and to provide the option to receive content that can help with recovery where it is available, curated by social media companies with the assistance of trusted non-governmental organisations and public health bodies. We also hope that the Government can clarify that material damaging to people’s body image will be considered a form of harm.

Additionally, beyond the issue of the content itself that is served to children, organisations including YoungMinds and the Royal College of Psychiatrists have raised the potential dangers to mental health inherent in the way services can be designed to be addictive.

Kim Leadbeater

My hon. Friend raises an important point about media literacy, which we have touched on a few times during this debate. We have another opportunity here to talk about it and to say how important it is to think about media literacy within the scope of the Bill. It has been removed, and I think we need to put it back into the Bill at every opportunity—I am talking about media literacy obligations for platforms to help responsibly educate children and adults about the risks online. We must not lose sight of that.

Alex Davies-Jones

I completely agree with my hon. Friend. She is right to talk about the lack of a social and digital media strategy within the Bill, and the need to educate children and adults about the harmful content that we see online. Teaching people how to stay safe online, in all its facets, is absolutely fundamental to the Bill. We cannot have an Online Safety Bill without teaching people how to be safe online. That matters for how children and young people interact online. We know that they chase likes and the self-esteem buzz they get from notifications popping up on their phone or device. That can be addictive, as mental health and young persons’ charities have highlighted.

I urge the Minister to address those issues and to consider how the Government can go further, whether through this legislation or further initiatives, to help to combat some of those issues.

--- Later in debate ---
With a third of internet users unaware of the potential for inaccurate or biased information online, it is vital that this amendment on health-related misinformation and disinformation is inserted into the Bill during Committee stage. It would give Parliament the time to scrutinise what content is in scope and ensure that regulation is in place to promote proportionate and effective responses. We must make it incumbent on platforms to be proactive in reducing that pernicious form of disinformation, designed only to hurt and to harm. As we have seen from the pandemic, the consequences can be grave if the false information is believed, as, sadly, it so often is.
Alex Davies-Jones

Again, Labour supports moves to ensure that there is some clarity about specific content that is deemed to be harmful to adults, but of course the Opposition have concerns about the overall approach to defining harm.

The Government’s chosen approach to regulating the online space has left too much up to secondary legislation. We are also concerned that health misinformation and disinformation—a key harm, as we have all learned from the coronavirus pandemic—is missing from the Bill. That is why we too support amendment 83. The impact of health misinformation and disinformation is very real. Estimates suggest that the number of social media accounts posting misinformation about vaccines, and the number of users following those accounts, increased during the pandemic. Research by the Centre for Countering Digital Hate, published in November 2020, suggested that the number of followers of the largest anti-vaccination social media accounts had increased by 25% since 2019. At the height of the pandemic, it was also estimated that there were 5.4 million UK-based followers of anti-vaccine Twitter accounts.

Interestingly, an Ofcom survey of around 200 respondents carried out between 12 and 14 March 2021 found that 28% of respondents had come across information about covid-19 that could be considered false or misleading. Of those who had encountered such information, respondents from minority ethnic backgrounds were twice as likely as white respondents to say that the claims made them think twice about the issue. The survey found that of those people who were getting news and information about the coronavirus within the preceding week, 15% had come across claims that the coronavirus vaccines would alter human DNA; 18% had encountered claims that the coronavirus vaccines were a cover for the implant of trackable microchips; and 10% had encountered claims that the vaccines contained animal products.

Public health authorities, the UK Government, social media companies and other organisations all attempted to address the spread of vaccine misinformation through various strategies, including moderating vaccine misinformation on social media platforms, ensuring that the public had access to accurate and reliable information, and providing education and guidance on how to address misinformation when people came across it.

Although studies do not show strong links between susceptibility to misinformation and ethnicity in the UK, some practitioners and other groups have raised concerns about the spread and impact of covid-19 vaccine misinformation among certain minority ethnic groups. Those concerns stem from research that shows historically lower levels of vaccine confidence and uptake among those groups. Some recent evidence from the UK’s vaccine roll-out suggests that that trend has continued for the covid-19 vaccine.

Data from the OpenSAFELY platform, which includes data from 40% of GP practices in England, covering more than 24 million patients, found that up to 7 April 2021, 96% of white people aged over 60 had received a vaccination compared with only 77% of people from a Pakistani background, 76% from a Chinese background and 69% of black people within the same age group. A 2021 survey of more than 172,000 adults in England on attitudes to the vaccine also found that confidence in covid-19 vaccines was highest in those of white ethnicity, with some 92.6% saying that they had accepted or would accept the vaccine. The lowest confidence was found in those of black ethnicity, at 72.5%. Some of the initiatives to tackle vaccine misinformation and encourage vaccine take-up were aimed at specific minority ethnic groups, and experts have emphasised the importance of ensuring that factual information about covid-19 vaccines is available in multiple different languages.

Social media companies have taken various steps to tackle misinformation on their platforms during the covid-19 pandemic, including removing or demoting misinformation, directing users to information from official sources and banning certain adverts. So, they can do it when they want to—they just need to be compelled to do it by a Bill. However, we need to go further. Some of the broad approaches to content moderation that digital platforms have taken to address misinformation during the pandemic are discussed in the Parliamentary Office of Science and Technology’s previous rapid response on covid-19 and misinformation.

More recently, some social media companies have taken specific action to counter vaccine misinformation. In February 2021, as part of its wider policies on coronavirus misinformation, Facebook announced that it would expand its efforts to remove false information about covid-19 vaccines, and other vaccines more broadly. The company said it would label posts that discuss covid-19 vaccines with additional information from the World Health Organisation. It also said it would signpost its users to information on where and when they could get vaccinated. Facebook is now applying similar measures to Instagram.

In March 2021, Twitter began applying labels to tweets that could contain misinformation about covid-19 vaccines. It also introduced a strike policy, under which users who violate its covid-19 misinformation policy five or more times will have their accounts permanently suspended.

YouTube announced a specific ban on covid-19 anti-vaccination videos in October 2020. It committed to removing any videos that contradict official information about the vaccine from the World Health Organisation. In March, the company said it had removed more than 30,000 misleading videos about the covid-19 vaccine since the ban was introduced. However, as with most issues, until the legislation changes, service providers will not feel truly compelled to do the right thing, which is why we must legislate and push forward with amendment 83.

Nick Fletcher (Don Valley) (Con)

I would like to speak to the clause rather than the amendment, Sir Roger. Is now the right time to do so, or are we only allowed to speak to the amendment?

--- Later in debate ---
Chris Philp

Yes. There is an obligation on the Secretary of State to consult—[Interruption.] Did I hear someone laugh?—before proposing a statutory instrument to add things. There is a consultation first and then, if extra things are going to be added—in my hon. Friend’s language, if the scope is increased—that would be votable by Parliament because it is an affirmative SI. So the answer is yes to both questions. Yes there will be consultation in advance, and yes, if this Government or a future Government wanted to add anything, Parliament could vote on it if it wanted to because it will be an affirmative SI. That is a really important point.

Alex Davies-Jones

Will the Minister give way?

Chris Philp

In a moment; I want to answer the other point made by my hon. Friend the Member for Don Valley first. He said that two wrongs don’t make a right. I am not defending the fact that social media firms act in a manner that is arbitrary and censorious at the moment. I am not saying that it is okay for them to carry on. The point that I was making was a different one. I was saying that they act censoriously and arbitrarily at times at the moment. The Bill will diminish their ability to do that in a couple of ways. First, for the legal but harmful stuff, which he is worried about, they will have a duty to act consistently. If they do not, Ofcom will be able to enforce against them. So their liberty to behave arbitrarily, for this category of content at least, will be circumscribed. They will now have to be consistent. For other content that is outside the scope of this clause—which I guess therefore does not worry my hon. Friend—they can still be arbitrary, but for this they have got to be consistent.

There is also the duty to have regard to freedom of expression, and there are protections for content of democratic and journalistic importance in clauses 15 and 16. Although those clauses are not perfect and some people say they should be stronger, they are at least better than what we have now. When I say that this is good for freedom of speech, I mean that nothing here infringes on freedom of speech, and to the extent that it moves one way or the other, it moves us somewhat in the direction of protecting free speech more than is the case at the moment, for the reasons I have set out. I will be happy to debate the issue in more detail either in this Committee or outside, if that is helpful and to avoid trying the patience of colleagues.

The Chair

Order. Before we go any further, I know it is tempting to turn around and talk to Back Benchers, but that makes life difficult for Hansard because you tend to miss the microphone. It is also rather discourteous to the Chair, so in future I ask the Minister to please address the Chair. I call the shadow Minister.

Alex Davies-Jones

I thank the Minister for giving way; I think that is what he was doing as he sat down.

Chris Philp

indicated assent.

Alex Davies-Jones

Just for clarity, the hon. Member for Don Valley and the Minister have said that Labour Members are seeking to curtail or tighten freedom of expression and freedom of speech, but that is not the case. We fundamentally support free speech, as we always have. The Bill addresses systems and processes, and that is what it should do—the Minister, the Labour party and I are in full alignment on that. We do not think that the Bill should restrict freedom of speech. I would just like to put that on the record.

We also share the concerns expressed by the hon. Member for Don Valley about the Secretary of State’s potential powers, the limited scope and the extra scrutiny that Parliament might have to undertake on priority harms, so I hope he will support some of our later amendments.

Chris Philp

I am grateful to the shadow Minister for confirming her support for free speech. Perhaps I could take this opportunity to apologise to you, Sir Roger, and to Hansard for turning round. I will try to behave better in future.

--- Later in debate ---
The Chair

As I have indicated already, I do not propose that we have a clause stand part debate. It has been exhaustively debated, if I may say so.

Clause 54 ordered to stand part of the Bill.

Clause 55

Regulations under sections 53 and 54

Alex Davies-Jones

I beg to move amendment 62, in clause 55, page 52, line 4, after “OFCOM” insert

“and other stakeholders, including organisations that campaign for the removal of harmful content online”.

This amendment requires the Secretary of State to consult other stakeholders before making regulations under clause 53 or 54.

The Chair

With this it will be convenient to discuss the following:

Clause stand part.

Clause 56 stand part.

Alex Davies-Jones

We all know that managing harmful content, unlike illegal content, is more about implementing systems that prevent people from encountering it than about removing it entirely. At the moment, there are no duties on the Secretary of State to consult anyone other than Ofcom ahead of making regulations under clauses 53 and 54. We have discussed at length the importance of transparency, and surely the Minister can agree that the process should be widened, as we have heard from those on the Government Back Benches.

Labour has said time and again that it should not be for the Secretary of State of the day to determine what constitutes harmful content for children or adults. Without the important consultation process outlined in amendment 62, there are genuine concerns that that could lead to a damaging precedent whereby a Secretary of State, not Parliament, has the ability to determine what information is harmful. We all know that the world is watching as we seek to work together on this important Bill, and Labour has genuine concerns that without a responsible consultation process, as outlined in amendment 62, we could inadvertently be suggesting to the world that this fairly dogmatic approach is the best way forward.

Amendment 62 would require the Secretary of State to consult other stakeholders before making regulations under clauses 53 and 54. As has been mentioned, we risk a potentially dangerous course of events if there is no statutory duty on the Secretary of State to consult others when determining the definition of harmful content. Let me draw the Minister’s attention to the overarching concerns of stakeholders across the board. Many are concerned that harmful content for adults receives the least oversight, and that there are potential gaps that mean certain content—such as animal abuse content—could slip through the net completely. The amendment is designed to ensure that sufficient consultation takes place before the Secretary of State makes important decisions in directing Ofcom.

Kim Leadbeater

On that point, I agree wholeheartedly with my hon. Friend. It is important that the Secretary of State consults campaign organisations that have expertise in the relevant areas. Much as we might want the Secretary of State to be informed on every single policy issue, that is unrealistic. It is also important to acknowledge the process that we have been through with the Bill: the expertise of organisations has been vital in some of the decisions that we have had to make. My hon. Friend gave a very good example, and I am grateful to animal welfare groups for their expertise in highlighting the issue of online abuse of animals.

Alex Davies-Jones

I completely agree with my hon. Friend. As parliamentarians, we are seen as experts in an array of fields. I do not purport to be an expert in all things—the role is more jack of all trades—and it would be impossible for one Secretary of State to be an expert in everything from animal abuse to online scam ads, from fraud to CSAM and terrorism. That is why it is fundamental that the Secretary of State consults experts and stakeholders in those fields, for whom these things are their bread and butter—their day job, every day. I hope the Minister can see that regulation of the online space is a huge task for us all to take on. It is Labour’s view that any Secretary of State would benefit from the input of experts in specific fields. I urge him to support the amendment, especially given the wider concerns we have about transparency and power sharing in the Bill.

It is welcome that clause 56 will force Ofcom, as the regulator, to carry out important reviews assessing the extent to which content appearing broadly on user-to-user services is harmful to children and adults. As we have repeatedly said, transparency must be at the heart of our approach. While Labour does not formally oppose the clause, we have concerns about subsection (5), which states:

“The reports must be published not more than three years apart.”

The Minister knows that the Bill has been long awaited, and we need to see real, meaningful change and updates now. Will he tell us why it contains a three-year provision?

Kirsty Blackman

I thank the Minister for his clarification earlier and his explanation of how the categories of primary priority content and priority content can be updated. That was helpful.

Amendment 62 is excellent, and I am more than happy to support it.

--- Later in debate ---
Chris Philp

I have heard my right hon. Friend’s points about a standing Joint Committee for post-legislative implementation scrutiny. On the comments about the time, I agree that the Ofcom review needs to be far enough into the future that it can be meaningful, hence the three-year time period.

On the substance of amendment 62, tabled by the shadow Minister, I can confirm that the Government are already undertaking research and working with stakeholders on identifying what the priority harms will be. That consideration includes evidence from various civil society organisations, victims’ organisations and many others who represent the interests of users online. The wider consultation beyond Ofcom that the amendment would require is happening already as a matter of practicality.

We are concerned, however, that making this a formal consultation in the legal sense, as the amendment would, would introduce delay, because a whole sequence of things has to happen after Royal Assent. First, we have to designate the priority harms by statutory instrument, and then Ofcom has to publish its risk assessments and codes of practice. If we insert a formal legal consultation step into that, it would add at least four or even six months to the process of implementing the Act. I know that that was not the hon. Lady’s intention and that she is concerned about getting the Act implemented quickly. For that reason, the Government do not want to insert a formal legal consultation step into the process, but I am happy to confirm that we are engaging in the consultation already on an informal basis and will continue to do so. I ask respectfully that amendment 62 be withdrawn.

The purpose of clauses 55 and 56 has been touched on already, and I have nothing in particular to add.

Alex Davies-Jones

I am grateful for the Minister’s comments on the time that these things would take. I cannot see why they could not happen concurrently with the current consultation, or why they would take an additional four to six months. Could he clarify that?

Chris Philp

A formal statutory consultation could happen only after the passage of the Bill, whereas the informal non-statutory consultation we can do, and are doing, now.

Question put, That the amendment be made.

--- Later in debate ---
Question proposed, That the clause stand part of the Bill.
Alex Davies-Jones

I have some brief comments on the clause. The Labour party very much welcomes the addition to user verification duties in the revised Bill. A range of groups, including Clean Up the Internet, have long campaigned for a verification requirement process, so this is a positive step forward.

We do, however, have some concerns about the exact principles and minimum standards for the user verification duty, which I will address when we consider new clause 8. We also have concerns about subsection (2), which states:

“The verification process may be of any kind (and in particular, it need not require documentation to be provided).”

I would be grateful if the Minister could clarify exactly what that process will look like in practice.

Lastly, as Clean Up the Internet has said, we need further clarification on whether users will be given a choice of how they verify and of the verification provider itself. We can all recognise that there are potential downsides to the companies that own the largest platforms—such as Meta, Google, Twitter and ByteDance—developing their own in-house verification processes and making them the only option for users wishing to verify on their platform. Indeed, some users may have reservations about sharing even more personal data with those companies. Users of multiple social media platforms could also find it inconvenient and confusing to go through several different verification processes on different platforms to achieve the same outcome of confirming their real name.

There is a risk of the largest platforms seeking to leverage their dominance of social media to capture the market for ID verification services, raising competition concerns. I would be grateful if the Minister could confirm his assessment of the potential issues around clause 57 as it stands.

Dame Maria Miller

I rise to welcome clause 57. It is an important part of the Bill and shows the Government acknowledging that anonymity can have a significant impact on the harms that affect victims. There is a catalogue of evidence of the harm done by those posting anonymously. Anonymity appears to encourage abusive behaviour, and there is evidence dating back to 2015 showing that anonymous accounts are more likely to share sexist comments and that online harassment victims are often not able to identify their perpetrators because of the way anonymity works online. The Government are doing an important thing here and I applaud them.

I underline that again by saying that recent research from Compassion in Politics showed that more than one in four people were put off posting on social media because of the fear of abuse, particularly from anonymous posters. Far from the status quo promoting freedom of speech, it actually deters freedom of speech, as we have said in other debates, and it particularly affects women. The Government are to be applauded for this measure.

In the work I was doing with the FA and the Premier League around this very issue, I particularly supported their call for a twin-track approach under which verified accounts would be the default and people would automatically be able to opt out of receiving posts from unverified accounts. The Bill does not go as far as that, and I can understand the Government’s reasons, but I gently point out that 81% of the people who took part in the Compassion in Politics research would willingly provide identification to get a verified account if it reduced unverified posts. They felt that was important. Some 72% supported the idea if it reduced the amount of anonymous posting.

I am touching on clause 58, but I will not repeat myself when we debate that clause. I hope that it will be possible in the code of practice for Ofcom to point out the clear benefits of having verified accounts by default and perhaps urge responsible providers to do the responsible thing and allow their users to automatically filter out unverified accounts. That is what users want, and it is extraordinary that large consumer organisations do not seem to want to give consumers what they want. Perhaps Ofcom can help those organisations understand what their consumers want, certainly in Britain.

--- Later in debate ---
Alex Davies-Jones

As we have said previously, it is absolutely right that Ofcom produces guidance for providers of category 1 services to assist with their compliance with the duty. We very much welcome the inclusion in subsection (2) of an awareness of forms of identity verification for vulnerable adult users; once again, however, we feel that it should go further, as outlined in new clause 8.

Chris Philp

Clause 58, which was touched on in our last debate, simply sets out Ofcom’s duty to publish guidance for category 1 services to assist them in complying with the user identification duty set out in clause 57. We have probably covered the main points, so I will say nothing further.

Question put and agreed to.

Clause 58 accordingly ordered to stand part of the Bill.

Clause 59

Requirement to report CSEA content to the NCA

Question proposed, That the clause stand part of the Bill.

--- Later in debate ---
The Chair

With this it will be convenient to consider the following:

Clause 67 stand part.

That schedule 9 be the Ninth schedule to the Bill.

Alex Davies-Jones

Labour welcomes the important changes that have been made to the Bill since its original draft, which applied only to user-generated pornographic content. The Bill now includes all pornography, and that is a positive step forward. It is also welcome that the provisions do not apply only to commercial pornography. We all know that some of the biggest commercial pornography sites could have switched their business models had these important changes not been made. As we have reiterated, our priority in regulating pornographic content is to keep children safe. The question that we should continue to ask each other is simple: “Is this content likely to harm children?”

We have a few concerns—which were also outlined in evidence by Professor Clare McGlynn—about the definition of “provider pornographic content” in clause 66(3). It is defined as

“pornographic content that is published or displayed on the service by the provider of the service or by a person acting on behalf of the provider (including pornographic content published or displayed…by means of software or an automated tool or algorithm)”.

That definition separates provider porn from content that is uploaded or shared by users, which is outlined in clause 49(2). That separation is emphasised in clause 66(6), which states:

“Pornographic content that is user-generated content in relation to an internet service is not to be regarded as provider pornographic content in relation to that service.”

However, as Professor McGlynn emphasised, it is unclear exactly what will be covered by the words

“acting on behalf of the provider”.

I would appreciate some clarity from the Minister on that point. Could he give some clear examples?

--- Later in debate ---
Chris Philp

I thank my hon. Friend for his intervention and for his work on the Joint Committee, which has had a huge impact, as we have seen. I hope that colleagues will join me in thanking the members of the Joint Committee for their work.

My final point on this important clause is in response to a question that the shadow Minister raised about clause 66(3), which makes reference to

“a person acting on behalf of the provider”.

That is just to ensure that the clause is comprehensively drafted without any loopholes. If the provider used an agent or engaged some third party to disseminate content on their behalf, rather than doing so directly, that would be covered too. We just wanted to ensure that there was absolutely no loophole—no chink of light—in the way that the clause was drafted. That is why that reference is there.

I am delighted that these clauses seem to command such widespread support. It therefore gives me great pleasure to commend them to the Committee.

Question put and agreed to.

Clause 66 accordingly ordered to stand part of the Bill.

Clause 67 ordered to stand part of the Bill.

Schedule 9 agreed to.

Clause 68

Duties about regulated provider pornographic content

Alex Davies-Jones

I beg to move amendment 114, in clause 68, page 60, line 13, at end insert—

“(2A) A duty to verify that every individual featured in regulated provider pornographic content is an adult before the content is published on the service.

(2B) A duty to verify that every individual featured in regulated provider pornographic content that is already published on the service when this Act is passed is an adult and, where that is not the case, remove such content from the service.

(2C) A duty to verify that each individual appearing in regulated provider pornographic content has given their permission for the content in which they appear to be published or made available by the internet service.

(2D) A duty to remove regulated provider pornographic content featuring an individual if that individual withdraws their consent, at any time, to the pornographic content in which they feature remaining on the service.”

This amendment creates a duty to verify that each individual featured in pornographic content is an adult and has agreed to the content being uploaded before it is published. It would also impose a duty to remove content if the individual withdraws consent at any time.

The Chair

With this it will be convenient to discuss the following:

Amendment 115, in clause 68, page 60, line 17, after “(2)” insert “to (2D)”.

Clause stand part.

New clause 2—Duties regarding user-generated pornographic content: regulated services

“(1) This section sets out the duties which apply to regulated services in relation to user-generated pornographic content.

(2) A duty to verify that each individual featuring in the pornographic content has given their permission for the content in which they feature to be published or made available by the service.

(3) A duty to remove pornographic content featuring a particular individual if that individual withdraws their consent, at any time, to the pornographic content in which they feature remaining on the service.

(4) For the meaning of ‘pornographic content’, see section 66(2).

(5) In this section, ‘user-generated pornographic content’ means any content falling within the meaning given by subsection (4) and which is also generated directly on the service by a user of the service, or uploaded to or shared on the service by a user of the service, which may be encountered by another user, or other users, of the service.

(6) For the meaning of ‘regulated service’, see section 2(4).”

--- Later in debate ---
Alex Davies-Jones

Clause 68 outlines the duties covering regulated provider pornographic content, and Ofcom’s guidance on those duties. Put simply, the amendments are about age verification and consent, to protect women and children who are victims of commercial sexual exploitation.

I am moving a series of targeted amendments, tabled by my right hon. Friend the Member for Kingston upon Hull North (Dame Diana Johnson), which I hope all hon. Members will be able to support, because this is an issue that goes beyond party lines. This is about children who have been sexually abused, women who have been raped, and trafficking victims who have been exploited, who have all suffered the horror of filmed footage of their abuse being published on some of the world’s biggest pornography websites. This is about basic humanity.

Currently, leading pornography websites allow members of the public to upload pornographic videos without verifying that everyone in the film is an adult, that they gave their permission for it to be uploaded to a pornography website, or even that they know the film exists. Sadly, it is not surprising that, in the absence of even the most basic safety measures, hugely popular and profitable pornography websites have been found hosting and profiting from filmed footage of rape, sex trafficking, image-based sexual abuse and child sexual abuse. This atrocious practice is ongoing and well documented.

In 2019, PayPal stopped processing payments for Pornhub—one of the most popular pornography websites in the world—after an investigation by The Sunday Times revealed that the site contained child abuse videos and other illegal content. That included an account on the site dedicated to posting so-called creepshots of UK schoolgirls. In 2020, The New York Times documented the presence of child abuse videos on Pornhub, prompting Mastercard, Visa and Discover to block the use of their cards for purchases on the site.

New York Times reporter Nicholas Kristof wrote of Pornhub:

“Its site is infested with rape videos. It monetizes child rapes, revenge pornography, spy cam videos of women showering, racist and misogynist content, and footage of women being asphyxiated in plastic bags.”

That particular pornography website is now subject to multiple lawsuits launched against its parent company, MindGeek, by victims whose abuse was published on the site. Plaintiffs include victims of image-based sexual abuse in the UK, such as Crystal Palace footballer Leigh Nicol. Her phone was hacked, and private content was uploaded to Pornhub without her knowledge. She bravely and generously shared her experience in an interview for Sky Sports News, saying:

“The damage is done for me so this is about the next generation. I feel like prevention is better than someone having to react to this. I cannot change it alone but if I can raise awareness to stop it happening to others then that is what I want to do… The more that you dig into this, the more traumatising it is because there are 14-year-old kids on these websites and they don’t even know about it. The fact that you can publish videos that have neither party’s consent is something that has to be changed by law, for sure.”

I agree. It is grotesque that pornography website operators do not even bother to verify that everyone featured in films on their sites is an adult or even gave permission for the film to be uploaded. That cannot be allowed to continue.

These amendments, which I hope will receive the cross-party backing that they strongly deserve, would stop pornography websites publishing and profiting from videos of rape and child sexual abuse by requiring them to implement the most basic of prevention measures.

Kirsty Blackman

I support the hon. Member’s amendments. The cases that she mentions hammer home the need for women and girls to be mentioned in the Bill. I do not understand how the Government can justify not doing so when she is absolutely laying out the case for doing so.

Alex Davies-Jones

I agree with the hon. Member and welcome her intervention. We will be discussing these issues time and again during our proceedings. What is becoming even more apparent is the need to include women and girls in the Bill, call out violence against women and girls online for what it is, and demand that the Government go further to protect women and girls. This is yet another example of where action needs to happen. I hope the Minister is hearing our pleas and that this will happen at some point as we make progress through the Bill.

More needs to be done to tackle this problem. Pornography websites need to verify that every individual in pornographic videos published on their site is an adult and gave their permission for the video to be published, and enable individuals to withdraw their consent for pornography of them to remain on the site. These are rock-bottom safety measures for preventing the most appalling abuses on pornography websites.

Kim Leadbeater

I add my voice to the arguments made by my hon. Friend and the hon. Member for Aberdeen North. Violence against women and girls is a fundamental issue that the Bill needs to address. We keep coming back to that, and I too hope that the Minister hears that point. My hon. Friend has described some of the most horrific harms. Surely, this is one area where we have to be really clear. If we are to achieve anything with the Bill, this is an area that we should be working on.

Alex Davies-Jones

I wholeheartedly agree with my hon. Friend. As I have said, the amendments would put in place rock-bottom safety measures that could prevent the most appalling abuses on pornography websites, and it is a scandal that, hitherto, they have not been implemented. We have the opportunity to change that today by voting for the amendments and ensuring that these measures are in place. I urge the Minister and Conservative Members to do the right thing.

Dame Maria Miller

I thank the hon. Lady for giving way. I can understand the intent behind what she is saying and I have a huge amount of sympathy for it, but we know as a matter of fact that many of the images that are lodged on these sorts of websites were never intended to be pornographic in the first place. They may be intimate images taken by individuals of themselves—or, indeed, of somebody else—that are then posted as pornographic images. I am slightly concerned that an image such as that may not be caught by the hon. Lady’s amendments. Would she join me in urging the Government to bring forward the Law Commission’s recommendations on the taking, making and sharing of intimate images online without consent, which are far broader? They would probably do what she wants to do but not run into the problem of whether an image was meant to be pornographic in the first place.

Alex Davies-Jones

I am grateful to the right hon. Member for her intervention. She knows that I have the utmost respect for all that she has tried to achieve in this area in the House along with my right hon. Friend the Member for Kingston upon Hull North.

We feel these amendments would address the specific issue of image or video content for which consent has not been obtained. Many of these people do not even know that the content has been taken in the first place, before it is uploaded to these websites. It would be the website’s duty to verify that consent had been obtained and that the people featured in the video were adults. That is why we urge hon. Members to back the amendments.

Chris Philp

The shadow Minister has compellingly laid out how awful it is for pornography websites to display images of children, or images of people whose consent has not been obtained. Let me take each of those in turn, because my answers will be a bit different in the two cases.

First, all material that contains the sexual abuse of children or features children at all—any pornographic content featuring children is, by definition, sexual abuse—is already criminalised through the criminal law. Measures such as the Protection of Children Act 1978, the Criminal Justice Act 1988 and the Coroners and Justice Act 2009 provide a range of criminal offences that include the taking, making, circulating, possessing with a view to distributing, or otherwise possessing indecent photos or prohibited images of children. As we would expect, everything that the hon. Lady described is already criminalised under existing law.

This part of the Bill—part 5—covers publishers and not the user-to-user stuff we talked about previously. Because they are producing and publishing the material themselves, publishers of such material are covered by the existing criminal law. What they are doing is already illegal. If they are engaged in that activity, they should—and, I hope, will—be prosecuted for doing it.

The new clause and the amendments essentially seek to duplicate what is already set out very clearly in criminal law. While their intentions are completely correct, I do not think it is helpful to have duplicative provisions that try to achieve the same thing in a different place. We have well established and effective criminal laws in these areas.

In relation to the separate question of people whose images are displayed without their consent, which is a topic that my right hon. Friend the Member for Basingstoke has raised a few times, there are existing criminal offences that are designed to tackle that, including the recent revenge pornography offences in particular, as well as the criminalisation of voyeurism, harassment, blackmail and coercive or controlling behaviour. There is then the additional question of intimate image abuse, where intimate images are produced or obtained without the consent of the subject, and are then disseminated.

--- Later in debate ---
Alex Davies-Jones

I welcome the Minister’s comments, his commitment to look at this further, and the taking forward of the Law Commission’s review. With that in mind, I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Clause 68 ordered to stand part of the Bill.

Ordered, That further consideration be now adjourned—(Steve Double.)