Damian Collins debates involving the Department for Digital, Culture, Media & Sport during the 2019-2024 Parliament

Tue 17th Jan 2023
Mon 9th Jan 2023: Channel 4 (Urgent Question), Commons Chamber
Thu 15th Dec 2022: ONLINE SAFETY BILL (Third sitting), Public Bill Committees, Committee stage (re-committed clauses and schedules): 3rd sitting
Tue 13th Dec 2022: ONLINE SAFETY BILL (Second sitting), Public Bill Committees, Committee stage (re-committed clauses and schedules): 2nd sitting
Tue 13th Dec 2022: ONLINE SAFETY BILL (First sitting), Public Bill Committees, Committee stage (re-committed clauses and schedules): 1st sitting
Mon 5th Dec 2022: Online Safety Bill

Online Safety Bill

Damian Collins Excerpts
Monday 5th December 2022

Alex Davies-Jones

I am aware of that case, which is truly appalling and shocking. That is exactly why we need such protections in the Bill: to stop those cases proliferating online, to stop the platforms from choosing their own terms of service, and to give Ofcom real teeth, as a regulator, to take on those challenges.

Damian Collins (Folkestone and Hythe) (Con)

Does the hon. Lady accept that the Bill does give Ofcom the power to set minimum safety standards based on the priority legal offences written into the Bill? That would cover almost all the worst kinds of offences, including child sexual exploitation, inciting violence and racial hatred, and so on. Those are the minimum safety standards that are set, and the Bill guarantees them.

Alex Davies-Jones

What is not in those minimum safety standards is all the horrendous and harmful content that I have described: covid disinformation, harmful content from state actors, self-harm promotion, antisemitism, misogyny and the incel culture, all of which is proliferating online and being amplified by the algorithms. This set of minimum safety standards can be changed overnight.

Damian Collins

As the hon. Lady knows, foreign-state disinformation is covered because it is part of the priority offences listed in the National Security Bill, so those accounts can be disabled. Everything that meets the criminal threshold is in this Bill because it is in the National Security Bill, as she knows. The criminal thresholds for all the offences she lists are set in schedule 7 of this Bill.

Alex Davies-Jones

That is just the problem, though, isn’t it? A lot of those issues would not be covered by the minimum standards—that is why we have tabled new clause 4—because they do not currently meet the legal threshold. That is the problem. There is a grey area of incredibly harmful but legal content, which is proliferating online, being amplified by algorithms and by influencers—for want of a better word—and being fed to everybody online. That content is then shared incredibly widely, and that is what is causing harm and disinformation.

Damian Collins

Will the hon. Lady give way one more time?

Alex Davies-Jones

No, I will not. I need to make progress; we have a lot to cover and a lot of amendments, as I have outlined.

Under the terms of the Bill, platforms can issue whatever minimum standards they wish and then simply change them at will overnight. In tabling new clause 4, our intention is to ensure that the platforms are not able to avoid safety duties by changing their terms and conditions. As I have said, this group of amendments will give Ofcom the relevant teeth to act and keep everybody safe online.

We all recognise that there will be a learning curve for everyone involved once the legislation is enacted. We want to get that right, and the new clauses will ensure that platforms have specific duties to keep us safe. That is an important point, and I will continue to make it clear at every opportunity, because the platforms and providers have, for far too long, got away with zero regulation—nothing whatsoever—and enough is enough.

During the last Report stage, I made it clear that Labour considers individual liability essential to ensuring that online safety is taken seriously by online platforms. We have been calling for stronger criminal sanctions for months, and although we welcome some movement from the Government on that issue today, enforcement is now ultimately a narrower set of measures because the Government gutted much of the Bill before Christmas. That last-minute U-turn is another one to add to a long list, but, to be frank, very little surprises me when it comes to this Government’s approach to law-making.

--- Later in debate ---
Luke Pollard (Plymouth, Sutton and Devonport) (Lab/Co-op)

I rise to speak in favour of new clause 4, on minimum standards. In particular, I shall restrict my remarks to minimum standards in respect of incel culture.

Colleagues will know of the tragedy that took place in Plymouth in 2021. Indeed, the former Home Secretary, the right hon. Member for Witham (Priti Patel), visited Plymouth to meet and have discussions with the people involved. I really want to rid the internet of the disgusting, festering incel culture that is capturing so many of our young people, especially young men. In particular, I want minimum standards to apply and to make sure that, on big and small platforms where there is a risk, those minimum standards include the recognition of incel content. At the moment, incel content is festering in the darkest corners of the internet, where young men are taught to channel their frustrations into an insidious hatred of women and to think of themselves as brothers in arms in a war against women. It is that serious.

In Parliament this morning I convened a group of expert stakeholders, including those from the Centre for Countering Digital Hate, Tech Against Terrorism, Moonshot, Girlguiding, the Antisemitism Policy Trust and the Internet Watch Foundation, to discuss the dangers of incel culture. I believe that incel culture is a growing threat online, with real-world consequences. Incels are targeting young men, young people and children to swell their numbers. Andrew Tate may not necessarily be an incel, but his type of hate and division is growing and is very popular online. He is not the only one, and the model of social media distribution that my right hon. Friend the Member for Barking (Dame Margaret Hodge) spoke about incentivises hate to be viewed, shared and indulged in.

This Bill does not remove incel content online and therefore does not prevent future tragedies. As chair of the all-party parliamentary group on social media, I want to see minimum standards to raise the internet out of the sewer. Where is the compulsion for online giants such as Facebook and YouTube to remove incel content? Five of the most popular incel channels on YouTube have racked up 140,000 subscribers and 24 million views between them, and YouTube is still platforming four of those five. Why? How can these channels apparently pass YouTube’s current terms and conditions? The content is truly harrowing. In these YouTube videos, men who have murdered women are described as saints and lauded in incel culture.

We know that incels use mainstream platforms such as YouTube to reel in unsuspecting young men—so-called normies—before linking them to their own small, specialist websites that show incel content. This is called breadcrumbing: driving traffic and audiences from mainstream platforms to smaller platforms—which will be outside the scope of category 1 provisions and therefore any minimum standards—where individuals start their journey to incel radicalisation.

I think we need to talk less about freedom of speech and more about freedom of reach. We need to talk about enabling fewer and fewer people to see that content, and about down-ranking sites with appalling content like this to increase the friction to reduce audience reach. Incel content not only includes sexist and misogynist material; it also frequently includes anti-Semitic, racist, homophobic and transphobic items layered on top of one another. However, without a “legal but harmful” provision, the Bill does nothing to force search engines to downrate harmful content. If it is to be online, it needs to be harder and harder to find.

I do not believe that a toggle will be enough to deal with this. I agree with amendment 43—if we are to have a toggle, the default should be the norm—but I do not think a toggle will work, because it will be possible to evade it with a simple Google Chrome extension that will auto-toggle and therefore make it almost redundant immediately. It will be a minor inconvenience, not a game changer. Some young men spend 10 hours a day looking at violent incel content online. Do we really think that a simple button, a General Data Protection Regulation annoyance button, will stop them from doing so? It will not, and it will not prevent future tragedies.

However, this is not just about the effect on other people; it is also about the increase in the number of suicides. One of the four largest incel forums is dedicated to suicide and self-harm. Suicide is normalised in the forum, and is often referred to as “catching the bus.” People get together to share practical advice on how they can take their own lives. That is not content to which we should be exposing our young people, but it is currently legal. It is harmful, but it will remain legal under the Bill because the terms and conditions of those sites are written by incels to promote incel content. Even if the sites were moved from category 2 to category 1, they would still pass the tests in the Bill, because the incels have written the terms and conditions to allow that content.

Why are smaller platforms not included in the Bill? Ofcom should have the power to bring category 2 sites into scope on the basis of risk. Analysis conducted by the Center for Countering Digital Hate shows that on the largest incel website, rape is mentioned in posts every 29 minutes, with 89% of those posts referring to it in a positive sense. Moreover, 50% of users’ posts about child abuse on the same site are supportive of paedophilia. Indeed, the largest incel forum has recently changed its terms and conditions to allow mention of the sexualisation of pubescent minors—unlike pre-pubescent minors; it makes that distinction. This is disgusting and wrong, so why is it not covered in the Bill? I think there is a real opportunity to look at incel content, and I would be grateful if the Minister met the cross-party group again to discuss how we can ensure that it is harder and harder to find online and is ultimately removed, so that we can protect all our young people from going down this path.

Damian Collins

My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) made an excellent speech about new clause 2, a clause with which I had some sympathy. Indeed, the Joint Committee that I chaired proposed that there should be criminal liability for failure to meet the safety duties set out in the Bill, and that that should apply not just to child safety measures, but to any such failure.

However, I agree with my right hon. and learned Friend that, as drafted, the new clause is too wide. If it is saying that the liability exists when the failure to meet the duties has occurred, who will be the determinant of that factor? Will it be determined when Ofcom has issued a notice, or when it has issued a fine? Will it be determined when guidance has been given and has not been followed? What we do not want to see is a parallel judicial system in which decisions are made that differ from those of the regulator on whether the safety duties have been met.

I think it is when there are persistent breaches of the safety duties, when companies have probably already been fined and issued with guidance, and when it has been demonstrated that they are clearly in breach of the codes of practice and are refusing to abide by them, that the criminal liability should come in. Similar provisions already exist in the GDPR legislation for companies that are in persistent breach of their duties and obligations. The Joint Committee recommended that this should be included in the Bill, and throughout the journey of this legislation the inclusion of criminal liability has been consistently strengthened. When the draft Bill was published there was no immediate commencement of any criminal liability, even for not complying with the information notices given by Ofcom, but that was included when the Bill was presented for Second Reading. I am pleased that the Government are now going to consider how we can correctly define what a failure to meet the safety duties would be and therefore what the criminal sanction that goes with it would be. That would be an important measure for companies that are in serial breach of their duties and obligations and have no desire to comply.

--- Later in debate ---
Mrs Theresa May (Maidenhead) (Con)

My hon. Friend has referenced the proposals from my hon. Friend the Member for Dover (Mrs Elphicke). I am grateful to the Minister and the Secretary of State for the discussions they have had with me on making modern slavery a specific priority offence, as well as illegal immigration. I think this is very important.

Damian Collins

I agree with my right hon. Friend; that is exactly right, and it is also right that we look at adding further offences to schedule 7, on the face of the Bill, so that they are considered as part of the legislation.

Where this touches on advertising, the Government have already accepted, following the recommendation of the Joint Committee, that the promotion of fraud should be regulated in the Bill, even if it is in advertising. There are other aspects of this, too, including modern slavery and immigration, where we need to move at pace to close the loophole whereby consideration was to be given to advertising outside the Bill, through the online advertising review. The principle has already been accepted that illegal activity promoted through an advert on an online platform should be regulated just as it would be if it were an organic posting. That general provision does not yet exist, however. Given that the Government have considered these additional amendments, which was the right thing to do, they also need to look at the general presumption that any illegal activity that is a breach of the safety duties should be included and regulated, and that including it in an advert does not make it exempt when it would be regulated as an organic posting.

Matt Rodda (Reading East) (Lab)

I would like to focus on new clause 1, dealing with redress, new clause 43, dealing with the toggle default, and new clause 4 on minimum standards. This Bill is a very important piece of legislation, but I am afraid that it has been seriously watered down by the Government. In particular, it has been seriously weakened by the removal of measures to tackle legal but harmful content. I acknowledge that some progress has been made recently, now that the Government have accepted the need for criminal sanctions for senior managers of tech companies. However, there are still many gaps in the Bill and I want to deal with some of them in the time available to me tonight.

First, I pay tribute to the families who have lost children due to issues related to social media. Some of those families are in the Public Gallery tonight. In particular, I want to mention the Stephens family from my Reading East constituency. Thirteen-year-old Olly Stephens was murdered in an horrific attack following a plot hatched on social media. The two boys who attacked Olly had both shared dozens of images of knives online, and they used 11 different social media platforms to do so. Sadly, none of the platforms took down the content, which is why these matters are so important to all of us and our communities.

Following this awful case, I support a number of new clauses that I believe would lead to a significant change in the law to prevent a similar tragedy. I stress the importance of new clause 1, which would help parents to make complaints. As Olly’s dad, Stuart, often says, “You simply cannot contact the tech companies. You send an email and get no reply.” It is important to tackle this matter, and I believe that new clause 1 would go some way towards doing that.

As others have said, surely it makes sense for parents to know their children have some protection from harmful content. New clause 43 would provide reassurance by introducing a default position of protecting children. I urge Members on both sides of the House to support this new clause. Both children and vulnerable adults should be better protected from legal but harmful content, and further action should be taken. New clause 43 would take clear steps in that direction.

I am aware of time, and I support many other important new clauses. I reiterate my support and backing for my Front-Bench colleague, my hon. Friend the Member for Pontypridd (Alex Davies-Jones). Thank you, Madam Deputy Speaker, for the opportunity to contribute to this debate.

--- Later in debate ---
Paul Scully

For the purpose of future-proofing, we have tried to make the Bill as flexible and as technologically neutral as possible so that it can adapt to changes. I think we will need to review it, and indeed I am sure that, as technology changes, we will come back with new legislation in the future to ensure that we continue to be world-beating—but let us see where we end up with that.

Damian Collins

May I follow up my hon. Friend’s response to our right hon. Friend the Member for Bromsgrove (Sajid Javid)? If it is the case that coroners cannot access data and information that they need in order to go about their duties—which was the frustrating element in the Molly Russell case—will the Government be prepared to close that loophole in the House of Lords?

Paul Scully

We will certainly work with others to address that, and if there is a loophole, we will seek to act, because we want to ensure—

--- Later in debate ---
Michelle Donelan

I met my right hon. Friend today to discuss that very point, which is particularly important and powerful. I look forward to continuing to work with her and the Ministry of Justice as we progress this Bill through the other place.

The changes are balanced with new protections for free speech and journalism—two of the core pillars of our democratic society. There are amendments to the definition of recognised news publishers to ensure that sanctioned outlets such as RT cannot benefit.

Since becoming Secretary of State I have made a number of my own changes to the Bill. First and foremost, we have gone even further to boost protections for children. Social media companies will face a new duty on age limits so they can no longer turn a blind eye to the estimated 1.6 million underage children who currently use their sites. The largest platforms will also have to publish summaries of their risk assessments for illegal content and material that is harmful for children—finally putting transparency for parents into law.

I believe it is blindingly obvious and morally right that we should have a higher bar of protection when it comes to children. Things such as cyber-bullying, pornography and posts that depict violence do enormous damage. They scar our children and rob them of their right to a childhood. These measures are all reinforced by children and parents, who are given a real voice in the legislation by the inclusion of the Children’s Commissioner as a statutory consultee. The Bill already included provisions to make senior managers liable for failure to comply with information notices, but we have now gone further. Senior managers who deliberately fail children will face criminal liability. Today, we are drawing our line in the sand and declaring that the UK will be the world’s first country to comprehensively protect children online.

Those changes are completely separate to the changes I have made for adults. Many Members and stakeholders had concerns over the “legal but harmful” section of the Bill. They were concerned that it would be a serious threat to legal free speech and would set up a quasi-legal grey area where tech companies would be encouraged to take down content that is perfectly legal to say on our streets. I shared those concerns, so we have removed “legal but harmful” for adults. We have replaced it with a much simpler and fairer and, crucially, much more effective mechanism that gives adults a triple shield of protection. If it is illegal, it has to go. If it is banned under the company’s terms and conditions, it has to go.

Lastly, social media companies will now offer adults a range of tools to give them more control over what they see and interact with on their own feeds.

Damian Collins

My right hon. Friend makes an important point about things that are illegal offline but legal online. The Bill has still not defined a lot of content that could be illegal and yet promoted through advertising. As part of their ongoing work on the Bill and the online advertising review, will the Government establish the general principle that content that is illegal will be regulated whether it is an ad or a post?

Michelle Donelan

I completely agree with my hon. Friend on the importance of this topic. That is exactly why we have the online advertising review, a piece of work we will be progressing to tackle the nub of the problem he identifies. We are protecting free speech while putting adults in the driving seat of their own online experience. The result is today’s Bill.

I thank hon. Members for their hard work on this Bill, including my predecessors, especially my right hon. Friend the Member for Mid Bedfordshire (Ms Dorries). I thank all those I have worked with constructively on amendments, including my hon. Friends the Members for Penistone and Stocksbridge (Miriam Cates), for Stone (Sir William Cash), for Dover (Mrs Elphicke), for Rutland and Melton (Alicia Kearns), and my right hon. Friends the Members for South Holland and The Deepings (Sir John Hayes), for Chelmsford (Vicky Ford), for Basingstoke (Dame Maria Miller) and for Romsey and Southampton North (Caroline Nokes).

I would like to put on record my gratitude for the hard work of my incredibly dedicated officials—in particular, Sarah Connolly, Orla MacRae and Emma Hindley, along with a number of others; I cannot name them all today, but I note their tremendous and relentless work on the Bill. Crucially, I thank the charities and devoted campaigners, such as Ian Russell, who have guided us and pushed the Bill forward in the face of their own tragic loss. Thanks to all those people, we now have a Bill that works.

Legislating online was never going to be easy, but it is necessary. It is necessary if we want to protect our values —the values that we protect in the real world every single day. In fact, the NSPCC called this Bill “a national priority”. The Children’s Commissioner called it

“a once-in-a-lifetime opportunity to protect all children”.

But it is not just children’s organisations that are watching. Every parent across the country will know at first hand just how difficult it is to shield their children from inappropriate material when social media giants consistently put profit above children’s safety. This legislation finally puts it right.

Channel 4

Damian Collins Excerpts
Monday 9th January 2023

Commons Chamber

Michelle Donelan

The creative sector is important to the whole UK economy, not just to London. That is why I am delighted that, as part of this package, Channel 4 has also agreed to double the number of jobs outside London, which goes to the hon. Gentleman’s point that it is important that we are boosting the creative sector all around the UK.

Damian Collins (Folkestone and Hythe) (Con)

I agree with my right hon. Friend that reform is needed for Channel 4 to thrive in the future. Can she say whether the review will include Channel 4 having the ability not just to take a stake in programmes, which it cannot do at the moment, but to attract additional investment to go into programme making, as Channel 4 requested as part of its response to the Government’s review?

Michelle Donelan

On access to borrowing, we will make it easier for Channel 4 to draw down on its existing allowance, but any additional borrowing will be considered on a case-by-case basis.

ONLINE SAFETY BILL (Third sitting)

Damian Collins Excerpts
Committee stage (re-committed clauses and schedules)
Thursday 15th December 2022

Public Bill Committees

Paul Scully

Ofcom will assess services that are close to meeting the threshold conditions of category 1 services and will publish a publicly available list of those emerging high-risk services. A service would have to meet two conditions to be added to the emerging services list: it would need at least 75% of the user numbers specified in any category 1 threshold condition, and at least one functionality of a category 1 threshold condition, or one specified combination of a functionality and a characteristic or factor of a category 1 threshold condition.

Ofcom will monitor the emergence of new services. If it becomes apparent that a service has grown sufficiently to meet the threshold of becoming a category 1 service, Ofcom will be required to add that service to the register. The new clause and the consequential amendments take into account the possibility of quick growth.

Following the removal of “legal but harmful” duties, category 1 services will be subject to new transparency, accountability and free speech duties, as well as duties relating to protection for journalists and democratic content. Requiring all companies to comply with that full range of category 1 duties would pose a disproportionate regulatory burden on smaller companies that do not exert the same influence on public discourse, and that would possibly divert those companies’ resources away from tackling vital tasks.

Damian Collins (Folkestone and Hythe) (Con)

Will my hon. Friend confirm that the risk assessments for illegal content—the priority illegal offences; the worst kind of content—apply to all services, whether or not they are category 1?

Paul Scully

My hon. Friend is absolutely right. All companies will still have to tackle the risk assessment, and will have to remove illegal content. We are talking about the extra bits that could take a disproportionate amount of resource from core functions that we all want to see around child protection.

Paul Scully

Absolutely. The Department has techniques for dealing with misinformation and disinformation as well, but we will absolutely push Ofcom to work as quickly as possible. As my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), the former Secretary of State, has said, once an election is done, it is done and it cannot be undone.

Damian Collins

Could the Minister also confirm that the provisions of the National Security Bill read across to the Online Safety Bill? Where disinformation is disseminated by networks operated by hostile foreign states, particularly Russia, as has often been the case, that is still in scope. That will still require a risk assessment for all platforms, whether or not they are category 1.

Paul Scully

Indeed. We need to take a wide-ranging, holistic view of disinformation and misinformation, especially around election times. There is a suite of measures available to us, but it is still worth pushing Ofcom to make sure that it works as quickly as possible.

Amendment 48 agreed to.

Amendment made: 49, in clause 82, page 72, line 23, after “conditions” insert

“or the conditions in section (List of emerging Category 1 services)(2)”.—(Paul Scully.)

This is a technical amendment ensuring that references to assessments of user-to-user services in the new clause inserted by NC7 relate to the user-to-user part of the service.

Clause 82, as amended, ordered to stand part of the Bill.

Schedule 11

Categories of regulated user-to-user services and regulated search services: regulations

--- Later in debate ---
Alex Davies-Jones

I rise briefly to support everything the hon. Member for Aberdeen North just said. We have long called for the Bill to take a harm-led approach; indeed, the Government initially agreed with us: in its first iteration it was called the Online Harms Bill rather than the Online Safety Bill. Addressing harm must be a central focus of the Bill, as we know extremist content is perpetuated on smaller, high-harm platforms; this is something that the Antisemitism Policy Trust and Hope not Hate have long called for with regard to the Bill.

I want to put on the record our huge support for the amendment. Should the hon. Lady be willing to push it to a vote—I recognise that we are small in number—we will absolutely support her.

Damian Collins

I want to speak briefly to the amendment. I totally understand the reasons that the hon. Member for Aberdeen North has tabled it, but in reality, the kinds of activities she describes would be captured anyway, because most would fall within the remit of the priority illegal harms that all platforms and user-to-user services have to follow. If there were occasions when they did not, being included in category 1 would mean that they would be subject to the additional transparency of terms of service, but the smaller platforms that allow extremist behaviour are likely to have extremely limited terms of service. We would be relying on the priority illegal activity to set the minimum safety standards, which Ofcom would be able to do.

It would also be an area where we would want to move at pace. Even if we wanted to bring in extra risk assessments on terms of service that barely exist, the time it would take to do that would not give a speedy resolution. It is important that in the way Ofcom exercises its duties, it does not just focus on the biggest category 1 platforms but looks at how risk assessments for illegal activity are conducted across a wide range of services in scope, and that it has the resources needed to do that.

Even within category 1, it is important that is done. We often cite TikTok, Instagram and Facebook as the biggest platforms, but I recently spoke to a teacher in a larger secondary school who said that by far the worst platform they have to deal with in terms of abuse, bullying, intimidation, and even sharing of intimate images between children, is Snapchat. We need to ensure that those services get the full scrutiny they should have, because they are operating at the moment well below their stated terms of service, and in contravention of the priority illegal areas of harm.

--- Later in debate ---
Paul Scully

I am glad that we are all in agreement on the need for a review. It is important that we have a comprehensive and timely review of the regulatory regime and how it is built into legislation, and that we understand whether the legislation is having the impact that we intend.

The legislation clearly sets out what the review must consider: how Ofcom is carrying out its role, and whether the legislation is effective in dealing with child protection, which, as the hon. Lady rightly says, is its core purpose. We have struck the balance of specifying two to five years after the regime comes into force because that provides a degree of flexibility for future Ministers to judge when the review should happen. None the less, I take the hon. Lady’s point that technology is developing. That is why this legislation is a front-footed first move, with other countries looking at what we are doing; because of its less prescriptive approach to technologies, it can remain flexible and adapt to emerging technologies. Inevitably, this will not be the last word. Some of the things in the Digital Economy Act 2017, for example, are already out of date, as is some of the other legislation put in place in the early 2000s. We will inevitably come back to this, but I think we have the right balance at the moment in terms of timing.

I do not think we need to bed in whom we consult, but wider consultation will none the less be necessary to ascertain the effectiveness of the legislation.

Damian Collins

I am following carefully what the Minister says, but I would say briefly that a lot of the debate we have had at all stages of the Bill has rested on how we believe Ofcom will use the powers it has been given, and we need to make sure that it does that. We need to ensure that it is effective and that it has the resources it needs. The hon. Member for Aberdeen North (Kirsty Blackman) makes an important point that it may not be enough to rely on a Select Committee of the Lords or the Commons having the time to do that in the detail we would want. We might need to consider either a post-legislative scrutiny Committee or some other mechanism to ensure that there is the necessary level of oversight.

Paul Scully

My hon. Friend is absolutely right. The report, as things stand, obviously has to be laid before Parliament and will form part of the package of parliamentary scrutiny. But, yes, we will consider how we can utilise the expertise of both Houses in post-legislative scrutiny. We will come back on that.

Question put and agreed to.

Clause 155, as amended, accordingly ordered to stand part of the Bill.

Clause 169

Individuals providing regulated services: liability

Amendment made: 57, in clause 169, page 143, line 15, at end insert—

“(fa) Chapter 2A of Part 4 (terms of service: transparency, accountability and freedom of expression);”.—(Paul Scully.)

Clause 169 is about liability of providers who are individuals. This amendment inserts a reference to Chapter 2A, which is the new Chapter expected to be formed by NC3 to NC6, so that individuals may be jointly and severally liable for the duties imposed by that Chapter.

Clause 169, as amended, ordered to stand part of the Bill.

Clause 183 ordered to stand part of the Bill.

Schedule 17

Video-sharing platform services: transitional provision etc

Amendments made: 94, in schedule 17, page 235, line 43, leave out paragraph (c).

This amendment is consequential on Amendment 6 (removal of clause 12).

Amendment 95, in schedule 17, page 236, line 27, at end insert—

“(da) the duties set out in sections (Duty not to act against users except in accordance with terms of service) and (Further duties about terms of service) (terms of service);”.—(Paul Scully.)

This amendment ensures that services already regulated under Part 4B of the Communications Act 2003 (video-sharing platform services) are not required to comply with the new duties imposed by NC3 and NC4 during the transitional period.

Question proposed, That the schedule, as amended, be the Seventeenth schedule to the Bill.

--- Later in debate ---
Paul Scully

The clause provides legal certainty about the meaning of those terms as used in the Bill: things such as “content”, “encounter”, “taking down” and “terms of service”. That is what the clause is intended to do. It is intentional and is for the reasons the hon. Lady said. Oral means speech and speech only. Aural is speech and other sounds, which is what can be heard on voice calls. That includes music as well. One is speech. The other is the whole gamut.

Damian Collins

I am intrigued, because the hon. Member for Aberdeen North makes an interesting point. It is not one I have heard made before. Does the Minister think there is a distinction between oral and aural, where oral is live speech and aural is pre-recorded material that might be played back? Are those two considered distinct?

Paul Scully

My knowledge is being tested, so I will write to the hon. Member for Aberdeen North and make that available to the Committee. Coming back to the point about oral and aural that she made on Tuesday, in relation to another clause on the exclusions: as I said, we have a narrow exemption to ensure that traditional phone calls are not subject to regulation. But that does mean that if a service such as Fortnite, which she spoke about previously, enables adults and children to have one-to-one oral calls, companies will still need to address the functionality surrounding how that happens, because enabling it might cause harm—for example, if an adult can contact an unknown child. That is still captured within the Bill.

--- Later in debate ---
Damian Collins

I want to briefly speak on this amendment, particularly as my hon. Friend the Member for Don Valley referenced the report by the Joint Committee, which I chaired. As he said, the Joint Committee considered the question of systematic abuse. A similar provision exists in the data protection legislation, whereby any company that is consistently in breach could be considered to have failed in its duties under the legislation and there could be criminal liability. The Joint Committee considered whether that should also apply with the Online Safety Bill.

As the Bill has gone through its processes, the Government have brought forward the commencement of criminal liability for information offences, whereby if a company refuses to respond to requests for information or data from the regulator, that would be a breach of their duties; it would invoke criminal liability for a named individual. However, I think the question of a failure to meet the safety duty set out in the Bill really needs to be framed along the lines of being a systematic and persistent breach, as the Joint Committee recommended. If, for example, a company was prepared to ignore requests from Ofcom, use lawyers to evade liability for as long as possible and consistently pay fines for serious breaches without ever taking responsibility for them, what would we do then? Would there be some liability at that point?

The amendment drafted by my hon. Friend the Member for Stone (Sir William Cash) is based on other existing legislation, and on there being knowledge—with “consent or connivance”. We can see how that would apply in cases such as the diesel emissions concerns raised at Volkswagen, where there was criminal liability, or maybe the LIBOR bank rate rigging and the serious failures there. In those cases, what was discovered was senior management’s knowledge and connivance; they were part of a process that they knew was illegal.

With the amendment as drafted, the question we would have is: could it apply to any failure? Where management could say, “We have created a system to address this, but it has not worked on this occasion”, would that trigger it? Or is it something broader and more systematic? These failures will be more about the failure to design a regime that takes into account the stated duties, rather than a particular individual act, such as the rigging of the LIBOR rates or giving false public information on diesel emissions, which could only be made at a corporate level.

When I chaired the Joint Committee, we raised the question, “What about systematic failure, as we have that as an offence in data protection legislation?” I still think that would be an interesting question to consider when the Bill goes to another place. However, I have concerns that the current drafting would not fit quite as well in the online safety regime as it does in other industries. It would really need to reflect consistent, persistent failures on behalf of a company that go beyond the criminal liabilities that already exist in the Bill around information offences.

The Chair

Just to be clear, it is new clause 9 that we are reading a Second time, not an amendment.

Damian Collins

Forgive me, Dame Angela.

Caroline Ansell (Eastbourne) (Con)

I rise to recognise the spirit and principle behind new clause 9, while, of course, listening carefully to the comments made by my hon. Friend the Member for Folkestone and Hythe. He is right to raise those concerns, but my question is: is there an industry-specific way in which the same responsibility and liability could be delivered?

I recognise too that the Bill is hugely important. It is a good Bill that has child protection at its heart. It also contains far more significant financial penalties than we have previously seen—as I understand it, 10% of qualifying revenue up to £18 million. This will drive some change, but it comes against the backdrop of multi-billion-pound technology companies.

I would be interested to understand whether a double lock around board-level responsibility might further protect children from some of the harrowing and harmful content we see online. What we need is nothing short of transformation and significant culture change. Even today, The Guardian published an article about TikTok and a study by the Centre for Countering Digital Hate, which found that teenagers who demonstrated an interest in self-harm and eating disorders had that content pushed to them by algorithms within minutes. That is most troubling.

We need significant, serious and sustained culture change. There is precedent in other sectors, as has been mentioned, and there was a previous recommendation, so clearly there is merit in this. My understanding is that there is strong public support, because the public recognise that this new responsibility cannot be strengthened by anything other than liability. If there is board-level liability, that will drive priorities and resources, which will broker the kind of change we are looking for. I look forward to what the Minister might share today, as this has been a good opportunity to bring these issues into further consideration, and they might then be carried over into subsequent stages of this excellent Bill.

ONLINE SAFETY BILL (Second sitting)

Damian Collins Excerpts
Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022

Public Bill Committees

Paul Scully

Hate crime legislation will always be considered by the Ministry of Justice, but I am not committing to any changes. That is beyond my reach, but the two shields that we talked about are underpinned by a safety net.

Damian Collins (Folkestone and Hythe) (Con)

Does my hon. Friend agree that the risk assessments that will be done on the priority illegal offences are very wide ranging, in addition to the risk assessments that will be done on meeting the terms of service? They will include racially and religiously motivated harassment, and putting people in fear of violence. A lot of the offences that have been discussed in the debate would already be covered by the adult safety risk assessments in the Bill.

Paul Scully

I absolutely agree. As I said in my opening remarks about the racial abuse picked up in relation to the Euro 2020 football championship, that would have been against the terms and conditions of all those platforms, but it still happened because the platforms were not enforcing those terms and conditions. Whether we put them on a list in the Bill or talk about them in the terms of service, they need to be enforced, but the terms of service are there.

Damian Collins

On that point, does my hon. Friend also agree that the priority legal offences are important too? People were prosecuted for what they posted on Twitter and Instagram about the England footballers, so that shows that we understand what racially motivated offences are and that people are prosecuted for them. The Bill will require a minimum regulatory standard that meets that threshold and requires companies to act in cases such as that one, where we know what this content is, what people are posting and what is required. Not only will the companies have to act, but they will have to complete risk assessments to demonstrate how they will do that.

Paul Scully

Indeed. I absolutely agree with my hon. Friend, and that is a good example of enforcement being used. People can be prosecuted if such abuse appears on social media, but a black footballer, who would otherwise have seen that racial abuse, can choose, through the user empowerment tools, to turn it off so that he does not see it. That does not mean that we cannot pursue a prosecution for racial abuse via a third-party complaint or via the platform.

--- Later in debate ---
As a result of the Bill, people will be able to stop seeing content on YouTube, for example, promoting eating disorders, but they might not be able to stop seeing content promoting online poker sites, when that might be causing a significant issue for their health, so not including that is a bit of an oversight. As I say, user empowerment is important, but the Government have not implemented it in nearly as good a way as they should have done, and the Opposition amendments would make the Government amendments better.
Damian Collins

I rise briefly to say that the introduction of the shields is a significant additional safety measure in the Bill and shows that the Government have thought about how to improve certain safety features as the Bill has progressed.

In the previous version of the Bill, as we have discussed at length, there were the priority legal offences that companies had to proactively identify and mitigate, and there were the measures on transparency and accountability on the terms of service. However, if pieces of content fell below the threshold for the priority legal offences or were not covered, or if they were not addressed in the terms of service, the previous version of the Bill never required the companies to act in any particular way. Reports might be done by Ofcom raising concerns, but there was no requirement for further action to be taken if the content was not a breach of platform policies or the priority safety duties.

The additional measure before us says that there may be content where there is no legal basis for removal, but users nevertheless have the right to have that content blocked. Many platforms offer ad tools already—they are not perfect, but people can opt in to say that they do not want to see ads for particular types of content—but there was nothing for the types of content covered by the Online Safety Bill, where someone could say, “I want to make sure I protect myself from seeing this at all,” and then, for the more serious content, “I expect the platforms to take action to mitigate it.” So this measure is an important additional level of protection for adult users, which allows them to give themselves the certainty that they will not see certain types of content and puts an important, additional duty on the companies themselves.

Briefly, on the point about gambling, the hon. Member for Aberdeen North is quite right to say that someone can self-exclude from gambling at the betting shop, but the advertising code already requires that companies do not target people who have self-excluded with advertising messages. As the Government complete their online advertising review, which is a separate piece of work, it is important that that is effectively enforced on big platforms, such as Facebook and Google, to ensure that they do not allow companies to advertise to vulnerable users in breach of the code. However, that can be done outside the Bill.

Kirsty Blackman

My concern is not just about advertising content or stuff that is specifically considered as an advert. If someone put up a TikTok video about how to cheat an online poker system, that would not be classed as an advert and therefore would not be caught. People would still be able to see it, and could not opt out.

Damian Collins

I totally appreciate the point that the hon. Lady makes, which is a different one. For gambling, the inducement to act straightaway often comes in the form of advertising. It usually comes in the form of free bets and immediate inducements to act. People who have self-excluded should not be targeted in that way. We need to ensure that that is rigorously enforced on online platforms too.

Kim Leadbeater

It is a pleasure to serve under your chairship, Dame Angela. It is lovely to be back in a Public Bill Committee with many familiar faces—and a few new ones, including the Minister. However, after devoting many weeks earlier this year to the previous Committee, I must admit that it is with some frustration that we are back here with the Government intent on further weakening their Bill.

Throughout the passage of the Bill, I have raised a number of specific concerns, from democratic and journalistic exemptions, to age verification, recognised news publishers, advocacy bodies and media literacy. On clause 14, while I support the principles of Government amendments 15 and 16, I draw the Minister’s attention to the importance of amendment (a) to amendment 15 and amendment (a) to amendment 16. He has already said that he is sympathetic to those amendments. Let me try to convince him to turn that sympathy into action.

I will focus primarily on an issue that is extremely important to me and to many others: extremism and radicalisation. However, while I will focus on the dangers of extremism and radicalisation, be it right-wing, Islamist, incel or other, the dangers that I am about to set out—the chain of events that leads to considerable harm online—are the same for self-harm content, eating disorder content, health disinformation, climate change disinformation or any dangerous, hateful material directed at people based on their sex, sexual orientation, ethnicity, religion or other characteristics.

Such content is not just deeply offensive and often wholly inaccurate; it is dangerous and vile and serves only to spread harm, misinformation and conspiracy. To be clear, such content is not about a social media user stating how upset and angry they are about the football result, or somebody disagreeing legitimately and passionately about a political issue. It is not the normal, everyday social media content that most people see on their feeds.

This is content that is specifically, carefully and callously designed to sit just below the criminal threshold, yet that can still encourage violence, self-harm or worse. It is content used by extremists of all types that lures vulnerable people in, uses social media likes and comments to create the illusion of legitimacy and popularity, and then directly targets those most likely to be susceptible, encouraging them either to commit harm or to move on to smaller but high-harm platforms that may fall out of the scope of the Bill. This is not free speech; it is content that can act as a dangerous gateway to radicalisation and extremism. The Government know how dangerous it is because their own report from His Majesty’s Prison and Probation Service last year found:

“The Internet appears to be playing an increasingly prominent role in radicalisation processes of those convicted of extremist offences in England and Wales.”

Hon. Members will understand my deep and personal interest in this matter. Since the murder of my sister, a Member of this House, six and a half years ago by a far-right extremist, I have worked hard to bring communities and people together in the face of hatred. Some of that work has included meeting former extremists and discussing how they were radicalised. Those conversations were never easy, but what became very clear to me was that such people are not born extremists. Their radicalisation starts somewhere, and it is often somewhere that appears to be completely innocent, such as a Facebook group about issues or problems in their community, a Twitter discussion about current affairs or the state of the country, or even a page for supporters of their football team.

One day, a comment is posted that is not illegal and is not hate speech, but that references a conspiracy or a common trope. It is an ideological remark placed there to test the water. The conversation moves on and escalates. More disturbing or even violent comments start to be made. They might be accompanied by images or videos, leading those involved down a more sinister path. Nothing yet is illegal, but clearly—I hope we would all agree—it is unacceptable.

The number of contributors reduces, but a few remain. No warnings are presented, no flags are raised and it appears like normal social media content. However, the person reading it might be lonely or vulnerable, and now feels that they have found people to listen to them. They might be depressed or unhappy and looking to blame their situation on something or someone. They might feel that nobody understands them, but these people seem to.

The discussion is then taken to a more private place, to the smaller but more harmful platforms that may fall outside the scope of the Bill, but that will now become the go-to place for spreading extremism, misinformation and other harmful content. The radicalisation continues there—harder to track, harder to monitor and harder to stop. Let us remember, however, that all of that started with those legal but harmful comments being witnessed. They were clearly unacceptable, but mainstream social media give them legitimacy. The Online Safety Bill will do nothing to stop that.

Unfortunately, that chain of events occurs far too often. It is a story told many times, about how somebody vulnerable is lured in by those wishing to spread their hatred. It is hosted by major social media platforms. Hon. Members may remember the case of John, a teenager radicalised online and subsequently sentenced. His story was covered by The Guardian last year. John was feeling a sense of hopelessness, which left him susceptible to the messaging of the far right. Aged 15, he felt “written off”: he was in the bottom set at school, with zero exam expectations, and felt that his life opportunities would be dismal. The far right, however, promised him a future. John became increasingly radicalised by an online barrage of far-right disinformation. He said:

“I was relying on the far right for a job. They were saying that when they got power they would be giving jobs to people like me”.

John now says:

“Now I know the posts were all fake, but the 15-year-old me didn’t bother to fact-check.”

For some people in the room, that might seem like a totally different world. Thankfully, for most of us, it is. However, if Members take the time to see some of that stuff online, it is extremely disturbing and alarming. It is a world that we do not understand, but we have to be aware that it exists. The truth, as we can see, is that such groups use popular online platforms to lure in young people and give them a sense of community. One white nationalist group actively targets younger recruits and recently started Call of Duty warcraft gaming tournaments for its supporters. Let us be clear: John was 15, but he could easily have been 18, 19 or indeed significantly older.

John was radicalised by the far right, but we know that similar methods are used by Islamist extremists. A 2020 report from New York University’s Centre for Global Affairs stated:

“The age of social media has allowed ISIS to connect with a large-scale global audience that it would not be able to reach without it...Through strategic targeting, ISIS selects those who are most vulnerable and susceptible to radicalization”.

That includes those who are

“searching for meaning or purpose in their life, feeling anger and…alienated from society”.

The ages that are most vulnerable are 15 to 25.

Social media platforms allow ISIS to present its propaganda as mainstream news at little to no cost. Preventing that harm and breaking those chains of radicalisation is, however, possible, and the Bill could go much further to put the responsibility not on the user, but on the platforms. I believe that those platforms need unique regulation, because social media interaction is fundamentally different from real-life social interaction.

Social media presents content to us as if it is the only voice and viewpoint. On social media, people are far more likely to say things that they never would in person. On social media, those views spread like wildfire in a way that they would not in real life. On social media, algorithms find such content and pump it towards us, in a way that can become overwhelming and that can provide validity and reassurance where doubt might otherwise set in.

Allowing that content to remain online without warnings, or allowing it to be visible to all users unless they go searching through their settings to turn it off—which is wholly unrealistic—is a dereliction of duty and a missed opportunity to clean up the platforms and break the chains of radicalisation. As I set out, the chain of events is not unique to one form of radicalisation or hateful content. The same online algorithms that present extremist content to users also promote negative body image, eating disorders, and self-harm and suicide content.

I hope the Committee realises why I am so impassioned about “legal but harmful” clauses, and why I am particularly upset that a few Conservative Members appear to believe that such content should remain unchecked online because of free speech, with full knowledge that it is exactly that content that serves as the gateway for people to self-harm and to be radicalised. That is not free speech.

--- Later in debate ---
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I am seeking to impose new duties on category 1 services to ensure that they are held accountable to their terms of service and to protect free speech. Under the status quo, companies get to decide what we do and do not see online. They can arbitrarily ban users or remove their content without offering any form of due process and with very few avenues for users to achieve effective redress. On the other hand, companies’ terms of service are often poorly enforced, if at all.

I have mentioned before the horrendous abuse suffered by footballers around the Euro 2020 final, despite most platforms’ terms and conditions clearly not allowing that sort of content. There are countless similar instances, for example relating to antisemitic abuse—as we have heard—and other forms of hate speech, which fall below the criminal threshold.

This group of amendments relates to a series of new duties that will fundamentally reset the relationship between platforms and their users. The duties will prevent services from arbitrarily removing content or suspending users without offering users proper avenues to appeal. At the same time, they will stop companies making empty promises to their users about their terms of service. The duties will ensure that where companies say they will remove content or ban a user, they actually do.

Government new clause 3 is focused on protecting free speech. It would require providers of category 1 services to remove or restrict access to content, or ban or suspend users, only where this is consistent with their terms of service. Ofcom will oversee companies’ systems and processes for discharging those duties, rather than supervising individual decisions.

Damian Collins Portrait Damian Collins
- Hansard - -

I am grateful for what the Minister has said, and glad that Ofcom will have a role in seeing that companies do not remove content that is not in breach of terms of service where there is no legal requirement to do so. In other areas of the Bill where these duties exist, risk assessments are to be conducted and codes of practice are in place. Will there similarly be risk assessments and codes of practice to ensure that companies comply with their freedom of speech obligations?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

Absolutely. As I say, it is really important that people understand right at the beginning, through risk assessments, what they are signing up for and what they can expect. To come back to the point of whether someone is an adult or a child, it is really important that parents lean in when it comes to children’s protections; that is a very important tool in the armoury.

New clause 4 will require providers of category 1 services to ensure that what their terms of service say about their content moderation policies is clear and accessible. Those terms have to be easy for users to understand and should contain sufficient detail, so that users know what to expect in relation to moderation actions. Providers of category 1 services must apply their terms of service consistently, and they must have in place systems and processes that enable them to enforce their terms of service consistently.

These duties will give users the ability to report any content or account that they suspect does not meet a platform’s terms of service. They will also give users the ability to make complaints about platforms’ moderation actions, and raise concerns if their content is removed in error. Providers will be required to take appropriate action in response to complaints. That could include removing content that they prohibit, or reinstating content removed in error. These duties ensure that providers are made aware of issues to do with their services and require them to take action to resolve them, to keep users safe, and to uphold users’ rights to free speech.

The duties set out in new clauses 3 and 4 will not apply to illegal content, content that is harmful to children or consumer content. That is because illegal content and content that is harmful to children are covered by existing duties in the Bill, and consumer content is already regulated under consumer protection legislation. Companies will also be able to remove any content where they have a legal obligation to do so, or where the user is committing a criminal offence, even if that is not covered in their terms of service.

New clause 5 will require Ofcom to publish guidance to help providers of category 1 services to understand what they need to do to comply with their new duties. That could include guidance on how to make their terms of service clear and easy for users to understand, and how to operate an effective reporting and redress mechanism. The guidance will not prescribe what types of content companies should include in their terms of service, or how they should treat such content. That will be for companies to decide, based on their knowledge of their users, and their brand and commercial incentives, and subject to their other legal obligations.

New clause 6 clarifies terms used in new clauses 3 and 4. It also includes a definition of “Consumer content”, which is excluded from the main duties in new clauses 3 and 4. This covers content that is already regulated by the Competition and Markets Authority and other consumer protection bodies, such as content that breaches the Consumer Protection from Unfair Trading Regulations 2008. These definitions are needed to provide clarity to companies seeking to comply with the duties set out in new clauses 3 and 4.

The remaining amendments to other provisions in the Bill are consequential on the insertion of these new transparency, accountability and free speech duties. They insert references to the new duties in, for example, the provisions about content reporting, enforcement, transparency and reviewing compliance. That will ensure that the duties apply properly to the new measure.

Amendment 30 removes the duty on platforms to include clear and accessible provisions in their terms of service informing users that they have a right of action in court for breach of contract if a platform removes or restricts access to their content in violation of its terms of service. This is so that the duty can be moved to new clause 4, which focuses on ensuring that platforms comply with their terms of service. The replacement duty in new clause 4 will go further than the original duty, in that it will cover suspensions and bans of users as well as restrictions on content.

Amendments 46 and 47 impose a new duty on Ofcom to have regard to the need for it to be clear to providers of category 1 services what they must do to comply with their new duties. These amendments will also require Ofcom to have regard to the extent to which providers of category 1 services are demonstrating, in a transparent and accountable way, how they are complying with their new duties.

Lastly, amendment 95 temporarily exempts video-sharing platforms that are category 1 services from the new terms of service duties, as set out in new clauses 3 and 4, until the Secretary of State agrees that the Online Safety Bill is sufficiently implemented. This approach simultaneously maximises user protections by the temporary continuation of the VSP regime and minimises burdens for services and Ofcom. The changes are central to the Government’s intention to hold companies accountable for their promises. They will protect users in a way that is in line with companies’ terms of service. They are a critical part of the triple shield, which aims to protect adults online. It ensures that users are safe by requiring companies to remove illegal content, enforce their terms of service and provide users with tools to control their online experiences. Equally, these changes prevent arbitrary or random content removal, which helps to protect pluralistic and robust debate online. For those reasons, I hope that Members can support the amendments.

--- Later in debate ---
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I will have a go at that, but I am happy to write to the hon. Lady if I do not respond as fully as she wants. Down-ranking content is a moderation action, as she says, but it is not always done just to restrict access to content; there are many reasons why people might want to do it. Through these changes, we are saying that the content is not actually being restricted; it can still be seen if it is searched for or otherwise encountered. That is consistent with the clarification.

Damian Collins Portrait Damian Collins
- Hansard - -

This is quite an important point. The hon. Member for Aberdeen North was talking about recommendation systems. If a platform chooses not to amplify content, that is presumably not covered. As long as the content is accessible, someone could search and find it. That does not inhibit a platform’s decision, for policy reasons or whatever, not to actively promote it.

--- Later in debate ---
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I will come back to some of the earlier points. At the end of the day, when platforms change their terms and conditions, which they are free to do, they will be judged by their users and indeed the advertisers from whom they make their money. There are market forces—I will use that phrase as well as “commercial imperative”, to get that one in there—that will drive behaviour. It may be the usability of Facebook, or Twitter’s terms and conditions and the approach of its new owner, that will drive users to alternative platforms. I am old enough to remember Myspace, CompuServe and AOL, which tried to box people into their walled gardens. What happened to them? Only yesterday, someone from Google was saying that the new artificial intelligence chatbot—ChatGPT—may well disrupt Google. These companies, as big as they are, do not have a right to exist. They have to keep innovating. If they get it wrong, then they get it wrong.

Damian Collins Portrait Damian Collins
- Hansard - -

Does my hon. Friend agree that this is why the Bill is structured in the way it is? We have a wide range of priority illegal offences against which companies have to act, so it is not down to Elon Musk to determine whether he has a policy on race hate. They have to meet the legal standards set, and that is why it is so important to have that wide range of priority illegal offences. If companies go beyond that and have higher safety standards in their terms of service, that is checked as well. However, a company cannot avoid its obligations simply by changing its terms of service.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

My hon. Friend is absolutely right. We are putting in those protections, but we want companies to have due regard to freedom of speech.

I want to clarify a point that my hon. Friend made earlier about guidance on the new accountability, transparency and free speech duties. Companies will be free to set any terms of service that they want to, subject to their other legal obligations. That is related to the conversations that we have just been having. Those duties require companies to enforce their terms of service properly, and not to remove content or ban users except in accordance with those terms. There will be no platform risk assessments or codes of practice associated with those new duties. Instead, Ofcom will issue guidance on how companies can comply with their duties rather than codes of practice. That guidance will focus on how companies set their terms of service, but companies will not be required to set terms directly for specific types of content or to cover particular risks. I hope that is clear.

To answer the point made by the hon. Member for Pontypridd, I agree with the overall sentiment about how we need to protect freedom of expression.

Damian Collins Portrait Damian Collins
- Hansard - -

I want to be clear on my point. My question was not related to how platforms set their terms of service, which is a matter for them and they are held to account for that. If we are now bringing in requirements to say that companies cannot go beyond terms of service or their duties in the Bill if they are going to moderate content, who will oversee that? Will Ofcom have a role in checking whether platforms are over-moderating, as the Minister referred to earlier? In that case, where those duties exist elsewhere in the Bill, we have codes of practice in place to make sure it is clear what companies should and should not do. We do not seem to be doing that with this issue.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

Absolutely. We have captured that in other parts of the Bill, but I wanted to make that specific bit clear because I am not sure whether I understood or answered my hon. Friend’s question correctly at the time.

Question put and agreed to.

Clause 20, as amended, accordingly ordered to stand part of the Bill.

Clause 21

Record-keeping and review duties

Amendments made: 32, in clause 21, page 23, line 5, leave out “, 10 or 12” and insert “or 10”.

This amendment is consequential on Amendment 6 (removal of clause 12).

Amendment 33, in clause 21, page 23, line 45, leave out paragraph (c).

This amendment is consequential on Amendment 7 (removal of clause 13).

Amendment 34, in clause 21, page 24, line 6, leave out “section” and insert “sections”.

This amendment is consequential on Amendment 35.

Amendment 35, in clause 21, page 24, line 6, at end insert—

“, (Duty not to act against users except in accordance with terms of service) and (Further duties about terms of service) (duties about terms of service).”—(Paul Scully.)

This amendment ensures that providers have a duty to review compliance with the duties set out in NC3 and NC4 regularly, and after making any significant change to the design or operation of the service.

Question proposed, That the clause, as amended, stand part of the Bill.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

We have seen that just from the people from external organisations who have contacted us about the Bill. The expertise that they have brought to the table, which we ourselves do not have, has significantly improved the debate and hopefully the Bill, even prior to the consultations that have happened, which encouraged the Minister to make the Bill better. Surely that is why the pre-legislative scrutiny Committee looked at the Bill—in order to improve it and to get expert advice. I still think that having specific access to expertise in order to analyse the transparency report has not been covered adequately.

Damian Collins Portrait Damian Collins
- Hansard - -

Annual transparency reporting is an important part of how the system will work. Transparency is one of the most important aspects of how the Online Safety Bill works, because without it companies can hide behind the transparency reports they produce at the moment, which give no real transparency at all. For example, Facebook and YouTube report annually that their AI finds 95% of the hate speech they remove, but Frances Haugen said that they removed only 5% of the hate speech on their platforms. So what the transparency reports really show is that they find 95% of the 5% they actually remove, and that is one of the fundamental problems. The Bill gives the regulator the power to know, and the regulator then has to make informed decisions based on the information it has access to.

--- Later in debate ---
Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

As much as I am keen on the idea of Ofcom special agents conceptually, my concern on the transparency front is that, to appoint a special agent and send them in to look at the data, Ofcom would have to have cause to believe that there was an issue of concern with the data, whereas if that data is more transparently available to the research community, they can then proactively identify things that they can flag to Ofcom as a concern. Without that, we are relying on an annual cycle of Ofcom being able to intervene only when they have a concern, rather than the research community, which is much better placed to make that determination, being able to keep a watching brief on the company.

Damian Collins Portrait Damian Collins
- Hansard - -

That concern would be triggered by Ofcom discovering things as a consequence of user complaint. Although Ofcom is not a complaint resolution company, users can complain to it. Independent academics and researchers may produce studies and reports highlighting problems at any time, so Ofcom does not have to wait through an annual cycle of transparency reporting. At any time, Ofcom can say, “We want to have a deeper look at this problem.” It could be something Ofcom or someone else has discovered, and Ofcom can either research that itself or appoint an outside expert.

As the hon. Member for Warrington North mentioned, very sensitive information might become apparent through the transparency reporting that one might not necessarily wish to make public because it requires further investigation and could highlight a particular flaw that could be exploited by bad actors. I would hope and expect, as I think we all would, that we would have the routine publication of transparency reporting to give people assurance that the platforms are meeting their obligations. Indeed, if Ofcom were to intervene against a platform, it would probably use information gathered and received to provide the rationale for why a fine has been issued or another intervention has been made. I am sure that Ofcom will draw all the time on information gathered through transparency reporting and, where relevant, share it.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

This has been a helpful debate. Everyone was right that transparency must be and is at the heart of the Bill. From when we were talking earlier today about how risk assessments and terms of service must be accessible to all, through to this transparency reporting section, it is important that we hold companies to account and that the reports play a key role in allowing users, Ofcom and civil society, including those in academia, to understand the steps that companies are taking to protect users.

Under clause 65, category 1 services, category 2A search services and category 2B user-to-user services need to publish transparency reports annually in accordance with the transparency report notice from Ofcom. That relates to the points about commerciality that my hon. Friend the Member for Folkestone and Hythe talked about. Ofcom will set out what information is required from companies in their notice, which will also specify the format, manner and deadline for the information to be provided to Ofcom. Clearly, it would not be proportionate to require every service provider within the scope of the overall regulatory framework to produce a transparency report—it is also important that we deal with capacity and proportionality—but those category threshold conditions will ensure that the framework is flexible and future-proofed.

ONLINE SAFETY BILL (First sitting)

Damian Collins Excerpts
Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 13 December 2022
Rachel Maclean Portrait Rachel Maclean (Redditch) (Con)
- Hansard - - - Excerpts

I wish to add some brief words in support of the Government’s proposals and to build on the comments from Members of all parties.

We know that access to extreme and abusive pornography is a direct factor in violence against women and girls. We see that play out in the court system every day. People on trial claim to have watched and become addicted to this type of pornography, and to have sought to play it out in their relationships, which has resulted in the deaths of women. The platforms already have technology that allows them to figure out the age of people on their platforms. The Bill seeks to ensure that they use that for a good end, so I thoroughly support it. I thank the Minister.

Damian Collins Portrait Damian Collins (Folkestone and Hythe) (Con)
- Hansard - -

There are two very important and distinct issues here. One is age verification. The platforms ask adults who have identification to verify their age; if they cannot verify their age, they cannot access the service. Platforms have a choice within that. They can design their service so that it does not have adult content, in which case they may not need to build in verification systems—the platform polices itself. However, a platform such as Twitter, which allows adult content on an app that is open to children, has to build in those systems. As the hon. Member for Aberdeen North mentioned, people will also have to verify their identity to access a service such as OnlyFans, which is an adult-only service.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

On that specific point, I searched on Twitter for the name—first name and surname—of a politician to see what people had been saying, because I knew that he was in the news. The pictures that I saw! That was purely by searching for the name of the politician; it is not as though people are necessarily seeking such stuff out.

Damian Collins Portrait Damian Collins
- Hansard - -

On these platforms, the age verification requirements are clear: they must age-gate the adult content or get rid of it. They must do one or the other. Rightly, the Bill does not specify technologies. Technologies are available. The point is that a company must demonstrate that it is using an existing and available technology or that it has some other policy in place to remedy the issue. It has a choice, but it cannot do nothing. It cannot say that it does not have a policy on it.

Age assurance is always more difficult for children, because they do not have the same sort of ID that adults have. However, technologies exist: for instance, Yoti uses facial scanning. Companies do not have to do that either; they have to demonstrate that they do something beyond self-certification at the point of signing up. That is right. Companies may also demonstrate what they do to take robust action to close the accounts of children they have identified on their platforms.

If a company’s terms of service state that people must be 13 or over to use the platform, the company is inherently stating that the platform is not safe for someone under 13. What does it do to identify people who sign up? What does it do to identify people once they are on the platform, and what action does it then take? The Bill gives Ofcom the powers to understand those things and to force a change of behaviour and action. That is why—to the point made by the hon. Member for Pontypridd—age assurance is a slightly broader term, but companies can still extract a lot of information to determine the likely age of a child and take the appropriate action.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I think we are all in agreement, and I hope that the Committee will accept the amendments.

Amendment 1 agreed to.

Amendments made: 2, in clause 11, page 10, line 25, leave out

“(for example, by using age assurance)”.

This amendment omits words which are no longer necessary in subsection (3)(b) of clause 11 because they are dealt with by the new subsection inserted by Amendment 3.

Amendment 3, in clause 11, page 10, line 26, at end insert—

“(3A) Age assurance to identify who is a child user or which age group a child user is in is an example of a measure which may be taken or used (among others) for the purpose of compliance with a duty set out in subsection (2) or (3).”—(Paul Scully.)

This amendment makes it clear that age assurance measures may be used to comply with duties in clause 11(2) as well as (3) (safety duties protecting children).

--- Later in debate ---
Damian Collins Portrait Damian Collins
- Hansard - -

Does the hon. Lady accept that the amendments would give people control over the bit of the service that they do not currently have control of? A user can choose what to search for and which members to engage with, and can block people. What they cannot do is stop the recommendation feeds recommending things to them. The shields intervene there, which gives user protection, enabling them to say, “I don’t want this sort of content recommended to me. On other things, I can either not search for them, or I can block and report offensive users.” Does she accept that that is what the amendment achieves?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I think that that is what the clause achieves, rather than the amendments that I have tabled. I recognise that the clause achieves that, and I have no concerns about it. It is good that the clause does that; my concern is that it does not take the second step of blocking access to certain features on the platform. For example, somebody could be having a great time on Instagram looking at various people’s pictures or whatever, but they may not want to be bombarded with private messages. They have no ability to turn off the private messaging section.

Damian Collins Portrait Damian Collins
- Hansard - -

They can disengage from the user who is sending the messages. On a Meta platform, those messages will often be from someone they are following or engaging with. They can block them, and the platforms have the ability, in most in-app messaging services, to see whether somebody is sending priority illegal content to other users. They can scan for that and mitigate it as well.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

That is exactly why users should be able to block private messaging in general. Someone on Twitter can say, “I’m not going to receive a direct message from anybody I don’t follow.” Twitter users have the opportunity to do that, but there is not necessarily that opportunity on all platforms. We are asking for those things to be included, so that the provider can say, “You’re using private messaging inappropriately. Therefore, we are blocking all your access to private messaging,” or, “You are being harmed as a result of accessing private messaging. Therefore, we are blocking your access to any private messaging. You can still see pictures on Instagram, but you can no longer receive any private messages, because we are blocking your access to that part of the site.” That is very different from blocking a user’s access to certain kinds of content, for example. I agree that that should happen, but it is about the functionalities and stopping access to some of them.

We are not asking Ofcom to mandate that platforms take this measure; they could still take the slightly more nuclear option of banning somebody entirely from their service. However, if this option is included, we could say, “Your service is doing pretty well, but we know there is an issue with private messaging. Could you please take action to ensure that those people who are using private messaging to harm children no longer have access to private messaging and are no longer able to use the part of the service that enables them to do these things?” Somebody might be doing a great job of making games in Roblox, but they may be saying inappropriate things. It may be proportionate to block that person entirely, but it may be more proportionate to block their access to voice chat, so that they can no longer say those things, or direct message or contact anybody. It is about proportionality and recognising that the service is not necessarily inherently harmful but that specific parts of it could be.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I completely agree. The hon. Member put that much better than I could. I was trying to formulate that point in my head, but had not quite got there, so I appreciate her intervention. She is right: we should not put the onus on a victim to deal with a situation. Once they have seen a message from someone, they can absolutely block that person, but that person could create another account and send them messages again. People should be able to choose, and to say, “No, I don’t want anyone to be able to send me private messages,” or “I don’t want any private messages from anyone I don’t know.” We could put in those safeguards.

I am talking about adding another layer to the clause, so that companies would not necessarily have to demonstrate that it was proportionate to ban a person from using their service, as that may be too high a bar—a concern I will come to later. They could, however, demonstrate that it was proportionate to ban a person from using private messaging services, or from accessing livestreaming features. There has been a massive increase in self-generated child sexual abuse images, and a huge amount has come from livestreaming. There are massive risks with livestreaming features on services.

Livestreaming is not always bad. Someone could livestream themselves showing how to make pancakes. There is no issue with that—that is grand—but livestreaming is being used by bad actors to manipulate children into sharing videos of themselves, and once they are on the internet, they are there forever. It cannot be undone. If we were able to ban vulnerable users—my preferred option would be all children—from accessing livestreaming services, they would be much safer.

Damian Collins Portrait Damian Collins
- Hansard - -

The hon. Lady is talking about extremely serious matters. My expectation is that Ofcom would look at all of a platform’s features when risk-assessing the platform and enforcing safety, and in-app messaging services would not be exempt. Platforms have to demonstrate what they would do to mitigate harmful and abusive behaviour, and that they would take action against the accounts responsible.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Absolutely, I agree, but the problem is with the way the Bill is written. It does not suggest that a platform could stop somebody accessing a certain part of a service. The Bill refers to content, and to the service as a whole, but it does not have that middle point that I am talking about.

Damian Collins Portrait Damian Collins
- Hansard - -

A platform is required to demonstrate to Ofcom what it would do to mitigate activity that would breach the safety duties. It could do that through a feature that it builds in, or it may take a more draconian stance and say, “Rather than turning off certain features, we will just suspend the account altogether.” That could be discussed in the risk assessments, and agreed in the codes of practice.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

What I am saying is that the clause does not actually allow that middle step. It does not explicitly say that somebody could be stopped from accessing private messaging. The only options are being banned from certain content, or being banned from the entire platform.

I absolutely recognise the hard work that Ofcom has done, and I recognise that it will work very hard to ensure that risks are mitigated, but the amendment ensures what the Minister intended with this legislation. I am not convinced that he intended there to be just the two options that I outlined. I think he intended something more in line with what I am suggesting in the amendment. It would be very helpful if the Minister explicitly said something in this Committee that makes it clear that Ofcom has the power to say to platforms, “Your risk assessment says that there is a real risk from private messaging”—or from livestreaming—“so why don’t you turn that off for all users under 18?” Ofcom should be able to do that.

Could the Minister be clear that that is the direction of travel he is hoping and intending that Ofcom will take? If he could be clear on that, and will recognise that the clause could have been slightly better written to ensure Ofcom had that power, I would be quite happy to not push the amendment to a vote. Will the Minister be clear about the direction he hopes will be taken?

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Absolutely. The amendment I tabled regarding the accessibility of terms of service was designed to ensure that if the Government rely on terms of service, children can access those terms of service and are able to see what risks they are putting themselves at. We know that in reality children will not read these things. Adults do not read these things. I do not know what Twitter’s terms of service say, but I do know that Twitter managed to change its terms of service overnight, very easily and quickly. Companies could just say, “I’m a bit fed up with Ofcom breathing down my neck on this. I’m just going to change my terms of service, so that Ofcom will not take action on some of the egregious harm that has been done. If we just change our terms of service, we don’t need to bother. If we say that we are not going to ban transphobia on our platform—if we take that out of the terms of service—we do not need to worry about transphobia on our platform. We can just let it happen, because it is not in our terms of service.”

Damian Collins Portrait Damian Collins
- Hansard - -

Does the hon. Lady agree that the Government are not relying solely on terms of service, but are rightly saying, “If you say in your terms of service that this is what you will do, Ofcom will make sure that you do it”? Ofcom will take on that responsibility for people, making sure that these complex terms of service are understood and enforced, but the companies still have to meet all the priority illegal harms objectives that are set out in the legislation. Offences that exist in law are still enforced on platforms, and risk-assessed by Ofcom as well, so if a company does not have a policy on race hate, we have a law on race hate, and that will apply.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

It is absolutely the case that those companies still have to do a risk assessment, and a child risk assessment if they meet the relevant criteria. The largest platforms, for example, will still have to do a significant amount of work on risk assessments. However, every time a Minister stands up and talks about what they are requiring platforms and companies to do, they say, “Companies must stick to their terms of service. They must ensure that they enforce things in line with their terms of service.” If a company is finding it too difficult, it will just take the tough things out of its terms of service. It will take out transphobia, it will take out abuse. Twitter does not ban anyone for abuse anyway, it seems, but it will be easier for Twitter to say, “Ofcom is going to try to hold us to account for the fact that we are not getting rid of people for abusive but not illegal messages, even though we say in our terms of service, ‘You must act with respect’, or ‘You must not abuse other users’. We will just take that out of our terms of service so that we are not held to account for the fact that we are not following our terms of service.” Then, because the abuse is not illegal—because it does not meet that bar—those places will end up being even less safe than they are right now.

For example, occasionally Twitter does act in line with its terms of service, which is quite nice: it does ban people who are behaving inappropriately, but not necessarily illegally, on its platform. However, if it is required to implement that across the board for everybody, it will be far easier for Twitter to say, “We’ve sacked all our moderators—we do not have enough people to be able to do this job—so we will just take it all out of the terms of service. The terms of service will say, ‘We will ban people for sharing illegal content, full stop.’” We will end up in a worse situation than we are currently in, so the reliance on terms of service causes me a big, big problem.

Turning to amendment 100, dealing specifically with the accessibility of this feature for child users, I appreciate the ministerial clarification, and agree that my amendment could have been better worded and potentially causes some problems. However, can the Minister talk more about the level of accessibility? I would like children to be able to see a version of the terms of service that is age-appropriate, so that they understand what is expected of them and others on the platform, and understand when and how they can make a report and how that report will be acted on. The kids who are using Discord, TikTok or YouTube are over 13—well, some of them are—so they are able to read and understand, and they want to know how to make reports and for the reporting functions to be there. One of the biggest complaints we hear from kids is that they do not know how to report things they see that are disturbing.

A requirement for children to have an understanding of how reporting functions work, particularly on social media platforms where people are interacting with each other, and of the behaviour that is expected of them, does not mean that there cannot be a more in-depth and detailed version of the terms of service, laying out potential punishments using language that children may not be able to understand. The amendment would specifically ensure that children have an understanding of that.

We want children to have a great time on the internet. There are so many ace things out there and wonderful places they can access. Lego has been in touch, for example; its website is really pretty cool. We want kids to be able to access that stuff and communicate with their friends, but we also want them to have access to features that allow them to make reports that will keep them safe. If children are making reports, then platforms will say, “Actually, there is real problem with this because we are getting loads of reports about it.” They will then be able to take action. They will be able to have proper risk assessments in place because they will be able to understand what is disturbing people and what is causing the problems.

I am glad to hear the Minister’s words. If he were even more clear about the fact that he would expect children to be able to understand and access information about keeping themselves safe on the platforms, then that would be even more helpful.

--- Later in debate ---
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

To protect free speech and remove any possibility that the Bill could cause tech companies to censor legal content, I seek to remove the so-called “legal but harmful” duties from the Bill. These duties are currently set out in clauses 12 and 13 and apply to the largest in-scope services. They require services to undertake risk assessments for defined categories of harmful but legal content, before setting and enforcing clear terms of service for each category of content.

I share the concerns raised by Members of this House and more broadly that these provisions could have a detrimental effect on freedom of expression. It is not right that the Government define what legal content they consider harmful to adults and then require platforms to risk assess for that content. Doing so may encourage companies to remove legal speech, undermining this Government’s commitment to freedom of expression. That is why these provisions must be removed.

At the same time, I recognise the undue influence that the largest platforms have over our public discourse. These companies get to decide what we do and do not see online. They can arbitrarily remove a user’s content or ban them altogether without offering any real avenues of redress to users. On the flip side, even when companies have terms of service, these are often not enforced, as we have discussed. That was the case after the Euro 2020 final where footballers were subject to the most appalling abuse, despite most platforms clearly prohibiting that. That is why I am introducing duties to improve the transparency and accountability of platforms and to protect free speech through new clauses 3 and 4. Under these duties, category 1 platforms will only be allowed to remove or restrict access to content or ban or suspend users when this is in accordance with their terms of service or where they face another legal obligation. That protects against the arbitrary removal of content.

Companies must ensure that their terms of service are consistently enforced. If companies’ terms of service say that they will remove or restrict access to content, or will ban or suspend users in certain circumstances, they must put in place proper systems and processes to apply those terms. That will close the gap between what companies say they will do and what they do in practice. Services must ensure that their terms of service are easily understandable to users and that they operate effective reporting and redress mechanisms, enabling users to raise concerns about a company’s application of the terms of service. We will debate the substance of these changes later alongside clause 18.

Clause 55 currently defines

“content that is harmful to adults”,

including

“priority content that is harmful to adults”

for the purposes of this legislation. As this concept would be removed with the removal of the adult safety duties, this clause will also need to be removed.

Damian Collins Portrait Damian Collins
- Hansard - -

My hon. Friend mentioned earlier that companies will not be able to remove content if it is not part of their safety duties or if it was not a breach of their terms of service. I want to be sure that I heard that correctly and to ask whether Ofcom will be able to risk assess that process to ensure that companies are not over-removing content.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

Absolutely. I will come on to Ofcom in a second and respond directly to his question.

The removal of clauses 12, 13 and 55 from the Bill, if agreed by the Committee, will require a series of further amendments to remove references to the adult safety duties elsewhere in the Bill. These amendments are required to ensure that the legislation is consistent and, importantly, that platforms, Ofcom and the Secretary of State are not held to requirements relating to the adult safety duties that we intend to remove from the Bill. The amendments remove requirements on platforms and Ofcom relating to the adult safety duties. That includes references to the adult safety duties in the duties to provide content reporting and redress mechanisms and to keep records. They also remove references to content that is harmful to adults from the process for designating category 1, 2A and 2B companies. The amendments in this group relate mainly to the process for the category 2B companies.

I also seek to amend the process for designating category 1 services to ensure that they are identified based on their influence over public discourse, rather than with regard to the risk of harm posed by content that is harmful to adults. These changes will be discussed when we debate the relevant amendments alongside clause 82 and schedule 11. The amendments will remove powers that will no longer be required, such as the Secretary of State’s ability to designate priority content that is harmful to adults. As I have already indicated, we intend to remove the adult safety duties and introduce new duties on category 1 services relating to transparency, accountability and freedom of expression. While they will mostly be discussed alongside clause 18, amendments 61 to 66, 68 to 70 and 74 will add references to the transparency, accountability and freedom of expression duties to schedule 8. That will ensure that Ofcom can require providers of category 1 services to give details in their annual transparency reports about how they comply with the new duties. Those amendments define relevant content and consumer content for the purposes of the schedule.

We will discuss the proposed transparency and accountability duties that will replace the adult safety duties in more detail later in the Committee’s deliberations. For the reasons I have set out, I do not believe that the current adult safety duties with their risks to freedom of expression should be retained. I therefore urge the Committee that clauses 12, 13 and 55 do not stand part and instead recommend that the Government amendments in this group are accepted.

--- Later in debate ---
Damian Collins Portrait Damian Collins
- Hansard - -

Can the hon. Lady tell me where in the Bill, as it is currently drafted—so, unamended—it requires platforms to remove legal speech?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

It allows the platforms to do that. It allows them, and requires legal but harmful stuff to be taken into account. It requires the platforms to act—to consider, through risk assessments, the harm done to adults by content that is legal but massively harmful.

Damian Collins Portrait Damian Collins
- Hansard - -

The hon. Lady is right: the Bill does not require the removal of legal speech. Platforms must take the issue into account—it can be risk assessed—but it is ultimately their decision. I think the point has been massively overstated that, somehow, previously, Ofcom had the power to strike down legal but harmful speech that was not a breach of either terms of service or the law. It never had that power.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Why do the Government now think that there is a risk to free speech? If Ofcom never had that power, if it was never an issue, why are the Government bothered about that risk—it clearly was not a risk—to free speech? If that was never a consideration, it obviously was not a risk to free speech, so I am now even more confused as to why the Government have decided that they will have to strip this measure out of the Bill because of the risk to free speech, because clearly it was not a risk in this situation. This is some of the most important stuff in the Bill for the protection of adults, and the Government are keen to remove it.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

No, I will not give way again. The change will ensure that people can absolutely say what they like online, but the damage and harm that it will cause are not balanced by the freedoms that have been won.

Damian Collins Portrait Damian Collins
- Hansard - -

As a Back-Bench Member of Parliament, I recommended that the “legal but harmful” provisions be removed from the Bill. When I chaired the Joint Committee of both Houses of Parliament that scrutinised the draft Bill, it was the unanimous recommendation of the Committee that the “legal but harmful” provisions be removed. As a Minister at the Dispatch Box, I said that I thought “legal but harmful” was a problematic term and we should not use it. The term “legal but harmful” does not exist in the Bill, and has never existed in the Bill, but it has provoked a debate that has caused huge confusion. There is a belief, which we have heard expressed in debate today, that somehow there are categories of content that Ofcom can designate for removal, whether or not they are unlawful.

During the Bill’s journey from publication in draft to where we are today, it has become more specific. Rather than our relying on general duties of care, written into the Bill are areas of priority illegal activity that the companies must proactively look for, monitor and mitigate. In the original version of the Bill, that included only terrorist content and child sexual exploitation material, but on the recommendation of the Joint Committee, the Government moved in the direction of writing into the Bill at schedule 7 offences in law that will be the priority illegal offences.

The list of offences is quite wide, and it is more comprehensive than any other such list in the world in specifying exactly what offences are in scope. There is no ambiguity for the platforms as to what offences are in scope. Stalking, harassment and inciting violence, which are all serious offences, as well as the horrible abuse a person might receive as a consequence of their race or religious beliefs, are written into the Bill as priority illegal offences.

There has to be a risk assessment of whether such content exists on platforms and what action platforms should take. They are required to carry out such a risk assessment, although that was never part of the Bill before. The “legal but harmful” provisions in some ways predate that. Changes were made; the offences were written into the Bill, risk assessments were provided for, and Parliament was invited to create new offences and write them into the Bill, if there were categories of content that had not been captured. In some ways, that creates a democratic lock that says, “If we are going to start to regulate areas of speech, what is the legal reason for doing that? Where is the legal threshold? What are the grounds for us taking that decision, if it is something that is not already covered in platforms’ terms of service?”

We are moving in that direction. We have a schedule of offences that we are writing into the Bill, and those priority illegal offences cover most of the most serious behaviour and most of the concerns raised in today’s debate. On top of that, there is a risk assessment of platforms’ terms of service. When we look at the terms of service of the companies—the major platforms we have been discussing—we see that they set a higher bar again than the priority illegal harms. On the whole, platforms do not have policies that say, “We won’t do anything about this illegal activity, race hate, incitement to violence, or promotion or glorification of terrorism.” The problem is that although they have terms of service, they do not enforce them. Therefore, we are not relying on terms of service alone. What we are saying, and what the Bill says, is that the minimum safety standards are based on the offences written into the Bill. In addition, we have risk assessment, and we have enforcement based on the terms of service.

There may be a situation in which there is a category of content that is not in breach of a platform’s terms of service and not included in the priority areas of illegal harm. It is very difficult to think of what that could be—something that is not already covered, and over which Ofcom would not have power. There is the inclusion of the new offences of promoting self-harm and suicide. That captures not just an individual piece of content, but the systematic effect of a teenager like Molly Russell—or an adult of any age—being targeted with such content. There are also new offences for cyber-flashing, and there is Zach’s law, which was discussed in the Chamber on Report. We are creating and writing into the Bill these new priority areas of illegal harm.

Freedom of speech groups’ concern was that the Government could have a secret list of extra things that they also wanted risk-assessed, rather than enforcement being clearly based either on the law or on clear terms of service. It is difficult to think of categories of harm that are not already captured in terms of service or priority areas of illegal harm, and that would be on such a list. I think that is why the change was made. For freedom of speech campaigners, there was a concern about exactly what enforcement was based on: “Is it based on the law? Is it based on terms of service? Or is it based on something else?”

I personally believed that the “legal but harmful” provisions in the Bill, as far as they existed, were not an infringement on free speech, because there was never a requirement to remove legal speech. I do not think the removal of those clauses from the Bill suddenly creates a wild west in which no enforcement will take place at all. There will be very effective enforcement based on the terms of service, and on the schedule 7 offences, which deal with the worst kinds of illegal activity; there is a broad list. The changes make it much clearer to everybody—platforms and users alike, and Ofcom—exactly what the duties are, how they are enforced and what they are based on.

For future regulation, we have to use this framework, so that we can say that when we add new offences to the scope of the legislation, they are offences that have been approved by Parliament and have gone through a proper process, and are a necessary addition because terms of service do not cover them. That is a much clearer and better structure to follow, which is why I support the Government amendments.

--- Later in debate ---
Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

I could not agree more. I suppose that is why this aspect of the Bill is so important, not just to me but to all those categories of user. I mentioned paragraphs (d) to (f), which would require platforms to assess exactly that risk. This is not about being offended. Personally, I have the skin of a rhino. People can say most things to me and I am not particularly bothered by it. My concern is where things that are said online are transposed into real-life harms. I will use myself as an example. Online, we can see antisemitic and conspiratorial content, covid misinformation, and covid misinformation that meets with antisemitism and conspiracies. When people decide that I, as a Jewish Member of Parliament, am personally responsible for George Soros putting a 5G chip in their arm, or whatever other nonsense they have become persuaded by on the internet, that is exactly the kind of thing that has meant people coming to my office armed with a knife. The kind of content that they were radicalised by on the internet led to their perpetrating a real-life, in-person harm. Thank God—Baruch Hashem—neither I nor my staff were in the office that day, but that could have ended very differently, because of the sorts of content that the Bill is meant to protect online users from.

Damian Collins Portrait Damian Collins
- Hansard - -

The hon. Lady is talking about an incredibly important issue, but the Bill covers such matters as credible threats to life, incitement to violence against an individual, and harassment and stalking—those patterns of behaviour. Those are public order offences, and they are in the Bill. I would absolutely expect companies to risk-assess for that sort of activity, and to be required by Ofcom to mitigate it. On her point about holocaust denial, first, the shield will mean that people can protect themselves from seeing such content. The further question would be whether we create new offences in law, which can then be transposed across.

Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

I accept the points that the hon. Member raised, but he is fundamentally missing the point. The categories of information and content that these people had seen and been radicalised by would not fall under the scope of public order offences or harassment. The person was not sending me harassing messages before they turned up at my office. Essentially, social media companies and other online platforms should have to take measures to mitigate the risk from categories of content that cause harm, whether or not that content is illegal and in the Bill. I am talking about what clauses 12 and 13 covered, whether we call it the “legal but harmful” category or “lawful but awful”. Whatever we name those provisions, by taking the clauses relating to the “legal but harmful” category out of the Bill, we are opening up an area of harm that already exists, that has a real-world impact, and that the Bill was meant to go some way towards addressing.

The Government’s amendments have taken out the risk assessments that need to be done. The Bill says,

“(e) the level of risk of functionalities of the service facilitating the presence or dissemination of priority content that is harmful to adults, identifying and assessing those functionalities that present higher levels of risk;

(f) the different ways in which the service is used, and the impact of such use on the level of risk of harm that might be suffered by adults;

(g) the nature, and severity, of the harm that might be suffered by adults”.

Again, the idea that we are talking about offence, and that the clauses need to be taken out to protect free speech, is fundamentally nonsense.

I have already mentioned holocaust denial, but it is also worth mentioning health-related disinformation. We have already seen real-world harms from some of the covid misinformation online. It led to people including Piers Corbyn turning up outside Parliament with a gallows, threatening to hang hon. Members for treason. Obviously, that was rightly dealt with by the police, but the kind of information and misinformation that he had been getting online and that led him to do that, which is legal but harmful, will now not be covered by the Bill.

I will also raise an issue I have heard about from a number of people dealing with cancer and conditions such as multiple sclerosis. People online try to discourage them from accessing the proper medical interventions for their illnesses, and instead encourage them to take more vitamin B or adopt a vegan diet. There are people who have died because they had cancer but were encouraged online not to access cancer treatment—because they were exposed to this lawful but awful category of harm.

Online Safety Bill

Damian Collins Excerpts
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

The Bill is very specific with regard to encryption; this provision will cover solely CSEA and terrorism. It is important that we do not encroach on privacy.

Damian Collins Portrait Damian Collins (Folkestone and Hythe) (Con)
- Hansard - -

I welcome my hon. Friend to his position. Under the Bill, is it not the case that if a company refuses to use existing technologies, that will be a failure of the regulatory duties placed on that company? Companies will be required to demonstrate which technology they will use and will have to use one that is available. On encrypted messaging, is it not the case that companies already gather large amounts of information about websites that people visit before and after they send a message that could be hugely valuable to law enforcement?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

My hon. Friend is absolutely right. Not only is it incumbent on companies to use that technology should it exist; if they hamper Ofcom’s inquiries by not sharing information about what they are doing, what they find and which technologies they are not using, that will attract criminal liability under the Bill.

--- Later in debate ---
John Nicolson Portrait John Nicolson
- View Speech - Hansard - - - Excerpts

I rise to speak to the amendments in my name and those of my right hon. and hon. Friends, which of course I support.

It is welcome to see the Online Safety Bill back in the House. As we have debated this Bill and nursed it, as in my case, through both the Bill Committee and the Joint Committee, we have shone a light into some dark corners and heard some deeply harrowing stories. Who can forget the testimony given to us by Molly Russell’s dad, Ian? As we have heard, in the Public Gallery we have bereaved families who have experienced the most profound losses due to the extreme online harms to which their loved ones have been exposed; representatives of those families are watching the proceedings today. The hon. Member for Pontypridd (Alex Davies-Jones) mentioned that Ian is here, but let me mention the names of the children. Amanda and Stuart Stephens are here, and they are the parents of Olly; Andy and Judy Thomas are here, and they are the parents of Frankie; and Lorin LaFave, the mother of Breck, is here, as is Ruth Moss, the mother of Sophie. All have lost children in connection with online harms, and I extend to each our most sincere condolences, as I am sure does every Member of the House. We have thought of them time and time again during the passage of this legislation; we have thought about their pain. All of us hope that this Bill will make very real changes, and we keep in our hearts the memories of those children and other young people who have suffered.

In our debates and Committee hearings, we have done our best to harry the social media companies and some of their secretive bosses. They have often been hiding away on the west coast of the US, to emerge blinking into the gloomy Committee light when they have to answer some questions about their nefarious activities and their obvious lack of concern for the way in which children and others are impacted.

We have debated issues of concern and sometimes disagreement in a way that shows the occasional benefits of cross-House co-operation. I have been pleased to work with friends and colleagues in other parties at every stage of the Bill, not least on Zach’s law, which we have mentioned. The result is a basis of good, much-needed legislation, and we must now get it on to the statute book.

It is unfortunate that the Bill has been so long delayed, which has caused great stress to some people who have been deeply affected by the issues raised—so much so that they have sometimes doubted our good faith. These delays are not immaterial. Children and young teenagers have grown older in an online world full of self-harm content—soon to be illegal harms, we hope. It is a world full of easy-to-access pornography with no meaningful age verification, and algorithms that provide harmful content to vulnerable people.

I have been pleased to note that calls from Members on the SNP Benches and from across the House to ensure that specific protection is granted to women and girls online have been heeded. New communications offences covering cyber-flashing, intimate image abuse and similar behaviour are to be incorporated. The requirements for Ofcom to consult the Victims’ Commissioner and the Domestic Abuse Commissioner are very welcome. Reporting tools should also be more responsive.

New clause 28 is an important new clause that SNP Members have been proud to sponsor. It calls for an advocacy body to represent the interests of children. That is vital, because the online world that children experience is ever evolving. It is not the online world that we in this Chamber tend to experience, nor is it the one experienced by most members of the media covering the debate today. We need, and young people deserve, a dedicated and appropriately funded body to look out for them online—a strong, informed voice able to stand up to the representations of big tech in the name of young people. This will, we hope, ensure that regulators get it right when acting on behalf of children online.

I am aware that there is broad support for such a body, including from those on the Labour Benches. We on the SNP Benches oppose the removal of the aspect of the Bill related to legal but harmful material. I understand the free speech arguments, and I have heard Ministers argue that the Government have proposed alternative approaches, which, they say, will give users control over the content that they see online. But adults are often vulnerable, too. Removing measures from the Bill that can protect adults, especially those in a mental health spiral or with additional learning needs, is a dereliction of our duty. An on/off toggle for harmful content is a poor substitute for what was originally proposed.

The legal but harmful discussion was and is a thorny one. It was important to get the language of the Bill right, so that people could be protected from harm online without impinging on freedom of expression, which we all hold dear. However, by sending aspects of the Bill back to Committee, with the intention of removing the legal but harmful provisions, I fear that the Government are simply running from a difficult debate, or worse, succumbing to those who have never really supported this Bill—some who rather approve of the wild west, free-for-all internet. It is much better to rise to the challenge of resolving the conflicts, such as they are, between free speech and legal but harmful. I accept that the Government’s proposals around greater clarity and enforcement of terms and conditions and of transparency in reporting to Ofcom offer some mitigation, but not, in my view, enough.

Damian Collins Portrait Damian Collins
- Hansard - -

The hon. Gentleman will remember that, when we served on the Joint Committee that scrutinised the draft Bill, we were concerned that the term “legal but harmful” was problematic and that there was a lack of clarity. We thought it would be better to have more clarity and enforcement based on priority illegal offences and on the terms of service. Does he still believe that, or has he changed his mind?

John Nicolson Portrait John Nicolson
- Hansard - - - Excerpts

It is a fine debate. Like so much in legislation, there is not an absolute right and an absolute wrong. We heard contradictory evidence. It is important to measure the advantages and the disadvantages. I will listen to the rest of the debate very carefully, as I have done throughout.

As a journalist in a previous life, I have long been a proponent of transparency and open democracy—something that occasionally gets me into trouble. We on the SNP Benches have argued from the outset that the powers proposed for the Secretary of State are far too expansive and wide-reaching. That is no disrespect to the Minister or the new Secretary of State, but they will know that there have been quite a few Culture Secretaries in recent years, some more temperate than others.

In wishing to see a diminution of the powers proposed we find ourselves in good company, not least with Ofcom. I note that there have been some positive shifts in the proposals around the powers of the Secretary of State, allowing greater parliamentary oversight. I hope that these indicate a welcome acknowledgement that our arguments have fallen on fertile Government soil—although, of course, it could be that the Conservative Secretary of State realises that she may soon be the shadow Secretary of State and that it will be a Labour Secretary of State exercising the proposed powers. I hope she will forgive me for that moment’s cynicism.

--- Later in debate ---
Priti Patel Portrait Priti Patel (Witham) (Con)
- View Speech - Hansard - - - Excerpts

Before I speak to specific clauses I pay tribute to all the campaigners, particularly the families who have campaigned so hard to give their loved ones a voice through this Bill and to change our laws. Having had some prior involvement in the early stages of this Bill three years ago as Home Secretary, I also pay tribute to many of the officials and Members of this House on both sides who have worked assiduously on the construction, development and advancement of this Bill. In particular, I pay tribute to my hon. Friend the Member for Folkestone and Hythe (Damian Collins) and the work of the Joint Committee; when I was Home Secretary we had many discussions about this important work. I also thank the Minister for the assiduous way in which he has handled interventions and actually furthered the debate with this Bill. Many Government Departments have been closely involved and engaged in this work.

The victims must be at the heart of everything that we do now to provide safeguards and protections. Children and individuals have lost their lives because of the online space. We know there is a great deal of good in the online space, but also a great deal of harm, and that must unite us all in delivering this legislation. We have waited a long time for this Bill, but we must come together, knowing that this is foundational legislation, which will have to be improved and developed alongside the technology, and that there is much more work to do.

I start by focusing on a couple of the new clauses, beginning with Government new clause 11 on end-to-end encryption. Given my background, the House will not be surprised that I have dealt extensively with end-to-end encryption—particularly with the harmful content, and the types of individuals and perpetrators, that hide behind it. We must acknowledge the individuals who harm children or who peddle terrorist content through end-to-end encryption, while recognising that encryption services are important to protect privacy.

There is great justification for encryption—business transactions, working for the Government and all sorts of areas of importance—but we must acknowledge in this House that there is more work to do, because these services are being used by those who would do harm to our country, threaten our national interest or threaten the safety of young people and children in particular. We know for a fact that there are sick-minded individuals who seek to abuse and exploit children and vulnerable adults. The Minister will know that, and I am afraid that many of us do. I speak now as a constituency Member of Parliament, and one of my first surgery cases back in 2010 was the sad and tragic case of a mother who came to see me because her son had accessed all sorts of content. Thanks to the Bill, that content will now be ruled as harmful. There were other services associated with that access which the family could not see and could not get access to; encrypted platforms are part of that.

There are shocking figures, and I suspect that many of my colleagues in the House will be aware of them. Almost 100,000 reports relating to online child abuse were received by UK enforcement agencies in 2021 alone. That is shocking. The House will recognise my experience of working with the National Crime Agency, to which we must pay tribute for its work in this space, as we should to law enforcement more widely. Police officers and all sorts of individuals in law enforcement are, day in, day out, investigating these cases and looking at some of the most appalling images and content, all in the name of protecting vulnerable children, and we must pay tribute to them as well.

It is also really shocking that that figure of 100,000 reports in 2021 alone is a 29% increase on the previous year. The amount of disturbing content is going up and up, and we are, I am afraid, looking only at the tip of the iceberg. So, I think it is absolutely right—and I will always urge the Government and whichever Secretary of State, be they in the Home Office, DCMS or the MOJ—to put the right measures and powers in place so that we act to prevent child sexual abuse and exploitation, prevent terrorist content from being shielded behind the platforms of encryption and, importantly, bring those involved to face justice. End-to-end encryption is one thing, but we need end-to-end justice for victims and the prevention of the most heinous crimes.

This is where we, as a House, must come together. I commend the hon. Member for Rotherham (Sarah Champion) in particular for her work relating to girls, everything to do with the grooming gangs, and the most appalling crimes against individuals, quite frankly. I will always urge colleagues to support the Bill, on which we will need to build going forward.

I think I can speak with experience about the difficulties in drafting legislation—both more broadly and specifically in this area, which is complex and challenging. It is hard to foresee the multiplicity of circumstances. My hon. Friend the Member for Folkestone and Hythe was absolutely right to say in his comments to the SNP spokesman, the hon. Member for Ochil and South Perthshire (John Nicolson), that we have to focus on illegal content. It is difficult to get the balance right between the lawful and harmful. The illegal side is what we must focus on.

I also know that many campaigners and individuals—they are not just campaigners, but families—have given heartbreaking and devastating accounts of their experiences of online harms. As legislators, we owe them this Bill, because although their suffering is not something that we will experience, it must bring about the type of changes that we all want to see for everyone—children, adults and vulnerable individuals.

May I ask the Minister for reassurances on the definition of “best endeavours”? As my right hon. Friend the Member for Basingstoke (Dame Maria Miller) touched on, when it comes to implementation, that will be the area where the rubber hits the road. That is where we will need to know that our collective work will be meaningful and will deliver protections—not just change, but protections. We must be honest about the many serious issues that will arise even after we pass the Bill—whether, God forbid, a major terrorist incident or cases of child sexual exploitation—and there is a risk that, without clarity in this area, when a serious issue does arise, we may not know whether a provider undertook best endeavours. I think we owe it to everyone to run a slide rule over every single granular detail.

Cases and issues relating to best endeavours are debated and discussed extensively in court cases, coroners’ inquests and social services child safeguarding work, for example—all right hon. and hon. Members here will have experience of dealing with social services on behalf of their constituents in child protection cases—or, even worse, in serious case reviews or public inquiries that could come in future. I worry that, in any response, a provider could say as a defence that it did its best and had undertaken its best endeavours. That would be unacceptable. That would lead those affected to feel as if they had suffered an even greater injustice than the violations that they experienced. It is not clear whether best endeavours will be enough to change the culture, behaviour and attitudes of online platforms.

I raise best endeavours in the context of changing attitudes and cultures because in many institutions, that very issue is under live debate right now. That may be in policing, attitudes around women and girls or how we protect other vulnerable groups, even in other services such as the fire service, which we have heard about recently. It is important that we ask those questions and have the scrutiny. We need to hear more about what constitutes best endeavours. Who will hold the providers to account? Ofcom clearly has a role. I know the Minister will do a very earnest and diligent job to provide answers, but the best endeavours principle goes wider than just the Minister on the Front Bench—it goes across the whole of Government. He knows that we will give him every backing to use his sharp elbows—perhaps I can help with my sharp elbows—to ensure that others are held to account.

It will also be for Ofcom to give further details and guidance. As ever, the guidance will be so important. The guidance has to have teeth and statutory powers. It has to be able to put the mirror up and hold people to account. For example, would Ofcom be able, in its notices to providers, to instruct them to use specific technologies and programmes to tackle and end the exposure to exploitation, in relation to end-to-end encryption services, to protect victims? That is an open question, but one that could be put to Ofcom and could be an implementation test. There is no reason why we should not put a series of questions to Ofcom around how it would practically implement.

I would like to ask the Minister why vulnerable adults and victims of domestic abuse and violence against women and girls are not included. We must do everything in this House. This is not about being party political. When it comes to all our work on women and violence against women and girls, there should be no party politics whatsoever. We should ensure that what is right for one group is consistent and that the laws are strengthened. That will require the MOJ, as well as the Home Office, to ensure that the work is joined up in the right kind of way.

It is right that powers are available for dealing with terrorist threats and tackling child sexual abuse thoroughly. There is some good work around terrorist content. There is excellent work in GIFCT, the Global Internet Forum to Counter Terrorism. The technology companies are doing great work. There is international co-operation in this space. The House should take some comfort in the fact that the United Kingdom leads the world in this space. We owe our gratitude to our intelligence and security agencies. I give my thanks to MI5 in particular for its work and to counter-terrorism policing, because they have led the world robustly in this work.

Damian Collins Portrait Damian Collins
- Hansard - -

My right hon. Friend makes an important point about this being a cross-Government effort. The Online Safety Bill creates a regulatory framework for the internet, but we need to make sure that we have the right offences in law clearly defined. Then it is easy to read them across into this legislation. If we do not have the right offences defined, that is a job for the whole of Government.

--- Later in debate ---
Baroness Hodge of Barking Portrait Dame Margaret Hodge
- Hansard - - - Excerpts

Indeed. The way the hon. Gentleman describes his new clause, which I will look at, is absolutely right, but can I just make a more general point, because it speaks to the point about legal but harmful? What I really fear with losing the legal but harmful provisions is that we will create more and more laws to make content illegal and that, ironically, will lock up more and more people, rather than create structures and systems that will prevent the harm occurring in the first place. So I am not always in favour of new laws simply criminalising individuals. I would love us to have kept to the legal but harmful route.

We can look to Elon Musk’s recent controversial takeover of Twitter. Decisions taken by Twitter’s newest owner—by Elon Musk himself—saw use of the N-word increase by nearly 500% within 12 hours of acquisition. And allowing Donald Trump back on Twitter gives a chilling permission to Trump and others to use the site yet again to incite violence.

The tech giants know that their business models are dangerous. Platforms can train their systems to recognise so-called borderline content and reduce engagement. However, it is for business reasons, and business reasons alone, that they actively choose not to do that. In fact, they do the opposite and promote content known to trigger extreme emotions. These platforms are like a “danger for profit” machine, and the decision to allow that exploitation is coming from the top. Do not take my word for it; just listen to the words of Ian Russell. He has said:

“The only person that I’ve ever come across in this whole world…that thought that content”—

the content that Molly viewed—

“was safe was…Meta.”

There is a huge disconnect between what Silicon Valley executives think is safe and what we expect, both for ourselves and for our children. By introducing liability for directors, the behaviour of these companies might finally change. Experience elsewhere has shown us that that would prove to be the most effective way of keeping online users safe. New clause 17 would hold directors of a regulated service personally liable on the grounds that they have failed, or are failing, to comply with any duties set in relation to their service, for instance failure that leads to the death of a child. The new clause further states that the decision on who was liable would be made by Ofcom, not the provider, meaning that responsibility could not be shirked.

I say to all Members that if we really want to reduce the amount of harmful abuse online, then making senior directors personally liable is a very good way of achieving it. Some 82% of UK adults agree with us, Labour Front Benchers agree and Back Benchers across the House agree. So I urge the Government to rethink their position on director liability and support new clause 17 as a cross-party amendment. I really think it will make a difference.

Damian Collins Portrait Damian Collins
- View Speech - Hansard - -

As Members know, there is a tradition in the United States that when the President signs a new Bill into law, people gather around him in the Oval Office, and multiple pens are used and presented to people who had a part in that Bill being drafted. If we required the King to do something similar with this Bill and gave a pen to every Minister, every Member who had served on a scrutiny Committee and every hon. Member who introduced an amendment that was accepted, we would need a lot of pens and it would take a long time. In some ways, however, that shows the House at its best; the Bill’s introduction has been a highly collaborative process.

The right hon. Member for Barking (Dame Margaret Hodge) was kind in her words about me and my right hon. Friend the Member for Croydon South (Chris Philp). I know that my successor will continue in the same tradition and, more importantly, that he is supported by a team of officials who have dedicated, in some cases, years of their career to the Bill, who care deeply about it and who want to see it introduced with success. I had better be nice to them because some of them are sitting in the Box.

--- Later in debate ---
Damian Collins Portrait Damian Collins
- Hansard - -

My right hon. Friend raises a very good question. As well as having a named individual with criminal liability for the supplying of information, should there be somebody who is accountable within a company, whether that comes with criminal sanctions or not—somebody whose job it is to know? As all hon. Members know if they have served on the Digital, Culture, Media and Sport Committee, which I chaired, on the Public Accounts Committee or on other Select Committees that have questioned people from the big tech companies, the frustrating thing is that no matter who they put up, it never seems to be the person who actually knows.

There needs to be someone who is legally liable, whether or not they have criminal liability, and is the accountable officer. In the same way as in a financial institution, it is really important to have someone whose job it is to know what is going on and who has certain liabilities. The Bill gives Ofcom the power to seek information and to appoint experts within a company to dig information out and work with the company to get it, but the companies need to feel the same sense of liability that a bank would if its systems had been used to launder money and it had not raised a flag.

Damian Collins Portrait Damian Collins
- Hansard - -

I will dare to give way to yet another former Committee Chair—the former chair of the Public Accounts Committee.

Baroness Hodge of Barking Portrait Dame Margaret Hodge
- Hansard - - - Excerpts

I draw all hon. Members’ attention to issues relating to Barclays Bank in the wake of the economic crisis. An authority—I think it was the Serious Fraud Office—attempted to hold both the bank and its directors to account, but it failed because there was not a corporate criminal liability clause that worked. It was too difficult. Putting such a provision in the Bill would be a means of holding individual directors as well as companies to account, whatever standard of proof was used.

Damian Collins Portrait Damian Collins
- Hansard - -

I thank the right hon. Lady for that information.

Let me move on to the debate about encryption, which my right hon. Friend the Member for Haltemprice and Howden has mentioned. I think it is important that Ofcom and law enforcement agencies be able to access information from companies that could be useful in prosecuting cases related to terrorism and child sexual exploitation. No one is suggesting that encrypted messaging services such as WhatsApp should be de-encrypted, and there is no requirement in the Bill for encryption to end, but we might ask how Meta makes money out of WhatsApp when it appears to be free. One way in which it makes money is by gathering huge amounts of data and information about the people who use it, about the names of WhatsApp groups and about the websites people visit before and after sending messages. It gathers a lot of background metadata about people’s activity around using the app and service.

If someone has visited a website on which severe illegal activity is taking place and has then used a messaging service, and the person to whom they sent the message has done the same, it should be grounds for investigation. It should be easy for law enforcement to get hold of the relevant information without the companies resisting. It should be possible for Ofcom to ask questions about how readily the companies make that information available. That is what the Government seek to do through their amendments on encryption. They are not about creating a back door for encryption, which could create other dangers, and not just on freedom of expression grounds: once a back door to a system is created, even if it is only for the company itself or for law enforcement, other people tend to find their way in.
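
Purely as an illustration of the kind of metadata correlation being described—a minimal sketch with hypothetical field names, a placeholder domain list and an arbitrary time window, not any provider’s or agency’s actual system—flagging activity for investigation without reading any message content might look like this:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical metadata records: no message content, only activity around it.
@dataclass
class BrowsingEvent:
    user_id: str
    domain: str
    timestamp: datetime

@dataclass
class MessageEvent:
    sender_id: str
    recipient_id: str
    timestamp: datetime

# Assumed input: domains known to host severe illegal activity,
# e.g. supplied by law enforcement. Placeholder value only.
FLAGGED_DOMAINS = {"example-illegal-site.test"}
WINDOW = timedelta(minutes=30)

def visited_flagged_domain(user_id, browsing, around, window=WINDOW):
    """True if the user visited a flagged domain within `window` of `around`."""
    return any(
        e.user_id == user_id
        and e.domain in FLAGGED_DOMAINS
        and abs(e.timestamp - around) <= window
        for e in browsing
    )

def flag_messages_for_review(messages, browsing):
    """Flag a message when both sender and recipient visited a flagged
    domain shortly before or after the message was sent."""
    return [
        m for m in messages
        if visited_flagged_domain(m.sender_id, browsing, m.timestamp)
        and visited_flagged_domain(m.recipient_id, browsing, m.timestamp)
    ]
```

Everything the sketch uses is the background metadata described above; nothing in it requires weakening the encryption of the message content itself.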

Ian Paisley Portrait Ian Paisley (North Antrim) (DUP)
- Hansard - - - Excerpts

I thank the hon. Member for jointly sponsoring my private Member’s Bill, the Digital Devices (Access for Next of Kin) Bill. Does he agree that the best way to make progress is to ensure open access for the next of kin to devices that a deceased person leaves behind?

Damian Collins Portrait Damian Collins
- Hansard - -

The hon. Member makes an important point. Baroness Kidron’s amendment has been referred to; I anticipate that future amendments in the House of Lords will also seek to address the issue, which our Joint Committee looked at carefully in our pre-legislative scrutiny.

It should be much easier than it has been for the Russell family and the coroner to gain access to such important information. However, depending on the nature of the case, there may well be times when it would be wrong for families to have access. I think there has to be an expedited and official process through which the information can be sought, rather than a general provision, because some cases are complicated. There should not be a general right in law, but it needs to be a lot easier than it is. Companies should make the information available much more readily than they have done. The Molly Russell inquest had to be delayed for four months because of the late release of thousands of pages of information from Meta to the coroner. That is clearly not acceptable either.

My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) has tabled an amendment relating to small and risky platforms. The categorisation of platforms on the basis of size was linked to duties under the “legal but harmful” provisions, which we expect now to change. The priority illegal harms apply to platforms of all sizes. Surely when illegal activity is taking place on any platform of any size—I hope that the Minister will clarify this later—Ofcom must have the right to intervene and start asking questions. I think that, in practice, that is how we should expect the system to work.

Like other Members who served on the Joint Committee —I am thinking particularly of my hon. Friends the Members for Watford (Dean Russell) and for Stourbridge (Suzanne Webb), both of whom spoke so passionately about this subject, and the hon. Member for Ochil and South Perthshire (John Nicolson) raised it as well—I was delighted to see that the Government had tabled amendments to cover Zach’s law. The fact that someone can deliberately seek out a person with epilepsy and target that person with flashing images with the intention of causing a seizure is a terrible example of the way in which systems can be abused. It is wrong for the platforms to be neutral and have no obligation to identify and stop that action, but the action is wrong in practice as well, and it demonstrates the need for us to ensure that the law keeps pace with the nature of new offences. I was very proud to meet Zach and his mother in October. I said to them then that their work had changed the law, and I am glad that the Government have tabled those amendments.

Dean Russell Portrait Dean Russell
- Hansard - - - Excerpts

May I pay tribute to my hon. Friend for his chairmanship of the Joint Committee last year? We covered a wide range of challenging ethical, moral and technical issues, with work across both Houses, and I think that the recommendations contained in our report informed many of the Government amendments, but it was my hon. Friend’s chairmanship that helped to guide us through that period.

Damian Collins Portrait Damian Collins
- Hansard - -

I am grateful to my hon. Friend for what he has said, and for his significant work on the Committee.

There is a great deal that we could say about this Bill, but let me end by touching on an important topic that I think my hon. Friend the Member for Dover (Mrs Elphicke) will speak about later: the way in which social media platforms are used by people trafficking gangs to recruit those who can help them with bringing people into the country in small boats. It was right that the Government included immigration offences in the list of priority legal harms in schedule 7. It was also right that, following a recommendation from the Joint Committee, they included fraud and scam ads in the scope of the Bill.

We have already accepted, in principle, that advertising can be within the Bill’s scope in certain circumstances, and that priority legal harms can be written into the Bill and identified as such. As I understand it, my hon. Friend’s amendment seeks to bring advertising services—not just organic posts on social media platforms—into the Bill’s scope as well. I know that the Government want to consider illegal activity in advertising as part of the online advertising review, but I hope that this could be an expedited process running in parallel with the Bill as it completes its stages. Illegal activity in advertising would not be allowed in the offline world. Newspaper editors are legally liable for what appears in their papers, and broadcasters can lose their licence if they allow illegal content to feature in advertising. We do not yet have the same enforcement mechanism through the advertising industry with the big online platforms, such as Google and Facebook, where the bulk of display advertising now goes. Their advertising market is bigger than the television advertising market. We are seeing serious examples of illegal activity, and it cannot be right that such material cannot be posted on a Facebook page, yet if money is put behind it and it is run as an advertisement, it can be.

Priti Patel Portrait Priti Patel
- Hansard - - - Excerpts

My hon. Friend is making a very thoughtful speech. This is an important point, because it relates to criminality fuelled by online activity. We have discussed that before in the context of advertising. Tools already exist throughout Government to pick up such criminality, but we need the Bill to integrate them and drive the right outcomes—to stop this criminality, to secure the necessary prosecutions, and to bring about the deterrent effect that my hon. Friend the Member for Dover (Mrs Elphicke) is pursuing.

Natalie Elphicke Portrait Mrs Natalie Elphicke (Dover) (Con)
- Hansard - - - Excerpts

Will my right hon. Friend give way?

Damian Collins Portrait Damian Collins
- Hansard - -

Of course.

Natalie Elphicke Portrait Mrs Elphicke
- Hansard - - - Excerpts

I am grateful to my right hon. Friend for raising this and for his support in this important area that affects our constituencies so much. I will be speaking later to the details of this, which go beyond the advertising payment to the usage, showing and sharing of this content. As he has mentioned schedule 7, does he agree that there is—as I have set out in my amendment—a strong case for making sure that it covers all those illegal immigration and modern slavery offences, given the incredible harm that is being caused and that we see on a day-to-day basis?

Damian Collins Portrait Damian Collins
- Hansard - -

I agree with my hon. Friend, which is why I think it is important that immigration offences were included in schedule 7 of the Bill. I think this is something my right hon. Friend the Member for Croydon South felt strongly about, having been Immigration Minister before he was a tech Minister. It is right that this has been included in the scope of the Bill and I hope that when the code of practice is developed around that, the scope of those offences will be made clear.

On whether advertising should be included as well as other postings, it may well be that at this time the Online Safety Bill is not necessarily the vehicle through which that needs to be incorporated. It could be done separately through the review of the online advertising code. Either way, these are loopholes that need to be closed, and the debate around the Online Safety Bill has brought about a recognition of what offences can be brought within the regulatory scope of the Bill and where Ofcom can have a role in enforcing those measures. Indeed, the measures on disinformation in the National Security Bill are a good example of that. In some ways it required the National Security Bill to create the offence, and then the offence could be read across into the Online Safety Bill and Ofcom could play a role in regulating the platforms to ensure that they complied with requests to take down networks of Russian state-backed disinformation. Something similar could work with immigration offences as well, but whether it is done that way or through the online advertising review or through new legislation, this is a loophole that needs to be closed.

Sarah Champion Portrait Sarah Champion (Rotherham) (Lab)
- View Speech - Hansard - - - Excerpts

I am learning so much sitting here. I am going to speak just on child protection, but all of us are vulnerable to online harms, so I am really grateful to hon. Members across the House who are bringing their specialisms to this debate with the sole aim of strengthening this piece of legislation to protect all of us. I really hope the Government listen to what is being said, because there seems to be a huge amount of consensus on this.

The reason I am focusing on child protection is that every police officer in this field that I talk to says that, in almost every case, abusers are now finding children first through online platforms. We cannot keep up with the speed or the scale of this, so I look to this Bill to try to do so much more. My frustration is that when the Bill first started, we were very much seen as a world leader in this field, but now the abuse has become so prolific, other countries have stepped in and we are sadly lagging behind, so I really hope the Minister does everything he can to get this into law as soon as possible.

Although there are aspects of the Bill that go a long way towards tackling child abuse online, it is far from perfect. I want to speak on a number of specific ways in which the Minister can hopefully improve it. The NSPCC has warned that over 100 online grooming and child abuse image crimes are likely to be recorded every day while we wait for this crucial legislation to pass. Of course, that is only the cases that are recorded. The number is going to be far greater than that. There are vital protections in the Bill, but there is a real threat that the use of virtual private networks—VPNs—could undermine the effectiveness of these measures. VPNs allow internet users to hide their private information, such as their location and data. They are commonly used, and often advertised, as a way for people to protect their data or watch online content. For example, on TV services such as Netflix, people might be able to access something only in the US, so they could use a VPN to circumvent that restriction and watch it in this country.

During the Bill’s evidence sessions, Professor Clare McGlynn said that 75% of children aged 16 and 17 used, or knew how to use, a VPN, which means that they can avoid age verification controls. So if companies use age assurance tools, as listed in the safety duties of this Bill, there is no guarantee that they will provide the protections that are needed. I am also concerned that the use of VPNs could act as a barrier to removing indecent or illegal material from the internet. The Internet Watch Foundation uses a blocking list to remove this content from internet service providers, but users with a VPN can usually bypass the protections that their provider applies. It also concerns me that VPN use could be relied on in court to circumvent this legislation, which is very much based in the UK. Have the Government tested what will happen if someone uses a VPN to give the appearance of being overseas?

My new clause 54 would require the Secretary of State to publish, within six months of the Bill’s passage, a report on the effect of VPN use on Ofcom’s ability to enforce the requirements under clause 112. If VPNs cause significant issues, the Government must identify those issues and find solutions, rather than avoiding difficult problems.

New clause 28 would establish a user advocacy body to represent the interests of children in regulatory decisions. Children are not a homogenous group, and an advocacy body could reflect their diverse opinions and experiences. This new clause is widely supported in the House, as we have heard, and the NSPCC has argued that it would be an important way to counterbalance the attempts of big tech companies to reduce their obligations, which are placing their interests over children’s needs.

I would like to see more third sector organisations consulted on the code of practice. The Internet Watch Foundation, which many Members have discussed, already has the necessary expertise to drastically reduce the amount of child sexual abuse material on the internet. The Government must work with the IWF and build on its knowledge of web page blocking and image hashing.
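
The image hashing referred to here can be sketched in a few lines. This is an illustrative example only, using an exact cryptographic hash and a placeholder hash value; production systems typically use perceptual hashes (such as PhotoDNA) that tolerate resizing and re-encoding, and the real hash lists are maintained by bodies such as the IWF:

```python
import hashlib
from pathlib import Path

# Assumed input: hashes of known abuse images maintained by a body such as
# the IWF. The value below is a placeholder, not a real list entry.
KNOWN_ABUSE_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of_file(path: Path) -> str:
    """Hash a file in chunks so that large images need not fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_abuse_image(path: Path) -> bool:
    """True if an uploaded file exactly matches a known-abuse hash."""
    return sha256_of_file(path) in KNOWN_ABUSE_HASHES
```

A perceptual-hash variant would compare hash distances against a threshold rather than testing exact set membership, so that trivially altered copies are still caught.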

Girls in particular face increased risk on social media, with the NSPCC reporting that nearly a quarter of girls who have taken a nude photo have had their image sent to someone else online without their permission. New clauses 45 to 50 would provide important protections to women and girls from intimate image abuse, by making the non-consensual sharing of such photos illegal. I am pleased that the Government have announced that they will look into introducing these measures in the other place, but we are yet to see any measures to compare with these new clauses.

In the face of the huge increase in online abuse, victims’ services must have the necessary means to provide specialist support. Refuge’s tech abuse team, for example, is highly effective at improving outcomes for thousands of survivors, but the demand for its services is rapidly increasing. It is only right that new clause 23 be accepted, so that a good proportion of the revenue raised under the Bill’s provisions goes towards funding these vital services.

The landmark report by the independent inquiry into child sexual abuse recently highlighted that, between 2017-18 and 2020-21, there was an approximately 53% rise in recorded grooming offences. With this crime increasingly taking place online, the report emphasised that internet companies will need more moderators to aid technology in identifying this complex type of abuse. I urge the Minister to also require internet companies to provide sufficient and meaningful support to those moderators, who have to view and deal with disturbing images and videos on a daily basis. They, as well as the victims of these horrendous crimes, deserve our support.

I have consistently advocated for increased prevention of abuse, particularly through education in schools, but we must also ensure that adults, particularly parents, are educated about the threats online. Internet Matters found that parents underestimate the extent to which their children are having negative experiences online, and that the majority of parents believe their 14 to 16-year-olds know more about technology than they do.

The example that most sticks in my mind was provided by the then police chief in charge of child protection, who said, “What is happening on a Sunday night is that the family are sitting in the living room, all watching telly together. The teenager is online, and is being abused online.” In his words, “You wouldn’t let a young child go and open the door without knowing who is there, but that is what we do every day by giving them their iPad.”

If parents, guardians, teachers and other professionals are not aware of the risks and safeguards, how are they able to protect children online? I strongly encourage the Government to accept new clauses 29 and 30, which would place an additional duty on Ofcom to promote media literacy. Minister, you have the potential—

--- Later in debate ---
David Davis Portrait Mr David Davis
- View Speech - Hansard - - - Excerpts

I do not agree with every detail of what the hon. Member for Rotherham (Sarah Champion) said, but I share her aims. She has exactly the right surname for what she does in standing up for children.

To avoid the risk of giving my Whip a seizure, I congratulate the Government and the Minister on all they have done so far, both in delaying the Bill and in modifying their stance.

My hon. Friend the Member for Solihull (Julian Knight), who is no longer in the Chamber, said that this is five Bills in one and should have had massively more time. At the risk of sounding like a very old man, there was a time when this Bill would have had five days on Report. That is what should have happened with such a big Bill.

Opposition Members will not agree, but I am grateful that the Government decided to remove the legal but harmful clauses. The simple fact is that the hon. Member for Pontypridd (Alex Davies-Jones) and I differ not in our aim—my new clause 16 is specifically designed to protect children—but in the method of achieving it. Once upon a time, there was a tradition that this Chamber would consider a Companies Bill every year, because things change over time. We ought to have a digital Bill every year, specifically to address not “legal but harmful” but the question, “Is it harmful enough to be made illegal?” Obviously, self-harm material is harmful enough to be made illegal.

The hon. Lady and I have similar aims, but we have different perspectives on how to attack this. My perspective is as someone who has seen many pieces of legislation go badly wrong despite the best of intentions.

The Under-Secretary of State for Digital, Culture, Media and Sport, my hon. Friend the Member for Sutton and Cheam (Paul Scully), knows he is a favourite of mine. He did a fantastic job in his previous role. I think this Bill is a huge improvement, but he has a lot more to do, as he recognises with the Bill returning to Committee.

One area on which I disagree with many of my hon. and right hon. Friends is the question of encryption. The Bill allows Ofcom to issue notices directing companies to use “accredited technology,” but it might as well say “magic,” because we do not know what is meant by “accredited technology.” Clause 104 will create a pressure to undermine the end-to-end encryption that is not only desirable but crucial to our telecommunications. The clause sounds innocuous and legalistic, especially given that the notices will be issued to remove terrorist or child sexual exploitation content, which we all agree has no place online.

Damian Collins Portrait Damian Collins
- Hansard - -

Rather than it being magic, does my right hon. Friend agree that, if we demystified the process, a company could not ignore it? If there is an existing technology that is available and proven to work, the company would have to explain why it is not using that technology or something better.

David Davis Portrait Mr Davis
- Hansard - - - Excerpts

I will come back to that in some detail.

The first time I used encryption it was one-time pads and Morse, so it was a long time ago. The last time was much more recent. The issue here is that clause 104 causes pressure by requiring real-time decryption. The only way to do that is by either having it unencrypted on the server, having it weakly encrypted or creating a back door. I am talking not about metadata, which I will come back to in a second, but about content. In that context, if the content needs to be rapidly accessible, it is bound to lead to weakened encryption.

This is perhaps a debate for a specialist forum, but it is very dangerous in a whole series of areas. What do we use encryption for? We use it for banking, for legal and privileged conversations, and for conversations with our constituents and families. I could go on and on about the areas in which encryption matters.

--- Later in debate ---
David Davis Portrait Mr Davis
- Hansard - - - Excerpts

I very much agree with my hon. Friend on that. He and I have been allies in the past—and sometimes opponents—and he has often been far ahead of other people. I am afraid that I do not remember the example from the 1970s, as that was before even my time here, but I remember the intervention he made in the 1990s and the fuss it caused. From that point of view, I absolutely agree with him. My new clause is clearly worded and I hope the House will give it proper consideration. It is important that we put something in the Bill on this issue, even if the Government, quite properly, amend it later.

I wish to raise one last point, which has come up as we have talked through these issues. I refer to the question of individual responsibility. One or two hon. Ladies on the Opposition Benches have cited algorithmic outcomes. As I said to the right hon. Member for Barking, I am worried about how we place the responsibility, and how it would lead the courts to behave, and so on. We will debate that in the next few days and when the Bill comes back again.

There is one other issue that nothing in this Bill covers, and I am not entirely sure why. Much of the behaviour pattern is algorithmic and it is algorithmic with an explicit design. As a number of people have said, it is designed as clickbait; it is designed to bring people back. We may get to a point, particularly if we come back to this year after year, of saying, “There are going to be rules about your algorithms, so you have to write it into the algorithm. You will not use certain sorts of content, pornographic content and so on, as clickbait.” We need to think about that in a sophisticated and subtle way. I am looking at my hon. Friend the Member for Folkestone and Hythe (Damian Collins), the ex-Chairman of the Select Committee, on this issue. If we are going to be the innovators—and we are the digital world innovators—we have to get this right.

Damian Collins Portrait Damian Collins
- Hansard - -

My right hon. Friend is right to raise this important point. The big area here is not only clickbait, but AI-generated recommendation tools, such as a news feed on Facebook or “next up” on YouTube. Mitigating the illegal content on the platforms is not just about content moderation and removal; it is about not promoting it in the first place.

David Davis Portrait Mr Davis
- Hansard - - - Excerpts

My hon. Friend is exactly right about that. I used the example of clickbait as shorthand. The simple truth is that “AI-generated” is also a misnomer, because these things are not normally AI; they are normally algorithms written specifically to recommend and to maximise returns and revenue. We are not surprised at that. Why should we be? After all, these are commercial companies we are talking about, and that is what they are going to do. Every commercial company in the world operates within a regulatory framework that prevents it from making profits out of antisocial behaviour.

Online Harms

Damian Collins Excerpts
Wednesday 26th October 2022


Westminster Hall

Damian Collins Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Damian Collins)
- Hansard - -

It is a pleasure to serve under your chairmanship, Mr Dowd. This is my first appearance as a Minister in Westminster Hall, and your first appearance in the Chair, so we are both making our debuts. I hope we have long and successful reigns in our respective roles.

It is a great pleasure to respond to the debate secured by my right hon. Friend the Member for East Hampshire (Damian Hinds) and to his excellent opening speech. He feels strongly about these issues—as he did both in Government and previously as a member of the Digital, Culture, Media and Sport Committee—and he has spoken up about them. I enjoyed working with him when he was a Minister at the Home Office and I chaired the prelegislative scrutiny Committee, which discussed many important features of the Online Safety Bill. One feature of the Bill, of course, is the inclusion of measures on fraud and scam advertising, which was a recommendation of the Joint Committee. It made my life easier that, by the time I became a Minister in the Department, the Government had already accepted that recommendation and made that change, and I will come on to talk about that in more detail.

My right hon. Friend, the hon. Member for Pontypridd (Alex Davies-Jones) and other Members raised the case of Molly Russell, and it is important to reflect on that case. I share the sentiments expressed about the tragedy of Molly’s death, its avoidable nature and the tireless work of the Russell family, and particularly her father, Ian Russell, whom I have met several times to discuss this. The Russell family pursued a very difficult and complicated case, which required a huge release of evidence from the social media companies, particularly Instagram and Pinterest, to demonstrate the sort of content to which Molly Russell was exposed.

One of the things Ian Russell talks about is the work done by the investigating officers in the coroner’s inquest. Tellingly, the inquest restricted the amount of time that people could be exposed to the content that Molly was exposed to, and ensured that police officers who were investigating were not doing so on their own. Yet that was content that a vulnerable teenage girl saw repeatedly, on her own, in isolation from those who could have helped her.

When online safety issues are raised with social media companies, they say things like, “We make this stuff very hard to find.” The lived experience of most teenagers is not searching for such material; it is such material being selected by the platforms and targeted at the user. When someone opens TikTok, their first exposure is not to content that they have searched for; it is to content recommended to them by TikTok, which data-profiles the user and chooses things that will engage them. Those engagement-based business models are at the heart of the way the Online Safety Bill works and has to work. If platforms choose to recommend content to users to increase their engagement with the platform, they make a business decision. They are selecting content that they think will make a user want to return more frequently and stay on the platform for longer. That is how free apps make money from advertising: by driving engagement.

It is a fair criticism that, at times, the platforms are not effective enough at recognising the kinds of engagement tools they are using, the content that is used to engage people and the harm that that can do. For a vulnerable person, the sad truth is that their vulnerability will probably be detected by the AI that drives the recommendation tools. That person is far more likely to be exposed to content that will make their vulnerabilities worse. That is how a vulnerable teenage girl can be held by the hand—by an app’s AI recommendation tools—and walked from depression to self-harm and worse. That is why regulating online safety is so important and why the protection of children is so fundamental to the Bill. As hon. Members have rightly said, we must also ensure that we protect adults from some of the illegal and harmful activity on the platforms and hold those platforms to account for the business model they have created.

I take exception to the suggestion from the hon. Member for Pontypridd that this is a content-moderation Bill. It is not; it is a systems Bill. The content that we use, and often refer to, is an exemplar of the problem; it is an exemplar of things going wrong. On all the different areas of harm that are listed in the Bill, particularly the priority legal offences in schedule 7, our challenge to the companies is: “You have to demonstrate to the regulator that you have appropriate systems in place to identify this content, to ensure that you are not amplifying or recommending it and to mitigate it.” Mitigation could be suppressing the content—not letting it be amplified by their tools—removing it altogether or taking action against the accounts that post it. It is the regulator’s job to work with the companies, assess the risk, create codes of practice and then hold the companies to account for how they work.

There is criminal liability for the companies if they refuse to co-operate with the regulator. If they refuse to share information or evidence asked for by the regulator, a named company director will be criminally liable. That was in the original Bill. The recommendation in the Joint Committee report was that that should be commenced within months of the Bill being live; originally it was going to be two years. That is in the Bill today, and it is important that it is there so that companies know they have to comply with requests.

The hon. Member for Pontypridd is right to say that the Bill is world-leading, in the sense that it goes further than other people’s Bills, but other Bills have been enacted elsewhere in the world. That is why it is important that we get on with this.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

The Minister is right to say that we need to get on with this. I appreciate that he is not responsible for the business of this House, but his party and his Government are, so will he explain why the Bill has been pulled from the timetable next week, if it is such an important piece of legislation?

Damian Collins Portrait Damian Collins
- Hansard - -

As the hon. Lady knows, I can speak to the Bill; I cannot speak to the business of the House—that is a matter for the business managers in the usual way. Department officials—some here and some back at the Department—have been working tirelessly on the Bill to ensure we can get it through in a timely fashion. I want to see it complete its Commons stages and go to the House of Lords as quickly as possible. Our target is to ensure that it receives safe passage in this Session of Parliament. Obviously, I cannot talk to the business of the House, which may alter as a consequence of the changes to Government.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

On that point, will the Minister assure us that he will push for the Bill to come back? Will he make the case to the business managers that the Bill should come back as soon as possible, in order to fulfil his aim of having it pass in this Session of Parliament?

Damian Collins Portrait Damian Collins
- Hansard - -

As the hon. Lady knows, I cannot speak to the business of the House. What I would say is that the Department has worked tirelessly to ensure the safe passage of the Bill. We want to see it on the Floor of the House as quickly as possible—our only objective is to ensure that that happens. I hope that the business managers will be able to confirm shortly when that will be. Obviously, the hon. Lady can raise the issue herself with the Leader of the House at the business statement tomorrow.

Jonathan Lord Portrait Mr Jonathan Lord (Woking) (Con)
- Hansard - - - Excerpts

Could the Minister address the serious issue raised by my hon. Friend the Member for Hexham (Guy Opperman)? There can be no excuse for search engines to give a prominent place, or indeed any place, to fake Aviva sites—scamming sites—once those have been brought to their attention. Likewise, unacceptable scam ads for Aviva, Martin Lewis or whoever are completely avoidable if decent checks are in place. Will the Government address those issues in the Bill and in other ways?

Damian Collins Portrait Damian Collins
- Hansard - -

I am grateful to my hon. Friend. The answer is yes, absolutely. It was always the case with the Bill that illegal content, including fraud, was in scope. The issue with the original draft Bill was that that scope did not include advertising. Advertising can be in the form of display advertising that can be seen on social media platforms; for search services, it can also be boosted search returns. Under the Bill, known frauds and scams that have been identified should not appear in advertising on regulated platforms. That change was recommended by the Joint Committee, and the Government accepted it. It is really important that that is the case, because the company should have a liability; we cannot work just on the basis that the offence has been committed by the person who has created the advert and who is running the scam. If an intermediary platform is profiting out of someone else’s illegal activity, that should not be allowed. It would be within Ofcom’s regulatory powers to identify whether that is happening and to see that platforms are taking action against it. If not, those companies will be failing in their safety duties, and they will be liable for very large fines, which can be up to 10% of global revenues in any one year, for breaching their obligations as set out in the Online Safety Bill. That power will absolutely be there.

Some companies could choose to have systems in place to make it less likely that scam ads would appear on their platforms. Google has a policy under which it works with the Financial Conduct Authority and does not accept financial product advertising from organisations that are not FCA accredited. That has been quite an effective way to filter out a lot of potential scam ads before they appear. Whether companies have policies such as that, or other ways of doing these things, they will have to demonstrate to Ofcom that those are effective. [Interruption.] Does my hon. Friend the Member for Hexham (Guy Opperman) want to come in on that? I can see him poised to spring forward.

Guy Opperman Portrait Guy Opperman
- Hansard - - - Excerpts

No, keep going.

Damian Collins Portrait Damian Collins
- Hansard - -

I would like to touch on some of the other issues that have been raised in the debate. The hon. Member for Leeds East (Richard Burgon) and others made the point about smaller, high-risk platforms. All platforms, regardless of size, have to meet the illegal priority harm standard. For the worst offences, they will already have to produce risk assessments and respond to Ofcom’s request for information. Given that, I would suspect that, if Ofcom had a suspicion that serious illegal activity, or other activity that was causing serious concern, was taking place on a smaller platform, it would have powers to investigate and would probably find that the platform was in breach of those responsibilities. It is not the case that if a company is not a category 1 company, it is not held to account under the illegal priority harms clauses of the Bill. Those clauses cover a wide range of offences, and it is important—this was an important amendment to the Bill recommended by the Joint Committee—that those offences were written into the Bill so that people can see what they are.

The hon. Member for Pontypridd raised the issue of violence against women and girls, but what I would say is that violence against everyone is included in the Bill. The offences of promoting or inciting violence, harassment, stalking and sending unsolicited sexual images are all included in the Bill. The way the schedule 7 offences work is that the schedule lists existing areas of law. Violence against women and girls is covered by lots of different laws; that is why there is not a single offence for it and why it is not listed. That does not mean that we do not take it seriously. As I said to the hon. Lady when we debated this issue on the first day of Report, we all understand that women and girls are far more likely to be victims of abuse online, and they are therefore the group that should benefit the most from the provisions in the Bill.

The hon. Member for Coventry North West (Taiwo Owatemi) spoke about cyber-bullying. Again, offences relating to harassment are included in the Bill. This is also an important area where the regulator’s job is to ensure that companies enforce their own terms of service. For example, TikTok, which is very popular with younger users, has in place quite strict policies on preventing bullying, abuse and intimidation on its services. But does it enforce those policies effectively? So far, we have largely relied on the platforms self-declaring whether that is the case; we have never had the ability to really know. Now Ofcom will have that power, and it will be able to challenge companies such as TikTok. I have also raised with TikTok my concern about the prevalence of blackout challenge content, which remains on that platform and which has led to people losing their lives. Could TikTok be more effective at removing more of that content? We will now have the regulatory power to investigate—to get behind the curtain and to see what is really going on.

Peter Dowd Portrait Peter Dowd (in the Chair)
- Hansard - - - Excerpts

Minister, can I just say that there may be votes very shortly? That means that we will be suspending the sitting and coming back, so if you can—

Damian Collins Portrait Damian Collins
- Hansard - -

Wrap it up in the next—

Damian Collins Portrait Damian Collins
- Hansard - -

I will just touch on a couple of other points that have been raised. My hon. Friend the Member for Barrow and Furness (Simon Fell) and other Members raised the point about the abuse of footballers. The abuse suffered by England footballers after the final of the European championship is a very good example. Some people have been charged and prosecuted for what they posted. It was a known-about risk; it was avoidable. The platforms should have done more to stop it. This Bill will make sure that they do.

That shows that we have many offences where there is already a legal threshold, and we want them to be included in the regulatory systems. For online safety standards, it is important that the minimum thresholds are based on our laws. In the debate on “legal but harmful”, one of the key points to consider, and one that many Members have brought up, is what we base the thresholds on. To base them on the many offences that we already have written into law is, I think, a good starting point. We understand what those thresholds are. We understand what illegal activity is. We say to the platforms, “Your safety standards must, at a minimum, be at that level.” Platforms do go further in their terms of service. Most terms of service, if properly enforced, would deal with most of the sorts of content that we have spoken about. That is why, if the platforms are to enforce their terms of service properly, the provisions on transparency and accountability are so important. I believe that that will capture the offences that we need.

My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) rightly said—if I may paraphrase slightly—that we should not let the perfect be the enemy of the good. There will always be new things that we wish to add and new offences that we have not yet thought about that we need to include, and the structure of the Bill creates the framework for that. In the future, Parliament can create new offences that can form part of the schedule of priority illegal offences. On priority harms, I would say that that is the stuff that the platforms have to proactively look for. Anything that is illegal offline is also illegal online, and the regulator can take action against it.

Let me finish by thanking all the Members here, including my hon. Friend the Member for Gosport (Dame Caroline Dinenage), another former Minister. A very knowledgeable and distinguished group of Members have taken part in this debate. Finally, I thank the officials at the Department. Until someone is actually in the Department, they can never quite know what work is being done—that is the nature of Government—but I know how personally dedicated those officials are to the Bill. They have all gone the extra mile in the work they are doing for it. For their sakes and all of ours, I want to make sure that we pass it as soon as possible.

Question put and agreed to.

Resolved,

That this House has considered online harms.

BBC Licence Fee Non-Payment (Decriminalisation for over 75s) Bill

Damian Collins Excerpts
Christopher Chope Portrait Sir Christopher Chope
- Hansard - - - Excerpts

I agree. That is why I hope the Ministry of Justice, which is concerned about delays in the magistrates courts, will be saying, “How ridiculous that our magistrates courts should be taken up with cases of BBC licence fee non-payment.”

We talk about bureaucracy and the shortage of people in this country to engage in productive employment. The BBC has said that it wishes to return to the pre-pandemic level of visits to people’s homes in relation to the licence fee. In 2020-21, licensing officers from the BBC visited 671,500 homes, and 62,000 residents were found to have been using the BBC not in accordance with the rules. What an enormous volume of activity that involved—activity that I think we should be able to dispense with, and we would be able to dispense with it if we dispensed with the BBC licence fee, but we could take a staging point halfway if we prevented the BBC from being able to prosecute these normally hapless people.

In February 2020, the Government launched a consultation on the issue of decriminalisation. It took about a year for the results to be published. In their response, the Government were pretty damning about the criminalisation of those who do not pay the licence fee. Paragraph 70 of the report from the Department for Digital, Culture, Media and Sport states:

“After considering the consultation responses, the government remains concerned that criminal prosecution is, as a matter of principle, an unfair and disproportionate approach to enforcement of TV licence evasion in a modern public service broadcasting system.”

So there we have it, Madam Deputy Speaker. Paragraph 76 states:

“Against this background, the government therefore intends to continue assessing these potential impacts of an alternative sanction on licence fee payers. On this basis, while no final decision has been taken at this time, the government will keep the issue of decriminalisation under active consideration as part of the roadmap of reform of the BBC discussed below.”

I am delighted to see the Secretary of State on the Front Bench, but I hope that the Government are indeed “actively” dealing with this issue.

Damian Collins Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Damian Collins)
- View Speech - Hansard - -

I thank my hon. Friend for what he has said, but, for the record, I am not the Secretary of State, although I am a Minister in the Department.

Christopher Chope Portrait Sir Christopher Chope
- Hansard - - - Excerpts

I am so sorry. In that case, the hon. Gentleman is even more welcome to his position. It is hard to keep up with some of the changes that are taking place on the Front Bench at the moment.

This issue needs to be addressed, and it is good to know that the Government are still considering it, but another year has passed and there is not much indication—not much that I have received, anyway—that the “active consideration” of the issue of decriminalisation is reaching any conclusion. In the meantime, as I have said, people are being prosecuted up and down the country, and people aged over 75 who thought they were going to have a free television licence are particularly vulnerable to such activity.

This is an important issue. Apparently a mid-term review of the BBC charter is taking place this year. We are told that the licence fee will remain at £159 until the beginning of April 2024. That means that if there were to be a general election after that, in 2024, people would be asking, “Why has the BBC licence fee just increased?” I am not sure it is very good timing, but that is the plan. The BBC is expected to receive £3.7 billion in licence fee funding this year. Why are people not more exercised about this? It is a television tax, and it is more than twice the cost of reducing the top rate of tax from 45p in the pound, about which there was a big argument at the Conservative party conference.

Putting it all in context, and as a party in favour of supporting hard-working families, I would have thought we would be taking action to commit ourselves to doing away with the television tax and, in the meantime, doing away with the criminalisation of those who do not pay the television tax.

--- Later in debate ---
Damian Collins Portrait Damian Collins
- Hansard - -

Just to confirm for the record, it was, of course, the BBC’s decision to end free television licences for the over-75s. It was ultimately the BBC’s decision.

Christopher Chope Portrait Sir Christopher Chope
- Hansard - - - Excerpts

So it was the BBC’s fault. My reading of it is that there was an attempt to cast responsibility on to the BBC, but ultimately it was the Government who enabled the BBC to put back in place a television licence fee—

Damian Collins Portrait Damian Collins
- Hansard - -

The BBC agreed to a financial settlement with the previous Government that provided transitional funding, after which the BBC would take on responsibility. That was always the case, and it was the deal the BBC signed up to at the time.

Christopher Chope Portrait Sir Christopher Chope
- Hansard - - - Excerpts

Okay, so what happened? Did the BBC go back on the deal? If so, what was the sanction against the BBC? Why are we continuing to indulge the BBC as we are, by enforcing the £3.7 billion television tax paid to the BBC?

We have also given the BBC additional powers to raise the borrowing limits for its commercial activities, which are a great success. The BBC is selling a lot of important stuff overseas. Why do we need to subsidise that with taxpayers’ money? Why do we not let the BBC run its commercial arm with freedom, and without imposing additional costs on the hard-pressed taxpayer?

I have made a short point and, unfortunately, there is not time for the Minister to respond. We will have to continue the Second Reading of this important Bill on another occasion, when I hope the Minister will be able to respond in extenso.

Damian Collins Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Damian Collins)
- Hansard - -

In the short time I have, I will address the concessionary licence fee for the over-75s and provide the necessary context for a range of relevant issues, including the BBC’s decision to end free TV licences for the over-75s, the Government’s work on decriminalising TV licence fee evasion and our broader road map for BBC reform, including our intention to review the licence fee funding model.

The House will no doubt be aware that, in the 2015 funding settlement, the Government agreed with the BBC that the responsibility for the over-75s concession should transfer to the BBC. The Government and the BBC agreed to make that change. Alongside that, the Government also closed the iPlayer loophole, committed to increase the licence fee in line with inflation and reduced a number of other spending commitments. To help with the financial planning, the Government agreed to provide phased transitional funding over two years to gradually—

Oral Answers to Questions

Damian Collins Excerpts
Thursday 20th October 2022

Commons Chamber
John Whittingdale Portrait Sir John Whittingdale (Maldon) (Con)
- Hansard - - - Excerpts

7. What steps her Department is taking to increase the transparency and accountability of technology platforms.

Damian Collins Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Damian Collins)
- View Speech - Hansard - -

The Government are driving forward a digital regulatory approach that unlocks growth and boosts trust. As part of that, we are taking steps to improve transparency and accountability, including through the Online Safety Bill; data protection legislation that maintains rules for responsible usage; and digital markets legislation, which will promote competition in digital markets.

John Whittingdale Portrait Sir John Whittingdale
- View Speech - Hansard - - - Excerpts

Does my hon. Friend share my concern at the recent behaviour of PayPal in arbitrarily removing certain accounts of campaigning and journalistic organisations without any warning or explanation? Will he consider how the Online Safety Bill can give greater protection for free speech by increasing the accountability of PayPal, Facebook and the other giant tech platforms?

Damian Collins Portrait Damian Collins
- View Speech - Hansard - -

Absolutely. I agree with my right hon. Friend: it is really important that big tech platforms are transparent and accountable to their users in their terms of service for how they trade. That is an important principle of how the Online Safety Bill works, both in protecting freedom of speech and in ensuring that companies enforce their platform policies correctly. On digital markets, it is also important that customers know what fair access they have to markets, that they will be treated fairly by platforms and that the platforms make clear what their terms of service are.

Chi Onwurah Portrait Chi Onwurah (Newcastle upon Tyne Central) (Lab)
- Hansard - - - Excerpts

8. Whether she is taking steps to give people in the UK more control over their data.

Digital, Culture, Media and Sport

Damian Collins Excerpts
Thursday 21st July 2022

Ministerial Corrections
The following is an extract from the debate on Report of the Online Safety Bill on 12 July 2022.
Damian Collins Portrait Damian Collins
- Hansard - -

If something is illegal offline, it is illegal online as well. There are priority areas where the company must proactively look for that. There are also non-priority areas where the company should take action against anything that is an offence in law and meets the criminal threshold online. The job of the regulator is to hold them to account for that.

[Official Report, 12 July 2022, Vol. 718, c. 161.]

Letter of correction from the Under-Secretary of State for Digital, Culture, Media and Sport, the hon. Member for Folkestone and Hythe (Damian Collins)

Errors have been identified in my response to the hon. and learned Member for Edinburgh South West (Joanna Cherry).

The correct response should have been:

Damian Collins Portrait Damian Collins
- Hansard - -

If something is illegal offline, it is illegal online as well. There are priority areas where the company must proactively look for that. There are also non-priority areas where the company should take action against anything that is an offence in law which has an individual victim and meets the criminal threshold online. The job of the regulator is to hold them to account for that.

The following is a further extract from the debate on Report of the Online Safety Bill on 12 July 2022.

Damian Collins Portrait Damian Collins
- Hansard - -

On the amendments that the former Minister, my hon. Friend the Member for Croydon South (Chris Philp), spoke to, the word “consistently” has not been removed from the text. There is new language that follows the use of “consistently”, but the use of that word will still apply in the context of the companies’ duties to act against illegal content.

[Official Report, 12 July 2022, Vol. 718, c. 209.]

Letter of correction from the Under-Secretary of State for Digital, Culture, Media and Sport, the hon. Member for Folkestone and Hythe (Damian Collins)

Errors have been identified in my intervention on the hon. Member for Enfield North (Feryal Clark).

The correct response should have been:

Damian Collins Portrait Damian Collins
- Hansard - -

On the amendments that the former Minister, my hon. Friend the Member for Croydon South (Chris Philp), spoke to, the word “consistently” has not been removed from the text. There is new language that follows the use of “consistently”, and the use of that word will still apply across the safety duties and in particular for category 1 platforms.

The following is a further extract from the debate on Report of the Online Safety Bill on 12 July 2022.

Damian Collins Portrait Damian Collins
- Hansard - -

A number of Members raised the issue of freedom of speech provisions, particularly my hon. Friend the Member for Windsor (Adam Afriyie) at the end of his excellent speech. We have sought to bring, in the Government amendments, additional clarity to the way the legislation works, so that it is absolutely clear what the priority legal offences are.

[Official Report, 12 July 2022, Vol. 718, c. 218.]

Letter of correction from the Under-Secretary of State for Digital, Culture, Media and Sport, the hon. Member for Folkestone and Hythe (Damian Collins)

Errors have been identified in my response to my hon. Friend the Member for Windsor (Adam Afriyie).

The correct response should have been:

Damian Collins Portrait Damian Collins
- Hansard - -

A number of Members raised the issue of freedom of speech provisions, particularly my hon. Friend the Member for Windsor (Adam Afriyie) at the end of his excellent speech. We have sought to bring, in the Government’s commitments via written ministerial statement, additional clarity to the way the legislation works, so that it is absolutely clear what the priority categories of harmful content are.

The following is a further extract from the debate on Report of the Online Safety Bill on 12 July 2022.

Damian Collins Portrait Damian Collins
- Hansard - -

The Bill absolutely addresses the sharing of non-consensual images in that way, so that would be something the regulator should take enforcement action against—

[Official Report, 12 July 2022, Vol. 718, c. 259.]

Letter of correction from the Under-Secretary of State for Digital, Culture, Media and Sport, the hon. Member for Folkestone and Hythe (Damian Collins)

Errors have been identified in my response to the hon. Member for Birmingham, Yardley (Jess Phillips).

The correct response should have been:

Damian Collins Portrait Damian Collins
- Hansard - -

The Bill absolutely addresses the non-consensual sharing of intimate images in that way, so that would be something the regulator should take enforcement action against—